| Column | Type / range |
|---|---|
| sha | null |
| last_modified | null |
| library_name | string (154 classes) |
| text | string (1–900k chars) |
| metadata | string (2–348k chars) |
| pipeline_tag | string (45 classes) |
| id | string (5–122 chars) |
| tags | list (1–1.84k items) |
| created_at | string (25 chars) |
| arxiv | list (0–201 items) |
| languages | list (0–1.83k items) |
| tags_str | string (17–9.34k chars) |
| text_str | string (0–389k chars) |
| text_lists | list (0–722 items) |
| processed_texts | list (1–723 items) |
| tokens_length | list (1–723 items) |
| input_texts | list (1–61 items) |
| embeddings | list (768 items) |
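Loaded with the Hugging Face `datasets` library, each record exposes these columns directly. A minimal sketch, using a hypothetical repository id in place of the dataset's real location:

```python
# Minimal sketch of inspecting one record of this dataset with the `datasets`
# library. "user/model-card-embeddings" is a hypothetical placeholder id,
# not the dataset's actual location.
from datasets import load_dataset

ds = load_dataset("user/model-card-embeddings", split="train")

row = ds[0]
print(row["id"])               # Hub model id, e.g. "anas-awadalla/..."
print(row["pipeline_tag"])     # e.g. "question-answering"
print(len(row["embeddings"]))  # 768 floats per record
```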
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-16-finetuned-squad-seed-6
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
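These hyperparameters correspond one-to-one with `transformers.TrainingArguments`. A minimal sketch of an equivalent configuration, assuming a single device (so the card's `train_batch_size` maps to the per-device batch size); the `output_dir` is illustrative and dataset preparation plus the `Trainer` call are omitted:

```python
# Sketch: the card's hyperparameters expressed as TrainingArguments.
# Assumes a single device; output_dir is an arbitrary illustrative path.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-16-finetuned-squad-seed-6",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,              # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,           # epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.1,            # lr_scheduler_warmup_ratio: 0.1
    max_steps=200,               # training_steps: 200
)
```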
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-16-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-16-finetuned-squad-seed-6
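Since the `id` column is a Hub model id, the row's model can be loaded directly for inference. A minimal sketch; the question and context strings are illustrative, not taken from the dataset:

```python
# Sketch: extractive QA with the model referenced in this row's `id` column.
# The question/context pair is illustrative.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-16-finetuned-squad-seed-6",
)
result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of bert-base-uncased on the squad dataset.",
)
print(result["answer"], result["score"])
```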
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
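The `tokens_length` column appears to hold per-section token counts for the processed card text. A sketch of one plausible derivation; both the tokenizer choice (`bert-base-uncased`) and `add_special_tokens=False` are assumptions, not documented by the dataset:

```python
# Sketch: one plausible way the tokens_length column could be derived.
# The tokenizer choice and add_special_tokens=False are assumptions.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
sections = ["## Model description\n\nMore information needed"]  # card sections
print([len(tokenizer.encode(s, add_special_tokens=False)) for s in sections])
```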
[768-dimensional embedding vector omitted] |
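With one 768-dimensional vector per record, records can be compared by cosine similarity. A minimal NumPy sketch:

```python
# Sketch: cosine similarity between two records' `embeddings` vectors.
import numpy as np

def cosine_similarity(vec_a, vec_b):
    a, b = np.asarray(vec_a), np.asarray(vec_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# e.g. cosine_similarity(ds[0]["embeddings"], ds[1]["embeddings"])
```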
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-16-finetuned-squad-seed-8
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-16-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-16-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
[768-dimensional embedding vector omitted] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-0
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-256-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[768-dimensional embedding vector omitted]
0.02569677121937275,
-0.033281803131103516,
-0.015502572990953922,
0.015356785617768764,
-0.12564992904663086,
-0.11793103069067001,
0.0375950001180172,
-0.04370252043008804,
-0.15552504360675812,
0.005575826391577721,
0.10222597420215607,
-0.037922367453575134,
-0.012225257232785225,
-0.01042212639003992,
0.024026930332183838,
-0.012828191742300987,
0.2017975002527237,
0.041305478662252426,
0.06217173486948013,
-0.10264046490192413,
0.12626071274280548,
0.057863749563694,
-0.04617207497358322,
0.05692310631275177,
0.06755880266427994,
-0.09362863749265671,
-0.005304123740643263,
0.10788540542125702,
0.16965503990650177,
-0.04637343809008598,
-0.019556652754545212,
-0.07239942252635956,
-0.07619135081768036,
0.05760735645890236,
0.1607881486415863,
0.04932352900505066,
-0.006157172843813896,
-0.04424959793686867,
0.024982823058962822,
-0.12660649418830872,
0.0744875967502594,
0.04729773476719856,
0.06757184863090515,
-0.10598773509263992,
0.1086457148194313,
-0.008761907927691936,
0.03520260751247406,
-0.01582617685198784,
0.02882334589958191,
-0.09670279920101166,
-0.025216180831193924,
-0.12395156919956207,
0.020192399621009827,
-0.010134195908904076,
0.00743618281558156,
-0.010721050202846527,
-0.053650692105293274,
-0.03793444484472275,
0.025748785585165024,
-0.08085265010595322,
-0.05493757873773575,
0.016270704567432404,
0.045725926756858826,
-0.15568804740905762,
-0.012066548690199852,
0.023811141029000282,
-0.09351561963558197,
0.07686268538236618,
0.0651814267039299,
0.01637360081076622,
0.027631012722849846,
-0.10668185353279114,
-0.046590905636548996,
0.013035599142313004,
0.02899736724793911,
0.0842423141002655,
-0.08770842850208282,
-0.013902553357183933,
-0.033852338790893555,
0.049074314534664154,
0.014181500300765038,
0.08895569294691086,
-0.1160491332411766,
-0.003820637706667185,
-0.05685551464557648,
-0.03338665887713432,
-0.05965958535671234,
0.0343770906329155,
0.11598197370767593,
0.03547497093677521,
0.16918472945690155,
-0.06824197620153427,
0.038763437420129776,
-0.19385649263858795,
-0.03359712287783623,
0.0029116207733750343,
-0.04136328399181366,
-0.08534237742424011,
-0.04223744943737984,
0.09251725673675537,
-0.05131055414676666,
0.09317445755004883,
-0.007347883656620979,
0.09048344194889069,
0.032261040061712265,
-0.007265829481184483,
-0.05733870714902878,
0.0017611492658033967,
0.14763200283050537,
0.0597204715013504,
-0.018902191892266273,
0.10300745815038681,
-0.008646717295050621,
0.054928041994571686,
0.04345012456178665,
0.22635295987129211,
0.14895829558372498,
-0.019144758582115173,
0.0640442743897438,
0.07066897302865982,
-0.1208527684211731,
-0.12473534792661667,
0.14304426312446594,
-0.04668189585208893,
0.12317105382680893,
-0.054939206689596176,
0.21987082064151764,
0.0227162167429924,
-0.17554716765880585,
0.05240674689412117,
-0.05672750994563103,
-0.11991328746080399,
-0.12006925046443939,
-0.014877759851515293,
-0.08093324303627014,
-0.10009211301803589,
0.026066740974783897,
-0.1216055229306221,
0.0646291971206665,
0.11937996745109558,
0.016235873103141785,
0.021403349936008453,
0.1543084979057312,
-0.04202384874224663,
0.017466692253947258,
0.06070782616734505,
0.023918604478240013,
-0.006204670760780573,
-0.06041610240936279,
-0.06457378715276718,
0.05101453512907028,
0.0354006290435791,
0.08574428409337997,
-0.04593770578503609,
0.018587850034236908,
0.03363526612520218,
-0.015311641618609428,
-0.0736336037516594,
0.012456521391868591,
0.02634706348180771,
0.04002200439572334,
0.0558859147131443,
0.05312859266996384,
0.016749342903494835,
-0.03577176108956337,
0.26667848229408264,
-0.07270786166191101,
-0.08422647416591644,
-0.1313142329454422,
0.19909687340259552,
0.016834300011396408,
-0.018646661192178726,
0.07583948969841003,
-0.10400285571813583,
-0.02409570850431919,
0.16807156801223755,
0.1246197298169136,
-0.10310599952936172,
-0.02975832298398018,
-0.01592084765434265,
-0.011792131699621677,
-0.03887273743748665,
0.1169530525803566,
0.09363944083452225,
0.008634845726191998,
-0.0769449844956398,
-0.02945549413561821,
-0.017371295019984245,
-0.04387956112623215,
-0.06331046670675278,
0.03730476647615433,
0.014988914132118225,
0.0016101571964100003,
-0.03924361243844032,
0.05118577554821968,
-0.012710613198578358,
-0.24431802332401276,
0.030797164887189865,
-0.15699723362922668,
-0.18074873089790344,
-0.03137269243597984,
0.06295957416296005,
-0.005625128746032715,
0.039072539657354355,
-0.018671656027436256,
0.003997810184955597,
0.14943765103816986,
-0.035234738141298294,
-0.04835937172174454,
-0.1214136928319931,
0.1021135225892067,
-0.09831856191158295,
0.20543627440929413,
0.007282991427928209,
0.08362080901861191,
0.09772340953350067,
0.022782064974308014,
-0.13468682765960693,
0.03440191596746445,
0.07474088668823242,
-0.11046332865953445,
0.013809367083013058,
0.15163861215114594,
-0.05499950423836708,
0.08134657144546509,
0.026311585679650307,
-0.10347211360931396,
-0.018637975677847862,
-0.02751612663269043,
-0.03221513703465462,
-0.07973327487707138,
-0.01567693240940571,
-0.06388749927282333,
0.1656382828950882,
0.22119325399398804,
-0.024204058572649956,
0.0160406194627285,
-0.0867454782128334,
0.016784850507974625,
0.04416806250810623,
0.051875725388526917,
-0.04290994629263878,
-0.20519496500492096,
0.030049443244934082,
0.022811170667409897,
0.02368064783513546,
-0.19848501682281494,
-0.0837051197886467,
0.04710150882601738,
-0.02864154428243637,
-0.05170806869864464,
0.10083667933940887,
0.023135384544730186,
0.04460886865854263,
-0.03483352065086365,
-0.09501584619283676,
-0.044431980699300766,
0.14318566024303436,
-0.1624550223350525,
-0.05394567549228668
] |
null | null |
transformers
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
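Below is a minimal inference sketch, assuming the standard `transformers` question-answering pipeline; the question/context strings are illustrative assumptions, not taken from SQuAD.

```python
from transformers import pipeline

# Load this checkpoint as an extractive question-answering pipeline.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10",
)

# SQuAD-style inputs: the answer is a span copied out of the context.
result = qa(
    question="What dataset was the checkpoint fine-tuned on?",
    context="The checkpoint is a version of bert-base-uncased fine-tuned on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```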
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
…768-dimensional embedding vector; individual float values elided…
] |
null | null |
transformers
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
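As noted above, the sketch below maps these values onto `transformers.TrainingArguments`. It is a reconstruction, not the original training script: the `output_dir` is an assumed name, and the Adam betas/epsilon are passed explicitly even though they match the library defaults.

```python
from transformers import TrainingArguments

# Mirror of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2",  # assumed name
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,        # library default, stated on this card
    adam_beta2=0.999,      # library default, stated on this card
    adam_epsilon=1e-8,     # library default, stated on this card
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```

These arguments would then be passed to a `Trainer` alongside the few-shot SQuAD subset; that wiring is not shown on this card.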
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
…768-dimensional embedding vector; individual float values elided…
] |
null | null |
transformers
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
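For lower-level use, a hedged sketch with the `transformers` auto classes is shown below; the span-decoding logic is the standard extractive-QA recipe, and the example strings are assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_id = "anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What was the model fine-tuned on?"
context = "The checkpoint was fine-tuned on the SQuAD dataset."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start/end token positions and decode that span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0, start : end + 1])
print(answer)
```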
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.098466657102108,
0.15393711626529694,
-0.0021470931824296713,
0.09225495159626007,
0.1381899118423462,
0.040056467056274414,
0.08805861324071884,
0.1341075599193573,
-0.0818236768245697,
0.07333383709192276,
0.07886374741792679,
0.05307834967970848,
0.05106872320175171,
0.12357530742883682,
-0.04623623937368393,
-0.20981992781162262,
0.012609981000423431,
-0.019316937774419785,
-0.05731016397476196,
0.09838426858186722,
0.08194074779748917,
-0.10474354028701782,
0.07996905595064163,
-0.020141463726758957,
-0.15452058613300323,
0.010120649822056293,
-0.03896871581673622,
-0.022935181856155396,
0.09533975273370743,
-0.007122326642274857,
0.09701830893754959,
0.010859298519790173,
0.13985088467597961,
-0.2104482650756836,
-0.00021444105368573219,
0.0778186097741127,
0.03649325296282768,
0.08908287435770035,
0.045287881046533585,
0.02683170884847641,
0.050181373953819275,
-0.15285542607307434,
0.09004425257444382,
0.025392798706889153,
-0.07630027830600739,
-0.09055844694375992,
-0.09051264077425003,
0.01851128786802292,
0.07844704389572144,
0.0828097015619278,
0.008576265536248684,
0.13145756721496582,
-0.09183917939662933,
0.08214445412158966,
0.1960049867630005,
-0.27496740221977234,
-0.06721366196870804,
0.046320680528879166,
0.05201422795653343,
0.06810149550437927,
-0.11820325255393982,
-0.02778257057070732,
0.02949884720146656,
0.033224161714315414,
0.09449545294046402,
-0.019803866744041443,
-0.13922977447509766,
0.0015697014750912786,
-0.13305780291557312,
-0.01420819666236639,
0.1105635017156601,
0.0494857020676136,
-0.04463272541761398,
-0.06343919783830643,
-0.07679936289787292,
-0.10757549852132797,
-0.02413635142147541,
-0.021168453618884087,
0.0525604709982872,
-0.05530167371034622,
-0.057383943349123,
-0.03973611816763878,
-0.05934436619281769,
-0.08665326982736588,
-0.0025007009971886873,
0.11942721903324127,
0.04438989609479904,
0.024384401738643646,
-0.03356091305613518,
0.10281870514154434,
0.00123937230091542,
-0.13242729008197784,
-0.016049034893512726,
0.00848228670656681,
-0.12286179512739182,
-0.0569223128259182,
-0.028527313843369484,
0.015612942166626453,
0.01688683032989502,
0.14810501039028168,
-0.04095354303717613,
0.08208068460226059,
0.01789775863289833,
-0.021068058907985687,
-0.011774259619414806,
0.14195626974105835,
-0.04004097357392311,
-0.053288888186216354,
0.005246221553534269,
0.09728635847568512,
-0.0008922002743929625,
-0.004421499092131853,
-0.07378964126110077,
-0.01875905878841877,
0.09201807528734207,
0.05530339851975441,
-0.05639640614390373,
0.03379053622484207,
-0.03142315149307251,
-0.02461903728544712,
0.022605348378419876,
-0.1247371956706047,
0.035215914249420166,
0.008905858732759953,
-0.08341724425554276,
-0.03874441608786583,
0.011199546046555042,
-0.018044766038656235,
-0.020382432267069817,
0.08808164298534393,
-0.08675365149974823,
-0.006326593924313784,
-0.07740433514118195,
-0.07295642048120499,
0.014666386879980564,
-0.14306135475635529,
-0.013549204915761948,
-0.04878036677837372,
-0.2016415297985077,
-0.03802341967821121,
0.043407026678323746,
-0.07649710774421692,
-0.04357921704649925,
-0.05786547809839249,
-0.08360323309898376,
0.014334645122289658,
-0.00445924699306488,
0.1681874692440033,
-0.05976671725511551,
0.07676298916339874,
-0.01355504896491766,
0.04182007536292076,
0.010659354738891125,
0.04850928485393524,
-0.08597330749034882,
0.023113710805773735,
-0.14173908531665802,
0.07900940626859665,
-0.09623045474290848,
0.010007216595113277,
-0.12445975095033646,
-0.08949950337409973,
0.040639616549015045,
-0.02720584347844124,
0.06860800087451935,
0.14354078471660614,
-0.19150228798389435,
0.0025256345979869366,
0.11552674323320389,
-0.04740289971232414,
-0.04572925344109535,
0.07909982651472092,
-0.05694746971130371,
0.030421169474720955,
0.05135497823357582,
0.16653519868850708,
0.07575546950101852,
-0.14835608005523682,
-0.0176799688488245,
0.026112399995326996,
0.04911305010318756,
0.005484194029122591,
0.045132264494895935,
0.006229943595826626,
0.02415253035724163,
0.00426699360832572,
-0.07777632027864456,
-0.02566860057413578,
-0.08999831229448318,
-0.07129894942045212,
-0.05190955847501755,
-0.08854958415031433,
0.023342709988355637,
0.010968676768243313,
0.021178780123591423,
-0.05846371129155159,
-0.10210278630256653,
0.10237647593021393,
0.1296471208333969,
-0.05332311615347862,
0.006315281614661217,
-0.0700889527797699,
0.020392002537846565,
-0.0314597487449646,
-0.03529680147767067,
-0.1942061334848404,
-0.13627390563488007,
0.05280975252389908,
-0.05067528411746025,
0.03548325598239899,
0.03652585297822952,
0.06392089277505875,
0.06304911524057388,
-0.033139556646347046,
-0.019670136272907257,
-0.06936279684305191,
-0.0037510215770453215,
-0.11354056745767593,
-0.1984696090221405,
-0.06428472697734833,
-0.03295038267970085,
0.14304304122924805,
-0.20333118736743927,
-0.00035788633977063,
-0.02560589089989662,
0.1198728084564209,
0.015145785175263882,
-0.05688675120472908,
0.008564758114516735,
0.029361573979258537,
0.007139472756534815,
-0.09930040687322617,
0.04124167189002037,
0.008887682110071182,
-0.054060209542512894,
-0.06066368892788887,
-0.10771824419498444,
-0.0026594630908221006,
0.058152198791503906,
0.07922494411468506,
-0.10530875623226166,
-0.0068078977055847645,
-0.05055375397205353,
-0.04586179554462433,
-0.08650428801774979,
0.009313770569860935,
0.19845585525035858,
0.03179633244872093,
0.12126386165618896,
-0.060821447521448135,
-0.07029270380735397,
0.0018450767965987325,
0.02416512742638588,
0.024065613746643066,
0.09062279760837555,
0.10414904356002808,
-0.09762682020664215,
0.08680173754692078,
0.06957690417766571,
-0.04843231290578842,
0.11860979348421097,
-0.04533987492322922,
-0.0816497653722763,
-0.019770409911870956,
0.006621147971600294,
-0.03478475287556648,
0.13955533504486084,
-0.08453643321990967,
0.011048471555113792,
0.033118732273578644,
0.02382812649011612,
0.017917994409799576,
-0.16967308521270752,
-0.0002474521170370281,
0.011624990962445736,
-0.06428474187850952,
-0.03804251551628113,
-0.024322697892785072,
0.035825133323669434,
0.09333345293998718,
0.026126304641366005,
-0.04527052119374275,
0.017683636397123337,
-0.011598317883908749,
-0.06933333724737167,
0.19230052828788757,
-0.1074274331331253,
-0.09016456454992294,
-0.10968848317861557,
0.025770725682377815,
-0.044704411178827286,
-0.03694077581167221,
0.003911891486495733,
-0.08396515250205994,
-0.05497351288795471,
-0.08824675530195236,
-0.023555511608719826,
-0.01812783069908619,
-0.002494610846042633,
0.02477722428739071,
-0.015406248159706593,
0.07314811646938324,
-0.13068853318691254,
0.0052550723776221275,
-0.029734037816524506,
-0.10441091656684875,
0.009023788385093212,
0.06383508443832397,
0.08543732762336731,
0.09867328405380249,
-0.014311178587377071,
0.012744019739329815,
-0.026962261646986008,
0.23984764516353607,
-0.05458493530750275,
0.015487924218177795,
0.08349355310201645,
-0.007441223133355379,
0.06025945767760277,
0.14251765608787537,
0.03161157667636871,
-0.10415191203355789,
0.026606595143675804,
0.08264945447444916,
-0.011216379702091217,
-0.2484973818063736,
-0.02798895724117756,
-0.021018780767917633,
-0.07226104289293289,
0.0888686254620552,
0.03953084722161293,
-0.047895774245262146,
0.03909684717655182,
0.011437221430242062,
0.014892278239130974,
-0.05361856892704964,
0.07233205437660217,
0.08799073100090027,
0.043624263256788254,
0.09479481726884842,
-0.02219146490097046,
-0.013306207954883575,
0.06506897509098053,
0.017089474946260452,
0.271125853061676,
-0.02986379712820053,
0.12568028271198273,
0.027202490717172623,
0.1495351344347,
-0.029447168111801147,
0.04792901501059532,
0.01668008230626583,
0.005307900253683329,
-0.006865920498967171,
-0.052102744579315186,
-0.029035741463303566,
0.010293680243194103,
-0.03388866409659386,
0.030947422608733177,
-0.07353205233812332,
0.04728318378329277,
0.012251067906618118,
0.30153071880340576,
0.045358963310718536,
-0.28368452191352844,
-0.06156764179468155,
-0.002787951612845063,
-0.046733301132917404,
-0.07373254001140594,
0.004795243497937918,
0.1466534286737442,
-0.12536348402500153,
0.037480395287275314,
-0.054611433297395706,
0.087239570915699,
-0.04832728952169418,
0.0028690434992313385,
0.06620117276906967,
0.14688073098659515,
-0.009375245310366154,
0.0700221061706543,
-0.18820883333683014,
0.22445401549339294,
0.02818034216761589,
0.11004684120416641,
-0.06077646464109421,
0.014012456871569157,
0.010321232490241528,
0.027044568210840225,
0.10968595743179321,
-0.0003003958554472774,
-0.025859955698251724,
-0.170199915766716,
-0.11496121436357498,
0.06003899499773979,
0.11678969115018845,
-0.015517833642661572,
0.09741572290658951,
-0.04070666432380676,
-0.0026510723400861025,
0.03689400851726532,
-0.07488909363746643,
-0.12861888110637665,
-0.08738730847835541,
0.0001942133967531845,
0.018205245956778526,
-0.03984597697854042,
-0.04854270815849304,
-0.09231199324131012,
-0.024886231869459152,
0.13478389382362366,
-0.0005342251970432699,
-0.04335131496191025,
-0.1347472220659256,
0.05571722984313965,
0.14336992800235748,
-0.05782138183712959,
0.019640866667032242,
0.005031142849475145,
0.10173960030078888,
0.05256412923336029,
-0.08198868483304977,
0.05150981619954109,
-0.06726803630590439,
-0.1594841629266739,
-0.06052718311548233,
0.11684142798185349,
0.07949423789978027,
0.05311722680926323,
-0.0022119367495179176,
0.03181728348135948,
0.0013733574887737632,
-0.08693879097700119,
0.008363342843949795,
0.05181749910116196,
0.09112662822008133,
0.0495951808989048,
-0.09574196487665176,
0.04986283555626869,
-0.03416614234447479,
-0.0021139259915798903,
0.12717090547084808,
0.21384020149707794,
-0.08587879687547684,
0.08619027584791183,
0.07032834738492966,
-0.07946524024009705,
-0.17665009200572968,
0.0696486234664917,
0.1324092298746109,
0.01678735390305519,
0.03588606044650078,
-0.20637111365795135,
0.13787810504436493,
0.11254454404115677,
-0.014468871988356113,
0.05398506298661232,
-0.29527541995048523,
-0.12278173863887787,
0.07015765458345413,
0.10487686842679977,
0.043047789484262466,
-0.12688854336738586,
-0.02715339884161949,
-0.009930893778800964,
-0.14508327841758728,
0.14361067116260529,
-0.07356823235750198,
0.12213703989982605,
-0.007912439294159412,
0.12358645349740982,
0.021886663511395454,
-0.04111918807029724,
0.13073016703128815,
0.07665234059095383,
0.08684324473142624,
-0.03874978795647621,
-0.0016862696502357721,
0.04853008687496185,
-0.06567217409610748,
0.04854224622249603,
-0.041610829532146454,
0.06663480401039124,
-0.16574902832508087,
-0.00021306784765329212,
-0.08397293835878372,
0.04259762912988663,
-0.04867219179868698,
-0.04616757109761238,
-0.027666307985782623,
0.04695925861597061,
0.06658902019262314,
-0.034299030900001526,
0.026247277855873108,
0.021864641457796097,
0.053690411150455475,
0.09288965910673141,
0.08895702660083771,
-0.016619563102722168,
-0.10964460670948029,
0.011322898790240288,
-0.005895086098462343,
0.05217582732439041,
-0.10137277096509933,
0.01542446855455637,
0.13527911901474,
0.0569138377904892,
0.12607575953006744,
0.02608952485024929,
-0.032640308141708374,
-0.01504136435687542,
0.01607612334191799,
-0.12494348734617233,
-0.11949712783098221,
0.037443600594997406,
-0.04786038398742676,
-0.15643581748008728,
0.005755642894655466,
0.10223983228206635,
-0.0380035862326622,
-0.012587335892021656,
-0.011310437694191933,
0.02392515353858471,
-0.013703604228794575,
0.20238077640533447,
0.041165802627801895,
0.06272093951702118,
-0.10279219597578049,
0.12503698468208313,
0.057843003422021866,
-0.046306271106004715,
0.056235332041978836,
0.06769983470439911,
-0.09434715658426285,
-0.005804480519145727,
0.10760977119207382,
0.17057478427886963,
-0.04423470422625542,
-0.019965805113315582,
-0.07342416793107986,
-0.0763888731598854,
0.057240091264247894,
0.15786193311214447,
0.04977346956729889,
-0.007090540137141943,
-0.04411766305565834,
0.024923548102378845,
-0.12648490071296692,
0.07390710711479187,
0.047501467168331146,
0.06786788254976273,
-0.10645068436861038,
0.10982457548379898,
-0.008988503366708755,
0.03660820797085762,
-0.01566866971552372,
0.028953906148672104,
-0.0963587611913681,
-0.025879286229610443,
-0.1236119270324707,
0.019827421754598618,
-0.010053776204586029,
0.007698704954236746,
-0.010660890489816666,
-0.05236463248729706,
-0.03796008974313736,
0.02616121992468834,
-0.08101142197847366,
-0.05480168014764786,
0.017346888780593872,
0.04620044678449631,
-0.15340644121170044,
-0.012149614281952381,
0.023306624963879585,
-0.0938725620508194,
0.07753590494394302,
0.0647818073630333,
0.01707855798304081,
0.02816794440150261,
-0.1021670326590538,
-0.04639391601085663,
0.013597635552287102,
0.02819078229367733,
0.08369582891464233,
-0.08747927099466324,
-0.014221018180251122,
-0.03393610194325447,
0.0501210056245327,
0.01382068358361721,
0.08843542635440826,
-0.11521349102258682,
-0.0030514057725667953,
-0.05753020942211151,
-0.0330343022942543,
-0.06078813597559929,
0.034188058227300644,
0.11498746275901794,
0.035754475742578506,
0.16972596943378448,
-0.06798618286848068,
0.0380968302488327,
-0.1944582462310791,
-0.03428301587700844,
0.0021764570847153664,
-0.04095260426402092,
-0.08525579422712326,
-0.04338443651795387,
0.09225042909383774,
-0.051717858761548996,
0.0938287079334259,
-0.008249095641076565,
0.08943179249763489,
0.031133992597460747,
-0.0071371858939528465,
-0.056772418320178986,
0.002416357398033142,
0.14653466641902924,
0.059081364423036575,
-0.019526124000549316,
0.10180220752954483,
-0.008340497501194477,
0.05485750734806061,
0.043114129453897476,
0.22400864958763123,
0.14831304550170898,
-0.01885618455708027,
0.06405448913574219,
0.07135477662086487,
-0.12078732252120972,
-0.124625064432621,
0.14477403461933136,
-0.046584539115428925,
0.12231122702360153,
-0.05454503372311592,
0.21897074580192566,
0.022622890770435333,
-0.1755601167678833,
0.05162300169467926,
-0.05662725493311882,
-0.12085580080747604,
-0.11897484213113785,
-0.01380206923931837,
-0.0814657211303711,
-0.09983948618173599,
0.026346851140260696,
-0.12222716957330704,
0.06344519555568695,
0.12077264487743378,
0.01617281883955002,
0.020900582894682884,
0.15434105694293976,
-0.040476709604263306,
0.018142830580472946,
0.06076844781637192,
0.022830264642834663,
-0.004827477037906647,
-0.058999888598918915,
-0.06441856175661087,
0.05096201226115227,
0.03468357399106026,
0.08546938747167587,
-0.04742100089788437,
0.018315806984901428,
0.03315670043230057,
-0.014869106002151966,
-0.0729929730296135,
0.01218736357986927,
0.026497790589928627,
0.0400417298078537,
0.05469284579157829,
0.05340895801782608,
0.015586006455123425,
-0.03570820018649101,
0.2651890218257904,
-0.07230691611766815,
-0.0832354724407196,
-0.1318701058626175,
0.19827976822853088,
0.016129953786730766,
-0.01840021088719368,
0.07624798268079758,
-0.10257042944431305,
-0.022369537502527237,
0.16854321956634521,
0.1251838356256485,
-0.10335730761289597,
-0.030248334631323814,
-0.01535437535494566,
-0.012134609743952751,
-0.03888601064682007,
0.1188197135925293,
0.09355708956718445,
0.006812563166022301,
-0.07751326262950897,
-0.02996605448424816,
-0.017353888601064682,
-0.043718013912439346,
-0.06390227377414703,
0.03662668168544769,
0.015628019347786903,
0.0023935099598020315,
-0.03796277567744255,
0.0520947240293026,
-0.011213932186365128,
-0.24388588964939117,
0.03069109097123146,
-0.15454721450805664,
-0.18164370954036713,
-0.03187102824449539,
0.06249966472387314,
-0.004964772146195173,
0.039608582854270935,
-0.01964382641017437,
0.0037449155934154987,
0.15032748878002167,
-0.0354229174554348,
-0.04789821058511734,
-0.12089766561985016,
0.10340835899114609,
-0.09958169609308243,
0.2041902095079422,
0.0075696054846048355,
0.08505003899335861,
0.0973985344171524,
0.021783364936709404,
-0.1347663253545761,
0.03430958837270737,
0.07379516959190369,
-0.10894311964511871,
0.015313141979277134,
0.15162360668182373,
-0.054543957114219666,
0.07924938201904297,
0.025886397808790207,
-0.10253030061721802,
-0.0186114851385355,
-0.026243818923830986,
-0.03158832713961601,
-0.08026740700006485,
-0.015487277880311012,
-0.06472977250814438,
0.16538245975971222,
0.2220519334077835,
-0.024871690198779106,
0.01679159514605999,
-0.08635232597589493,
0.016587350517511368,
0.04514966905117035,
0.05251859873533249,
-0.04285082966089249,
-0.20565037429332733,
0.02900966815650463,
0.021240750327706337,
0.0234364066272974,
-0.1987474113702774,
-0.0833844467997551,
0.046058736741542816,
-0.028459828346967697,
-0.051946256309747696,
0.10121288895606995,
0.0241509098559618,
0.04459133744239807,
-0.03429654985666275,
-0.09423165768384933,
-0.0451827198266983,
0.1431913822889328,
-0.16266091167926788,
-0.05414127930998802
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
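
Since this checkpoint is tagged for question answering, a hedged usage sketch with the `pipeline` API follows; the question/context pair is invented for illustration and is not drawn from SQuAD.

```python
from transformers import pipeline

# Usage sketch: loads this card's checkpoint into a QA pipeline.
# The example question/context below is illustrative only.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6",
)
result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of bert-base-uncased "
            "on the squad dataset.",
)
print(result["answer"], result["score"])  # predicted span plus confidence
```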
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09821498394012451,
0.15335752069950104,
-0.0021143516059964895,
0.0918867439031601,
0.1385621279478073,
0.03953501954674721,
0.08798553794622421,
0.1347069889307022,
-0.08043037354946136,
0.07370304316282272,
0.07819019258022308,
0.0542726032435894,
0.051483966410160065,
0.12303294986486435,
-0.0462697371840477,
-0.20962809026241302,
0.012890228070318699,
-0.019028862938284874,
-0.05710376426577568,
0.09860493242740631,
0.08197688311338425,
-0.10492590069770813,
0.0791793242096901,
-0.02060914784669876,
-0.15500155091285706,
0.010302236303687096,
-0.039445098489522934,
-0.022358488291502,
0.09503266960382462,
-0.007796466816216707,
0.09663069993257523,
0.011051248759031296,
0.13966558873653412,
-0.21125070750713348,
-0.00016282529395539314,
0.07813729345798492,
0.03666534274816513,
0.0892321839928627,
0.0461924783885479,
0.02681669034063816,
0.05187668651342392,
-0.15258070826530457,
0.08981378376483917,
0.025990860536694527,
-0.07618119567632675,
-0.08819875121116638,
-0.09090282022953033,
0.01764390617609024,
0.0789322555065155,
0.08351446688175201,
0.008022893220186234,
0.13270284235477448,
-0.09251244366168976,
0.08229897171258926,
0.1980440616607666,
-0.27331987023353577,
-0.0672459602355957,
0.04762018099427223,
0.05200424790382385,
0.06691980361938477,
-0.11915113776922226,
-0.02881808392703533,
0.029227396473288536,
0.0329451709985733,
0.09320533275604248,
-0.019053705036640167,
-0.14080362021923065,
0.0012276831548660994,
-0.13352924585342407,
-0.014421889558434486,
0.10882490128278732,
0.04974987730383873,
-0.04388739541172981,
-0.06325916945934296,
-0.07718811184167862,
-0.10643552243709564,
-0.02397122047841549,
-0.021080907434225082,
0.052443426102399826,
-0.05530468001961708,
-0.05705755576491356,
-0.03957713022828102,
-0.059106986969709396,
-0.08786569535732269,
-0.0021168917883187532,
0.11890815198421478,
0.0443943589925766,
0.02403099834918976,
-0.03434458002448082,
0.1027076318860054,
0.001220930484123528,
-0.13287334144115448,
-0.01689404621720314,
0.009121609851717949,
-0.12312058359384537,
-0.05698065459728241,
-0.02802090160548687,
0.013423754833638668,
0.016471534967422485,
0.1453869640827179,
-0.04094027727842331,
0.08252159506082535,
0.017126942053437233,
-0.020997634157538414,
-0.012283777818083763,
0.14228376746177673,
-0.039275795221328735,
-0.051077950745821,
0.004382222890853882,
0.09743622690439224,
-0.0015108182560652494,
-0.0039509134367108345,
-0.07327555119991302,
-0.018705278635025024,
0.09286226332187653,
0.05508154258131981,
-0.055745240300893784,
0.03324708342552185,
-0.03193739801645279,
-0.024773795157670975,
0.023226650431752205,
-0.124441958963871,
0.03549566492438316,
0.008479814976453781,
-0.08379945904016495,
-0.039895199239254,
0.011313785798847675,
-0.018300054594874382,
-0.02071205899119377,
0.08900430053472519,
-0.08714023977518082,
-0.005949352867901325,
-0.07832685858011246,
-0.07252557575702667,
0.015246884897351265,
-0.14464819431304932,
-0.013805664144456387,
-0.047749895602464676,
-0.20211242139339447,
-0.038119275122880936,
0.04283933714032173,
-0.07681435346603394,
-0.04309208318591118,
-0.05882944166660309,
-0.08446045964956284,
0.014791818335652351,
-0.0039210813120007515,
0.16918951272964478,
-0.05932249873876572,
0.07752840965986252,
-0.013763519003987312,
0.04139011353254318,
0.011184683069586754,
0.04929189011454582,
-0.08729861676692963,
0.022776545956730843,
-0.14106854796409607,
0.07933063060045242,
-0.09725410491228104,
0.009754995815455914,
-0.12540404498577118,
-0.0887002944946289,
0.039357803761959076,
-0.02795264683663845,
0.06953874230384827,
0.1444227397441864,
-0.19184844195842743,
0.0029969611205160618,
0.11609335988759995,
-0.047922562807798386,
-0.046005140990018845,
0.07799742370843887,
-0.05666094645857811,
0.030138812959194183,
0.05059331655502319,
0.16698269546031952,
0.07526282966136932,
-0.1480533629655838,
-0.01911015436053276,
0.025575101375579834,
0.049956273287534714,
0.00500792870298028,
0.04486902058124542,
0.006190320942550898,
0.02574268728494644,
0.004125270992517471,
-0.07781559973955154,
-0.02592320926487446,
-0.09033533185720444,
-0.07079961895942688,
-0.05271860212087631,
-0.088603176176548,
0.02272694557905197,
0.012686747126281261,
0.020809346809983253,
-0.05795540660619736,
-0.1016785278916359,
0.1021716520190239,
0.1294422447681427,
-0.05264892429113388,
0.006490140687674284,
-0.06947632133960724,
0.01941653899848461,
-0.03228125721216202,
-0.03512445464730263,
-0.19592539966106415,
-0.13716189563274384,
0.053438395261764526,
-0.051283154636621475,
0.03603757917881012,
0.03579225391149521,
0.06381642073392868,
0.06187713146209717,
-0.033287350088357925,
-0.019854947924613953,
-0.07021871209144592,
-0.0042807357385754585,
-0.11372189968824387,
-0.19856488704681396,
-0.0643693208694458,
-0.03383241593837738,
0.14067305624485016,
-0.20252373814582825,
-0.0007683045114390552,
-0.02557724341750145,
0.12067022174596786,
0.01520405150949955,
-0.0574355348944664,
0.008851436898112297,
0.029091374948620796,
0.006821172311902046,
-0.09955710917711258,
0.041092392057180405,
0.008003047667443752,
-0.05386142060160637,
-0.06062969192862511,
-0.10764697194099426,
-0.004987150896340609,
0.05762296915054321,
0.08081134408712387,
-0.10534932464361191,
-0.006925483699887991,
-0.050563450902700424,
-0.04605744779109955,
-0.08747374266386032,
0.01050608791410923,
0.19786888360977173,
0.0322532020509243,
0.12036620825529099,
-0.0611402802169323,
-0.07056724280118942,
0.0022968421690165997,
0.024325013160705566,
0.023220693692564964,
0.0919354110956192,
0.10599172115325928,
-0.10015297681093216,
0.08783285319805145,
0.07031138986349106,
-0.047667551785707474,
0.11889824271202087,
-0.04568033665418625,
-0.08216365426778793,
-0.018245309591293335,
0.0072026909328997135,
-0.034578025341033936,
0.13920767605304718,
-0.08400712162256241,
0.011596985161304474,
0.032954536378383636,
0.024205053225159645,
0.017794886603951454,
-0.1698615401983261,
-0.00028533139266073704,
0.01127845048904419,
-0.06415732949972153,
-0.03853587433695793,
-0.02400311641395092,
0.0358048640191555,
0.09358559548854828,
0.025776349008083344,
-0.044888559728860855,
0.017034299671649933,
-0.011883731931447983,
-0.06907161325216293,
0.1931905448436737,
-0.10695824772119522,
-0.08898291736841202,
-0.10835154354572296,
0.0264995489269495,
-0.044369690120220184,
-0.03719858080148697,
0.00400123093277216,
-0.08518843352794647,
-0.05482722818851471,
-0.08789962530136108,
-0.02333015389740467,
-0.017549777403473854,
-0.0035269134677946568,
0.023902354761958122,
-0.015203779563307762,
0.07245906442403793,
-0.1312798112630844,
0.005512257572263479,
-0.03028498776257038,
-0.10457415133714676,
0.008631007745862007,
0.06334147602319717,
0.08555949479341507,
0.0991496816277504,
-0.014684333465993404,
0.012791730463504791,
-0.026925858110189438,
0.23880507051944733,
-0.05501898378133774,
0.014784038998186588,
0.08324408531188965,
-0.006412798073142767,
0.060931283980607986,
0.142755389213562,
0.03095448762178421,
-0.1040181815624237,
0.027061576023697853,
0.08332289755344391,
-0.011609473265707493,
-0.24957109987735748,
-0.02842443250119686,
-0.021291323006153107,
-0.0727626159787178,
0.08884013444185257,
0.03991217911243439,
-0.04844335839152336,
0.039162777364254,
0.010660254396498203,
0.012953301891684532,
-0.05375758558511734,
0.07271802425384521,
0.0857558399438858,
0.04466395452618599,
0.09441892057657242,
-0.022266177460551262,
-0.012875057756900787,
0.06413968652486801,
0.017509175464510918,
0.2719872295856476,
-0.03025089018046856,
0.12532088160514832,
0.026875285431742668,
0.14910995960235596,
-0.03008144162595272,
0.04818008095026016,
0.01621090993285179,
0.005219935439527035,
-0.006832202896475792,
-0.05193387717008591,
-0.029076922684907913,
0.010084299370646477,
-0.03462223708629608,
0.03144354745745659,
-0.07331182807683945,
0.047908373177051544,
0.011829512193799019,
0.3019528090953827,
0.04559113830327988,
-0.28259846568107605,
-0.06067895516753197,
-0.002804908901453018,
-0.04689592495560646,
-0.07434364408254623,
0.004393258597701788,
0.14576825499534607,
-0.12488871812820435,
0.037269171327352524,
-0.055071450769901276,
0.08815760910511017,
-0.046573251485824585,
0.002218228532001376,
0.06547591090202332,
0.14712034165859222,
-0.009130457416176796,
0.07070992141962051,
-0.18925468623638153,
0.22645296156406403,
0.02815546654164791,
0.10987547039985657,
-0.06124797835946083,
0.013689197599887848,
0.009592296555638313,
0.02512984909117222,
0.11035905033349991,
-0.0004285209288354963,
-0.02627997286617756,
-0.16914401948451996,
-0.11427655071020126,
0.05976954475045204,
0.11768484115600586,
-0.015613782219588757,
0.09674014151096344,
-0.040722474455833435,
-0.0025528976693749428,
0.03772120550274849,
-0.07471916824579239,
-0.1286063939332962,
-0.08752871304750443,
0.0004795779241248965,
0.017585977911949158,
-0.040823470801115036,
-0.048129621893167496,
-0.09200536459684372,
-0.023348238319158554,
0.1344653218984604,
-0.00016478517500218004,
-0.04276159405708313,
-0.13461242616176605,
0.05683630332350731,
0.14351774752140045,
-0.058378297835588455,
0.019402973353862762,
0.004317841492593288,
0.101483553647995,
0.05176210403442383,
-0.08249927312135696,
0.05239028483629227,
-0.06719579547643661,
-0.15986303985118866,
-0.0604780837893486,
0.11656897515058517,
0.08004040271043777,
0.05361269414424896,
-0.002439321717247367,
0.03201805055141449,
0.001473850104957819,
-0.08661004900932312,
0.008650685660541058,
0.051777180284261703,
0.09062923491001129,
0.05053381994366646,
-0.09537222981452942,
0.048686958849430084,
-0.03504622355103493,
-0.002634537871927023,
0.12696775794029236,
0.21464383602142334,
-0.08565660566091537,
0.08645670861005783,
0.0709807425737381,
-0.07968691736459732,
-0.17734026908874512,
0.07040810585021973,
0.13286854326725006,
0.01612928695976734,
0.03663504496216774,
-0.20605143904685974,
0.1376979649066925,
0.11179354786872864,
-0.014918969012796879,
0.053064681589603424,
-0.2962619662284851,
-0.12251488119363785,
0.07030364125967026,
0.10487066954374313,
0.0410815104842186,
-0.12697887420654297,
-0.02723388560116291,
-0.009454079903662205,
-0.14443658292293549,
0.144710972905159,
-0.07162102311849594,
0.12243542820215225,
-0.008378240279853344,
0.12424830347299576,
0.02213817462325096,
-0.04163791984319687,
0.12914031744003296,
0.07675807923078537,
0.08708322048187256,
-0.03862336277961731,
-0.0018196989549323916,
0.048703473061323166,
-0.06574901938438416,
0.04873214289546013,
-0.041834913194179535,
0.06724845618009567,
-0.1641690582036972,
0.00023789027181919664,
-0.08486547321081161,
0.04308547079563141,
-0.048679839819669724,
-0.04623357206583023,
-0.027783991768956184,
0.04737243056297302,
0.06693211942911148,
-0.03453638032078743,
0.02811960130929947,
0.021847080439329147,
0.05544530227780342,
0.09272784739732742,
0.08946187794208527,
-0.0161434318870306,
-0.10843201726675034,
0.010554618202149868,
-0.005149901378899813,
0.052236054092645645,
-0.10236266255378723,
0.01499665342271328,
0.13519087433815002,
0.05813158303499222,
0.1258777379989624,
0.02656593918800354,
-0.03363727405667305,
-0.015028911642730236,
0.015224440023303032,
-0.12433688342571259,
-0.12049249559640884,
0.03771955892443657,
-0.04330342635512352,
-0.15711422264575958,
0.006291260942816734,
0.10055773705244064,
-0.03872828930616379,
-0.012715908698737621,
-0.011075417511165142,
0.024427801370620728,
-0.012887670658528805,
0.20298771560192108,
0.04136918857693672,
0.06329777836799622,
-0.10261063277721405,
0.1254761964082718,
0.05752112716436386,
-0.04736680909991264,
0.056400880217552185,
0.06795873492956161,
-0.09369539469480515,
-0.005846768151968718,
0.10904428362846375,
0.16920603811740875,
-0.04454330727458,
-0.018485497683286667,
-0.07216642051935196,
-0.07609396427869797,
0.05756524205207825,
0.15951858460903168,
0.049783628433942795,
-0.007329217158257961,
-0.04364929720759392,
0.025189116597175598,
-0.12726333737373352,
0.07437445968389511,
0.04700693488121033,
0.06826336681842804,
-0.10623318701982498,
0.10721645504236221,
-0.009124386124312878,
0.03704393282532692,
-0.015473444014787674,
0.029515383765101433,
-0.0961381122469902,
-0.02577422745525837,
-0.12099151313304901,
0.019565308466553688,
-0.009948872029781342,
0.007208419963717461,
-0.011170092038810253,
-0.05315891653299332,
-0.036918818950653076,
0.026294702664017677,
-0.08126325160264969,
-0.055418793112039566,
0.01670900732278824,
0.04586092010140419,
-0.15402768552303314,
-0.012224819511175156,
0.023762637749314308,
-0.0931256040930748,
0.07650347799062729,
0.06413962692022324,
0.017047427594661713,
0.028689604252576828,
-0.10546071082353592,
-0.04621860012412071,
0.014070097357034683,
0.02833569049835205,
0.0837109163403511,
-0.08678683638572693,
-0.01454269140958786,
-0.03416243940591812,
0.050327785313129425,
0.013560126535594463,
0.08769839257001877,
-0.11577518284320831,
-0.0036498813424259424,
-0.0590522326529026,
-0.03380395099520683,
-0.06007692590355873,
0.03500181809067726,
0.115225650370121,
0.03597096726298332,
0.16941606998443604,
-0.06765545904636383,
0.039002057164907455,
-0.1943521499633789,
-0.03400229662656784,
0.002441501710563898,
-0.04083890840411186,
-0.0857536569237709,
-0.04260144755244255,
0.0924578607082367,
-0.051812347024679184,
0.09101930260658264,
-0.007918345741927624,
0.09030841290950775,
0.03146897256374359,
-0.006334703415632248,
-0.05698329582810402,
0.0023559723049402237,
0.14596772193908691,
0.05885656550526619,
-0.019074173644185066,
0.1038035899400711,
-0.008073828183114529,
0.05428674444556236,
0.04434565082192421,
0.22585590183734894,
0.14960549771785736,
-0.020288873463869095,
0.0640576034784317,
0.07163789868354797,
-0.12111058086156845,
-0.12436577677726746,
0.14457759261131287,
-0.045868486166000366,
0.12209441512823105,
-0.05503217503428459,
0.21984761953353882,
0.02245238423347473,
-0.1751265525817871,
0.05224163830280304,
-0.057214103639125824,
-0.12067922949790955,
-0.11903729289770126,
-0.013873711228370667,
-0.08147858083248138,
-0.09955689311027527,
0.02687247470021248,
-0.12230083346366882,
0.06425180286169052,
0.12104509025812149,
0.015888044610619545,
0.02106720767915249,
0.15543782711029053,
-0.03981336951255798,
0.018341803923249245,
0.06119799613952637,
0.022877711802721024,
-0.00534761231392622,
-0.059369828552007675,
-0.0636630728840828,
0.052060145884752274,
0.034979309886693954,
0.08539317548274994,
-0.04708172008395195,
0.017221149057149887,
0.03278260678052902,
-0.01502187643200159,
-0.07319606095552444,
0.012372673489153385,
0.02690386027097702,
0.03992481157183647,
0.05510367080569267,
0.053217023611068726,
0.01620575226843357,
-0.035856518894433975,
0.2671401798725128,
-0.07274377346038818,
-0.08318158239126205,
-0.13054166734218597,
0.1989666074514389,
0.01714560016989708,
-0.018389228731393814,
0.0762026384472847,
-0.10349670797586441,
-0.022658219560980797,
0.166514053940773,
0.12319792807102203,
-0.10252571105957031,
-0.029983917251229286,
-0.015778131783008575,
-0.011927035637199879,
-0.038938917219638824,
0.11826004087924957,
0.09412543475627899,
0.007534319534897804,
-0.07805885374546051,
-0.029383322224020958,
-0.0173879973590374,
-0.04450646787881851,
-0.06285427510738373,
0.036443937569856644,
0.01620538718998432,
0.002021266147494316,
-0.03823833912611008,
0.05232945457100868,
-0.01180349662899971,
-0.24293096363544464,
0.029380034655332565,
-0.15581217408180237,
-0.1811666637659073,
-0.03201354295015335,
0.06230490282177925,
-0.004810079466551542,
0.039599087089300156,
-0.019670890644192696,
0.00424515875056386,
0.14751330018043518,
-0.03520911931991577,
-0.048117104917764664,
-0.12162864953279495,
0.10258470475673676,
-0.10036048293113708,
0.20491164922714233,
0.007587336003780365,
0.08401445299386978,
0.09740037471055984,
0.02176556922495365,
-0.13552449643611908,
0.03380872309207916,
0.07418673485517502,
-0.10929516702890396,
0.0152245769277215,
0.15286172926425934,
-0.05443425104022026,
0.07946272194385529,
0.025680718943476677,
-0.10419391095638275,
-0.01827673614025116,
-0.027176883071660995,
-0.030963603407144547,
-0.08071091026067734,
-0.013539527542889118,
-0.06447093933820724,
0.16566915810108185,
0.2221451848745346,
-0.02467070333659649,
0.016479453071951866,
-0.08698634803295135,
0.016344232484698296,
0.044023122638463974,
0.05374625325202942,
-0.0425870418548584,
-0.20545019209384918,
0.029370637610554695,
0.02144349366426468,
0.02337518334388733,
-0.19916512072086334,
-0.08304997533559799,
0.0465521439909935,
-0.029401078820228577,
-0.05206684768199921,
0.10056854039430618,
0.02430121973156929,
0.04391677677631378,
-0.03444304317235947,
-0.09635869413614273,
-0.04507749527692795,
0.1437794417142868,
-0.16280263662338257,
-0.053646519780159
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-8
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
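
To make the warmup ratio concrete, the sketch below shows how `lr_scheduler_warmup_ratio: 0.1` translates into warmup steps for a linear schedule. The total step count is an assumed placeholder, since it depends on the (unpublished) few-shot training set size.

```python
import torch
from transformers import get_linear_schedule_with_warmup

total_steps = 1000                     # placeholder; depends on data size,
warmup_steps = int(0.1 * total_steps)  # batch size 24, and 10.0 epochs

# Dummy parameter so the optimizer/scheduler pair can actually run.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.Adam(params, lr=3e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(optimizer, warmup_steps, total_steps)

for _ in range(total_steps):
    optimizer.step()
    scheduler.step()  # LR climbs linearly over the first 10% of steps,
                      # then decays linearly to zero
```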
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-256-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-256-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-8
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09908097237348557,
0.15503749251365662,
-0.002138739451766014,
0.09228024631738663,
0.13839155435562134,
0.040144700556993484,
0.0875769630074501,
0.13486848771572113,
-0.07977592945098877,
0.0737377256155014,
0.07845146954059601,
0.05311594158411026,
0.05123148858547211,
0.12235310673713684,
-0.04645726457238197,
-0.20914007723331451,
0.012477315962314606,
-0.018451541662216187,
-0.05469968542456627,
0.09798453748226166,
0.08194024115800858,
-0.10498624294996262,
0.07900488376617432,
-0.020699264481663704,
-0.15377706289291382,
0.01018125656992197,
-0.03899291902780533,
-0.02286835014820099,
0.09528283774852753,
-0.006901023909449577,
0.09643394500017166,
0.010276494547724724,
0.1396164745092392,
-0.21186313033103943,
-0.00046354823280125856,
0.07841984927654266,
0.03639661520719528,
0.08925646543502808,
0.04512624070048332,
0.02803979255259037,
0.050735436379909515,
-0.1529613882303238,
0.08994779735803604,
0.025607898831367493,
-0.07595180720090866,
-0.08869555592536926,
-0.09034096449613571,
0.019084161147475243,
0.07847895473241806,
0.0828939899802208,
0.009000607766211033,
0.1341519057750702,
-0.0909920334815979,
0.08288400620222092,
0.1984556019306183,
-0.2730787694454193,
-0.06627494096755981,
0.04603441432118416,
0.05219099670648575,
0.06839102506637573,
-0.11817605048418045,
-0.02888605184853077,
0.029291409999132156,
0.0326523631811142,
0.09326788038015366,
-0.01982298493385315,
-0.1422187089920044,
0.0011185432085767388,
-0.13319863379001617,
-0.01496373675763607,
0.1091497614979744,
0.04989447817206383,
-0.04423457011580467,
-0.06317951530218124,
-0.07832363247871399,
-0.10839762538671494,
-0.02427234873175621,
-0.021441195160150528,
0.05196714773774147,
-0.05488872528076172,
-0.055257827043533325,
-0.03951027989387512,
-0.058556895703077316,
-0.08668610453605652,
-0.0016084680100902915,
0.11915203928947449,
0.044683534651994705,
0.023961305618286133,
-0.0334429107606411,
0.10227508842945099,
-0.0009623684454709291,
-0.13272063434123993,
-0.01655377633869648,
0.009804917499423027,
-0.12314565479755402,
-0.0570773221552372,
-0.028205925598740578,
0.012506281957030296,
0.016051538288593292,
0.14778317511081696,
-0.040636103600263596,
0.08229633420705795,
0.017529910430312157,
-0.02032030187547207,
-0.012028385885059834,
0.14340776205062866,
-0.040050361305475235,
-0.052405256778001785,
0.005435214843600988,
0.0973953828215599,
-0.0005137207335792482,
-0.004822243470698595,
-0.07409494370222092,
-0.01964564248919487,
0.09366950392723083,
0.05447433516383171,
-0.05567079782485962,
0.032563842833042145,
-0.03186432644724846,
-0.02491823583841324,
0.024461165070533752,
-0.12440986931324005,
0.03569488972425461,
0.008255882188677788,
-0.08337787538766861,
-0.03873923048377037,
0.012461462989449501,
-0.01768215373158455,
-0.021028442308306694,
0.08782386034727097,
-0.08630639314651489,
-0.005563906393945217,
-0.07770644873380661,
-0.07262644916772842,
0.015656692907214165,
-0.14458270370960236,
-0.013220199383795261,
-0.04900584742426872,
-0.2020963877439499,
-0.037926919758319855,
0.04262688010931015,
-0.0760258361697197,
-0.042831819504499435,
-0.0575743243098259,
-0.08296322822570801,
0.014584902673959732,
-0.00452606612816453,
0.16643878817558289,
-0.05952318757772446,
0.0771200880408287,
-0.013556651771068573,
0.04104173928499222,
0.009701970033347607,
0.04950006678700447,
-0.0866793692111969,
0.022744912654161453,
-0.14072956144809723,
0.07882895320653915,
-0.09652743488550186,
0.010009740479290485,
-0.1240513026714325,
-0.08839190751314163,
0.04090701416134834,
-0.027270907536149025,
0.07027880102396011,
0.14345526695251465,
-0.1915774643421173,
0.003278933698311448,
0.11507164686918259,
-0.04749160259962082,
-0.046043992042541504,
0.08002986013889313,
-0.05651095137000084,
0.029354695230722427,
0.05107142776250839,
0.16628842055797577,
0.07685457915067673,
-0.14786379039287567,
-0.018247079104185104,
0.0268653966486454,
0.05071447789669037,
0.003721179673448205,
0.04488242417573929,
0.006399409379810095,
0.0262598916888237,
0.00435083732008934,
-0.0766182541847229,
-0.02531101554632187,
-0.0902600809931755,
-0.0711093544960022,
-0.052941422909498215,
-0.08823460340499878,
0.02219018153846264,
0.01311345212161541,
0.020781755447387695,
-0.05788339674472809,
-0.10237832367420197,
0.10326669365167618,
0.1294090896844864,
-0.05301238223910332,
0.006828030105680227,
-0.0691499188542366,
0.0200243778526783,
-0.03146771341562271,
-0.035377129912376404,
-0.1952141970396042,
-0.1352238655090332,
0.05364637449383736,
-0.05109533667564392,
0.03514999896287918,
0.03639879822731018,
0.06297803670167923,
0.06270242482423782,
-0.03259808197617531,
-0.019157079979777336,
-0.06938840448856354,
-0.004121726844459772,
-0.11503688991069794,
-0.19739042222499847,
-0.06447543203830719,
-0.03323301672935486,
0.14032548666000366,
-0.20374557375907898,
-0.0002852587786037475,
-0.02627687156200409,
0.12013950198888779,
0.01484290324151516,
-0.05737036094069481,
0.00869399681687355,
0.029795726761221886,
0.00633704848587513,
-0.09973816573619843,
0.04109542816877365,
0.008512904867529869,
-0.05458223819732666,
-0.06185685843229294,
-0.10815677791833878,
-0.0037223491817712784,
0.05709175392985344,
0.07978501170873642,
-0.10510296374559402,
-0.007259323727339506,
-0.050070621073246,
-0.046140510588884354,
-0.08671554177999496,
0.009401839226484299,
0.1984071284532547,
0.032400090247392654,
0.12170722335577011,
-0.060890402644872665,
-0.0700637549161911,
0.002258682157844305,
0.024206921458244324,
0.023818787187337875,
0.09110643714666367,
0.10556288063526154,
-0.097407765686512,
0.08693299442529678,
0.06934667378664017,
-0.048772942274808884,
0.11818348616361618,
-0.045198988169431686,
-0.08188247680664062,
-0.018492309376597404,
0.007149082608520985,
-0.03436674177646637,
0.13890399038791656,
-0.08622772246599197,
0.010831163264811039,
0.03327682986855507,
0.023652832955121994,
0.01820085011422634,
-0.16995765268802643,
-0.00034288843744434416,
0.012185345403850079,
-0.06358638405799866,
-0.03719476982951164,
-0.024821259081363678,
0.035446520894765854,
0.09294844418764114,
0.025100218132138252,
-0.045382604002952576,
0.017630547285079956,
-0.011476995423436165,
-0.06958593428134918,
0.1922929883003235,
-0.10685214400291443,
-0.0903286337852478,
-0.11013839393854141,
0.026777340099215508,
-0.044316280633211136,
-0.03718995302915573,
0.004595617298036814,
-0.08320412039756775,
-0.054781798273324966,
-0.08841816335916519,
-0.02557639591395855,
-0.01741904765367508,
-0.003290463937446475,
0.024444580078125,
-0.015815194696187973,
0.07329420000314713,
-0.1311727613210678,
0.005289497785270214,
-0.029418958351016045,
-0.10351398587226868,
0.009004103019833565,
0.0634731724858284,
0.0862928107380867,
0.09956040233373642,
-0.015469717793166637,
0.012061034329235554,
-0.026512451469898224,
0.2382717728614807,
-0.054508354514837265,
0.015003594569861889,
0.08360191434621811,
-0.00782026257365942,
0.06117503345012665,
0.14206838607788086,
0.031036650761961937,
-0.10408555716276169,
0.026460466906428337,
0.08217579126358032,
-0.011872424744069576,
-0.24888984858989716,
-0.02818995527923107,
-0.021292777732014656,
-0.07190553843975067,
0.08916369825601578,
0.03995506092905998,
-0.04762088507413864,
0.039347369223833084,
0.010498840361833572,
0.015126211568713188,
-0.055016230791807175,
0.07236631214618683,
0.08625363558530807,
0.04419248178601265,
0.09422629326581955,
-0.021695684641599655,
-0.01268550381064415,
0.06458127498626709,
0.01658914051949978,
0.2697611451148987,
-0.03027268312871456,
0.12650156021118164,
0.025647668167948723,
0.1500188559293747,
-0.029905306175351143,
0.048266056925058365,
0.016067197546362877,
0.004955211188644171,
-0.006829931400716305,
-0.051988255232572556,
-0.030551200732588768,
0.010885123163461685,
-0.03494493290781975,
0.03178133815526962,
-0.07359384000301361,
0.04871596395969391,
0.011438910849392414,
0.3015572726726532,
0.04537012055516243,
-0.2827901840209961,
-0.06097057834267616,
-0.0032927037682384253,
-0.04708343744277954,
-0.07496734708547592,
0.004618191160261631,
0.1470678150653839,
-0.12441196292638779,
0.03656885772943497,
-0.05444755405187607,
0.08823809772729874,
-0.04799019172787666,
0.0025433662813156843,
0.06511610746383667,
0.14684684574604034,
-0.008844609372317791,
0.07089405506849289,
-0.18977946043014526,
0.2247994989156723,
0.028396086767315865,
0.10998405516147614,
-0.06142406165599823,
0.014298067428171635,
0.009302212856709957,
0.026773015037178993,
0.10950367152690887,
0.000056355878768954426,
-0.025590363889932632,
-0.17097416520118713,
-0.11494677513837814,
0.059653863310813904,
0.11631842702627182,
-0.013944454491138458,
0.09659242630004883,
-0.04082635045051575,
-0.002434183144941926,
0.037893377244472504,
-0.07401114702224731,
-0.12840618193149567,
-0.08781079947948456,
0.00045176484854891896,
0.01910894364118576,
-0.04058016836643219,
-0.048085711896419525,
-0.09169694781303406,
-0.025378786027431488,
0.13431541621685028,
0.0000036513054055831162,
-0.042832665145397186,
-0.13460519909858704,
0.05754242092370987,
0.1428413838148117,
-0.05866948515176773,
0.019412115216255188,
0.004522460978478193,
0.10133451223373413,
0.05150418356060982,
-0.08149579912424088,
0.052356015890836716,
-0.06705383956432343,
-0.15944361686706543,
-0.06102640554308891,
0.11566846072673798,
0.07982296496629715,
0.053630974143743515,
-0.0024367193691432476,
0.03169916570186615,
0.0014325532829388976,
-0.08638528734445572,
0.007559343706816435,
0.053144268691539764,
0.089562326669693,
0.05030245706439018,
-0.09523051232099533,
0.049612950533628464,
-0.03458007052540779,
-0.0027412190102040768,
0.1279883235692978,
0.21459895372390747,
-0.08589430898427963,
0.08550799638032913,
0.0717003270983696,
-0.07969389110803604,
-0.1766604632139206,
0.0695982277393341,
0.13229045271873474,
0.01634426787495613,
0.036868494004011154,
-0.20513401925563812,
0.13775214552879333,
0.11335627734661102,
-0.014395610429346561,
0.053509652614593506,
-0.2962064743041992,
-0.12276560068130493,
0.07122168689966202,
0.10443928837776184,
0.043351005762815475,
-0.1278562843799591,
-0.02732032537460327,
-0.009967302903532982,
-0.14601585268974304,
0.14300253987312317,
-0.07059282064437866,
0.12230947613716125,
-0.008534829132258892,
0.1222304254770279,
0.02229267545044422,
-0.0417599156498909,
0.12976449728012085,
0.07744774967432022,
0.08690173923969269,
-0.03869650512933731,
-0.0017883410910144448,
0.04804234206676483,
-0.06587690114974976,
0.04935743659734726,
-0.041476599872112274,
0.06703205406665802,
-0.16670086979866028,
0.000030849361792206764,
-0.08339288830757141,
0.04324886575341225,
-0.04844716563820839,
-0.046265609562397,
-0.02771306410431862,
0.04608600586652756,
0.06693825125694275,
-0.03440382331609726,
0.029194211587309837,
0.02232242561876774,
0.05455370992422104,
0.09463492035865784,
0.08757902681827545,
-0.019447455182671547,
-0.1087697446346283,
0.010235374793410301,
-0.005282856989651918,
0.05261755734682083,
-0.10177213698625565,
0.015550968237221241,
0.1356578916311264,
0.05842879042029381,
0.12623818218708038,
0.025297274813055992,
-0.033069707453250885,
-0.014707610011100769,
0.014670965261757374,
-0.12415076047182083,
-0.11940477043390274,
0.03701171651482582,
-0.04398322105407715,
-0.15664535760879517,
0.005333257839083672,
0.10145922005176544,
-0.03971082717180252,
-0.01225397177040577,
-0.011589499190449715,
0.02328549139201641,
-0.012722894549369812,
0.20276468992233276,
0.04189770668745041,
0.06315965205430984,
-0.10236191004514694,
0.12515898048877716,
0.05778875574469566,
-0.04651062935590744,
0.056704580783843994,
0.0676824301481247,
-0.09373372793197632,
-0.00589748052880168,
0.10877623409032822,
0.1689901053905487,
-0.045515142381191254,
-0.019590985029935837,
-0.07322093099355698,
-0.0745294988155365,
0.05742548406124115,
0.158523291349411,
0.050194788724184036,
-0.00753781758248806,
-0.043957822024822235,
0.024525534361600876,
-0.12716560065746307,
0.07391799986362457,
0.0472429133951664,
0.06850866973400116,
-0.10661401599645615,
0.10932836681604385,
-0.00861737597733736,
0.03695543482899666,
-0.015371584333479404,
0.02930484712123871,
-0.0958743765950203,
-0.02561347186565399,
-0.12315668910741806,
0.01991542987525463,
-0.009876024909317493,
0.00758181931450963,
-0.011136636137962341,
-0.05280620604753494,
-0.037007831037044525,
0.02619647979736328,
-0.08040500432252884,
-0.05520191416144371,
0.01698809303343296,
0.04545654356479645,
-0.15329301357269287,
-0.012453615665435791,
0.023264680057764053,
-0.093154177069664,
0.07676132768392563,
0.06349468231201172,
0.016615962609648705,
0.027924787253141403,
-0.10531550645828247,
-0.04618890583515167,
0.014129932038486004,
0.02911229617893696,
0.08362531661987305,
-0.085016168653965,
-0.01379463728517294,
-0.03374475985765457,
0.050152793526649475,
0.013283252716064453,
0.08757983148097992,
-0.11604783684015274,
-0.004135532770305872,
-0.05863141641020775,
-0.033644530922174454,
-0.060656800866127014,
0.034462735056877136,
0.11492668837308884,
0.0353487953543663,
0.16918553411960602,
-0.06774571537971497,
0.03897226229310036,
-0.19459378719329834,
-0.03421564772725105,
0.0026747575029730797,
-0.04045486077666283,
-0.08539541065692902,
-0.04309454932808876,
0.09221628308296204,
-0.051089730113744736,
0.09297855198383331,
-0.008400102145969868,
0.09093863517045975,
0.03103078156709671,
-0.00753383943811059,
-0.05697173625230789,
0.0015857235994189978,
0.146920308470726,
0.05965614318847656,
-0.019030213356018066,
0.10264270752668381,
-0.007984005846083164,
0.05581948906183243,
0.043586451560258865,
0.22323176264762878,
0.14960551261901855,
-0.020731566473841667,
0.06438615173101425,
0.07091173529624939,
-0.12082849442958832,
-0.12475832551717758,
0.14405854046344757,
-0.04606137424707413,
0.12282347679138184,
-0.05477045476436615,
0.21978577971458435,
0.022576505318284035,
-0.17476288974285126,
0.051778994500637054,
-0.05648370087146759,
-0.12079515308141708,
-0.11889408528804779,
-0.013163655996322632,
-0.08153976500034332,
-0.09977280348539352,
0.026418378576636314,
-0.12185633927583694,
0.06419409811496735,
0.12145821005105972,
0.01561103854328394,
0.02070670761168003,
0.15427495539188385,
-0.039934657514095306,
0.018606068566441536,
0.061099112033843994,
0.02298048697412014,
-0.004790672101080418,
-0.060400448739528656,
-0.06483584642410278,
0.052275966852903366,
0.03498419374227524,
0.08597466349601746,
-0.04729529097676277,
0.01918383501470089,
0.033428825438022614,
-0.014644285663962364,
-0.07382351905107498,
0.012342432513833046,
0.0260629765689373,
0.03980789706110954,
0.05379627272486687,
0.05357777699828148,
0.016546770930290222,
-0.035980433225631714,
0.2652141749858856,
-0.07213620841503143,
-0.08338852226734161,
-0.13064149022102356,
0.19730030000209808,
0.01771770417690277,
-0.01812790147960186,
0.07620590180158615,
-0.10367845743894577,
-0.021840913221240044,
0.16682130098342896,
0.12171489000320435,
-0.10290858149528503,
-0.030045246705412865,
-0.015278482809662819,
-0.012105517089366913,
-0.03986654058098793,
0.11845846474170685,
0.09429647773504257,
0.006103482097387314,
-0.07698724418878555,
-0.03015248104929924,
-0.017521856352686882,
-0.04455608129501343,
-0.06394617259502411,
0.03539413586258888,
0.016341956332325935,
0.0029403804801404476,
-0.03795364871621132,
0.051559194922447205,
-0.010499541647732258,
-0.24369379878044128,
0.029882118105888367,
-0.15583108365535736,
-0.18108098208904266,
-0.03153472766280174,
0.06299027055501938,
-0.004282843321561813,
0.03888140618801117,
-0.01956014148890972,
0.00472562899813056,
0.14894886314868927,
-0.0356619656085968,
-0.04868606850504875,
-0.12051938474178314,
0.10147132724523544,
-0.09904883056879044,
0.20475901663303375,
0.007715881336480379,
0.08456837385892868,
0.09719555079936981,
0.022275131195783615,
-0.1347341686487198,
0.03409786522388458,
0.07387279719114304,
-0.10844049602746964,
0.014853904023766518,
0.1515907645225525,
-0.054155461490154266,
0.07947986572980881,
0.026078885421156883,
-0.10394540429115295,
-0.019137468189001083,
-0.028053853660821915,
-0.031201353296637535,
-0.08044327050447464,
-0.015088100917637348,
-0.06390328705310822,
0.1658552885055542,
0.22129406034946442,
-0.02459302917122841,
0.015879986807703972,
-0.08683796226978302,
0.01611032336950302,
0.044579993933439255,
0.05379660055041313,
-0.042482487857341766,
-0.2055140882730484,
0.030114738270640373,
0.02073107473552227,
0.023701444268226624,
-0.1981792002916336,
-0.08360286802053452,
0.0464518778026104,
-0.029904522001743317,
-0.051993194967508316,
0.10077020525932312,
0.024204524233937263,
0.04413007199764252,
-0.03442750126123428,
-0.09681825339794159,
-0.045448239892721176,
0.1438915878534317,
-0.16284047067165375,
-0.05395250394940376
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
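
As a rough illustration of the hyperparameters listed above, here is a minimal sketch of how this fine-tuning could be set up with the Hugging Face `Trainer`. It is not the author's actual script: the k=32 few-shot sampling strategy is an assumption, and the SQuAD preprocessing (mapping answers to start/end token positions) is omitted for brevity.

```python
# Hypothetical reproduction sketch -- assumptions are flagged in comments.
from datasets import load_dataset
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, TrainingArguments

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

# Assumption: the k=32 few-shot split is a random sample of the SQuAD train set.
squad = load_dataset("squad")
train_subset = squad["train"].shuffle(seed=0).select(range(32))

args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
# QA preprocessing (tokenization with offset mapping, converting answers to
# start/end positions) and the Trainer(...) call are omitted here for brevity.
```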
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0

This model is a fine-tuned version of bert-base-uncased on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
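
The nine token counts in the tokens_length cell above line up one-to-one with the nine processed_texts chunks of this record. A hedged sketch of how such counts could be computed follows; the choice of the bert-base-uncased tokenizer is an assumption, not something documented in the dump.

```python
# Hypothetical sketch: token counts per processed text chunk.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
chunks = [
    "## Model description\n\nMore information needed",
    "### Training results",
]
lengths = [len(tokenizer(c)["input_ids"]) for c in chunks]
# Exact numbers depend on the tokenizer used and on whether special tokens
# ([CLS], [SEP]) are included in the count.
```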
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.1005517989397049,
0.12562184035778046,
-0.0022013515699654818,
0.0896008312702179,
0.1391366571187973,
0.03136532008647919,
0.08791501820087433,
0.1349555253982544,
-0.08526008576154709,
0.05155423283576965,
0.07057338953018188,
0.06986147910356522,
0.04514771327376366,
0.1303384006023407,
-0.04329132288694382,
-0.21216844022274017,
0.006453644949942827,
-0.01214080024510622,
-0.06490027904510498,
0.10348270833492279,
0.08867616951465607,
-0.11504127085208893,
0.06822643429040909,
-0.020631765946745872,
-0.15550780296325684,
0.007626758422702551,
-0.03131670504808426,
-0.026830850169062614,
0.1062653586268425,
-0.01671336032450199,
0.09427409619092941,
0.017714500427246094,
0.13711492717266083,
-0.2158968448638916,
0.0013703079894185066,
0.07722703367471695,
0.04730764403939247,
0.08734071254730225,
0.0393056683242321,
0.02078128419816494,
0.038976434618234634,
-0.1492808759212494,
0.09600624442100525,
0.024949191138148308,
-0.08783133327960968,
-0.13904385268688202,
-0.09084653854370117,
0.02099604904651642,
0.07972253113985062,
0.07924389094114304,
0.008232440799474716,
0.14003270864486694,
-0.10261527448892593,
0.08021140843629837,
0.20011694729328156,
-0.2854647636413574,
-0.06534958630800247,
0.04261753335595131,
0.05338520184159279,
0.0722331777215004,
-0.11616769433021545,
-0.01439073495566845,
0.019768904894590378,
0.03197160363197327,
0.11221431940793991,
-0.020481513813138008,
-0.1275019645690918,
0.006439669989049435,
-0.12661802768707275,
-0.011643046513199806,
0.10440333932638168,
0.035425830632448196,
-0.046669188886880875,
-0.07824287563562393,
-0.060971882194280624,
-0.08604691922664642,
-0.034758031368255615,
-0.016210928559303284,
0.055207833647727966,
-0.05614658445119858,
-0.0648840069770813,
-0.045490507036447525,
-0.05615641921758652,
-0.08740435540676117,
-0.0019367568893358111,
0.13943304121494293,
0.03966253250837326,
0.02114219032227993,
-0.039896439760923386,
0.11339204013347626,
0.00025500456104055047,
-0.1298506110906601,
-0.004651217721402645,
0.0023102639243006706,
-0.11115565150976181,
-0.05404854193329811,
-0.03674650564789772,
0.009819415397942066,
0.016978630796074867,
0.1499619483947754,
-0.043638087809085846,
0.07523109018802643,
0.015153205953538418,
-0.0172274112701416,
-0.016852296888828278,
0.1531485617160797,
-0.033503301441669464,
-0.042515166103839874,
-0.00022848608205094934,
0.09902296215295792,
0.0011832797899842262,
-0.00782480463385582,
-0.0803643986582756,
-0.01588279753923416,
0.0735647901892662,
0.06807871162891388,
-0.05278356373310089,
0.037157658487558365,
-0.04528267681598663,
-0.02432584948837757,
0.028018180280923843,
-0.12484195083379745,
0.0387326218187809,
0.0064185503870248795,
-0.08467443287372589,
-0.05916517227888107,
0.007548449095338583,
-0.009339778684079647,
-0.02654414251446724,
0.08288897573947906,
-0.07354238629341125,
-0.0005682640476152301,
-0.08612444251775742,
-0.08002901822328568,
0.0006460511940531433,
-0.1448652595281601,
-0.011594296433031559,
-0.04815652221441269,
-0.19241636991500854,
-0.032164059579372406,
0.051529720425605774,
-0.07824619114398956,
-0.03778070956468582,
-0.050499655306339264,
-0.07601224631071091,
0.01125297974795103,
-0.007627865299582481,
0.1884256899356842,
-0.061594776809215546,
0.07956190407276154,
-0.014700818806886673,
0.048774171620607376,
0.014984936453402042,
0.04890821501612663,
-0.08587566018104553,
0.025421805679798126,
-0.1507890224456787,
0.08197315037250519,
-0.09941765666007996,
0.020150918513536453,
-0.123317651450634,
-0.08795949816703796,
0.052017226815223694,
-0.02176104113459587,
0.07869833707809448,
0.13794708251953125,
-0.19988994300365448,
-0.0003370265185367316,
0.11674261093139648,
-0.040997039526700974,
-0.05590340867638588,
0.07613135874271393,
-0.058539070188999176,
0.04389714077115059,
0.05407349020242691,
0.18936002254486084,
0.07095986604690552,
-0.14773033559322357,
0.017517106607556343,
0.033649567514657974,
0.05651309713721275,
0.006075439974665642,
0.035765938460826874,
-0.0018592230044305325,
0.02584538608789444,
0.008237210102379322,
-0.08357153087854385,
-0.02301665209233761,
-0.09220438450574875,
-0.07325204461812973,
-0.05151115730404854,
-0.09749845415353775,
0.03146658465266228,
0.013846131041646004,
0.022944776341319084,
-0.072250135242939,
-0.09938003867864609,
0.10503926128149033,
0.11864936351776123,
-0.05105191469192505,
0.017000392079353333,
-0.07750695198774338,
0.026890940964221954,
-0.015186907723546028,
-0.029261356219649315,
-0.20051750540733337,
-0.1199130117893219,
0.05122150853276253,
-0.048520997166633606,
0.02687162719666958,
0.019786059856414795,
0.06627529859542847,
0.05538293346762657,
-0.04326484724879265,
-0.017237845808267593,
-0.06739907711744308,
0.003929012920707464,
-0.11443740129470825,
-0.20596472918987274,
-0.06556329876184464,
-0.036968592554330826,
0.1330907642841339,
-0.19099724292755127,
0.002452875953167677,
-0.026977362111210823,
0.11273252964019775,
0.01787812076508999,
-0.05109183117747307,
0.011940151453018188,
0.03887331485748291,
0.01874532736837864,
-0.09529657661914825,
0.05657017603516579,
0.012273997999727726,
-0.07063888758420944,
-0.042920179665088654,
-0.11046182364225388,
-0.00218599708750844,
0.0626850575208664,
0.07049401849508286,
-0.0981595516204834,
-0.015566009096801281,
-0.05314066261053085,
-0.036287713795900345,
-0.07512886822223663,
0.023956140503287315,
0.20433475077152252,
0.030722396448254585,
0.12461681663990021,
-0.06586489826440811,
-0.0795382410287857,
-0.0026462827809154987,
0.016018453985452652,
0.03312821686267853,
0.09972192347049713,
0.07706151157617569,
-0.07865599542856216,
0.07942747324705124,
0.09165079891681671,
-0.03717350214719772,
0.1177971363067627,
-0.05312100052833557,
-0.07547707110643387,
-0.016975397244095802,
-0.004302885849028826,
-0.028876928612589836,
0.15043142437934875,
-0.0835055559873581,
0.004812981467694044,
0.039503779262304306,
0.02531813457608223,
0.013023601844906807,
-0.1785166710615158,
-0.007474974263459444,
0.014271749183535576,
-0.06247459724545479,
-0.05790004879236221,
-0.02573581598699093,
0.03745689615607262,
0.09798555076122284,
0.0317850261926651,
-0.029749196022748947,
0.020467568188905716,
-0.012572834268212318,
-0.05970272794365883,
0.1901649832725525,
-0.12155790627002716,
-0.09116775542497635,
-0.08559561520814896,
0.022591976448893547,
-0.039348091930150986,
-0.03530474752187729,
0.008106386289000511,
-0.09706030040979385,
-0.05427265539765358,
-0.08572044968605042,
-0.024029944092035294,
-0.005783144384622574,
-0.005609293933957815,
0.029562905430793762,
-0.013113954104483128,
0.07867127656936646,
-0.13337913155555725,
0.0010078315390273929,
-0.03548016399145126,
-0.09360907226800919,
0.01817133091390133,
0.06755192577838898,
0.08265446871519089,
0.10266949981451035,
-0.01395756658166647,
0.015506958588957787,
-0.027707522734999657,
0.23781585693359375,
-0.06630244106054306,
0.015650128945708275,
0.09052382409572601,
-0.000290108349872753,
0.054097387939691544,
0.1364492028951645,
0.03752604499459267,
-0.1135755181312561,
0.027107561007142067,
0.07760511338710785,
-0.01874641515314579,
-0.24834059178829193,
-0.031775880604982376,
-0.02036847546696663,
-0.08218220621347427,
0.08176689594984055,
0.0357043631374836,
-0.05374445766210556,
0.03341514989733696,
0.019805798307061195,
-0.005483055952936411,
-0.03511521965265274,
0.06596103310585022,
0.08645853400230408,
0.04217267036437988,
0.1004246398806572,
-0.025141283869743347,
-0.01083102822303772,
0.06548410654067993,
0.019164618104696274,
0.2609206438064575,
-0.03964716196060181,
0.12480000406503677,
0.036984365433454514,
0.14451539516448975,
-0.027462372556328773,
0.0505695678293705,
0.008944062516093254,
-0.0058363922871649265,
-0.005389838479459286,
-0.05023229867219925,
-0.019540240988135338,
-0.001998989377170801,
-0.03453825041651726,
0.023229600861668587,
-0.07622536271810532,
0.027462946251034737,
0.02768169529736042,
0.3043176233768463,
0.03926403447985649,
-0.26588281989097595,
-0.06892777234315872,
0.0028370546642690897,
-0.040858037769794464,
-0.07736344635486603,
0.005541075952351093,
0.1387128233909607,
-0.1251107007265091,
0.035900719463825226,
-0.05032188817858696,
0.09287568926811218,
-0.03596777468919754,
0.011365109123289585,
0.06741128861904144,
0.14612692594528198,
-0.01195437554270029,
0.0691986232995987,
-0.21389970183372498,
0.23353463411331177,
0.026830947026610374,
0.11071215569972992,
-0.06417511403560638,
0.009383483789861202,
0.004700257908552885,
0.0482146330177784,
0.109659843146801,
0.006898364517837763,
0.000222077636863105,
-0.17438329756259918,
-0.09728702157735825,
0.0556478314101696,
0.118385449051857,
-0.023035414516925812,
0.08659492433071136,
-0.037250034511089325,
-0.0018880799179896712,
0.0338095985352993,
-0.08113717287778854,
-0.1245214119553566,
-0.08949992060661316,
-0.00548204593360424,
0.0012436717515811324,
-0.034670669585466385,
-0.05743585526943207,
-0.09027433395385742,
-0.02771344967186451,
0.13451699912548065,
0.020748546347022057,
-0.054646871984004974,
-0.136338010430336,
0.040820132941007614,
0.13802550733089447,
-0.046225178986787796,
0.01693422719836235,
0.005405760370194912,
0.09970530867576599,
0.04623850807547569,
-0.07815999537706375,
0.06618739664554596,
-0.0739927738904953,
-0.16553322970867157,
-0.059154435992240906,
0.11936026066541672,
0.08262555301189423,
0.05596378073096275,
0.0012406072346493602,
0.0318579338490963,
-0.005853212904185057,
-0.08339796215295792,
0.018199581652879715,
0.043890614062547684,
0.09174062311649323,
0.03490525484085083,
-0.09532967954874039,
0.06743216514587402,
-0.03954397514462471,
-0.011819262057542801,
0.12383399903774261,
0.21693603694438934,
-0.09480204433202744,
0.10692992061376572,
0.07722141593694687,
-0.07964155077934265,
-0.1848379671573639,
0.07122713327407837,
0.1206648200750351,
0.015292405150830746,
0.038301482796669006,
-0.2074321210384369,
0.13066326081752777,
0.1028677374124527,
-0.015016818419098854,
0.044050831347703934,
-0.30163028836250305,
-0.13086993992328644,
0.07618608325719833,
0.10611940175294876,
0.042442914098501205,
-0.12599873542785645,
-0.021818287670612335,
-0.013494390062987804,
-0.1310449242591858,
0.14156121015548706,
-0.08482412248849869,
0.11593753099441528,
-0.006606009788811207,
0.11641496419906616,
0.02707727998495102,
-0.03868737816810608,
0.13526073098182678,
0.06753560155630112,
0.0951344445347786,
-0.04360901191830635,
0.01193995401263237,
0.054206326603889465,
-0.06456103175878525,
0.03688427060842514,
-0.032571617513895035,
0.07015249878168106,
-0.16072456538677216,
0.0001393343845847994,
-0.09327056258916855,
0.036331262439489365,
-0.05035017430782318,
-0.05637144297361374,
-0.020452484488487244,
0.05423317849636078,
0.06928236037492752,
-0.04011271521449089,
0.03136739507317543,
0.004645512904971838,
0.07641708105802536,
0.0880323126912117,
0.106813944876194,
-0.03731968626379967,
-0.10122019052505493,
0.01996629126369953,
-0.009302294813096523,
0.05548567324876785,
-0.11006377637386322,
0.02518383413553238,
0.13026773929595947,
0.05779866874217987,
0.11493153125047684,
0.028549790382385254,
-0.030636945739388466,
-0.017298821359872818,
0.016199028119444847,
-0.12374826520681381,
-0.11412826180458069,
0.04446662962436676,
-0.039156828075647354,
-0.13669517636299133,
0.007981260307133198,
0.09450525790452957,
-0.0288886409252882,
-0.019441280514001846,
-0.010408488102257252,
0.020104311406612396,
-0.01985297165811062,
0.19951122999191284,
0.04375927522778511,
0.06914053857326508,
-0.10774311423301697,
0.12655788660049438,
0.04923273250460625,
-0.05713935196399689,
0.05094589665532112,
0.06675787270069122,
-0.1044798344373703,
-0.01206629816442728,
0.12019457668066025,
0.1660524159669876,
-0.03077625297009945,
-0.019229695200920105,
-0.08178512752056122,
-0.09229741245508194,
0.06180387735366821,
0.15524673461914062,
0.051128145307302475,
-0.015936806797981262,
-0.048790376633405685,
0.027059683576226234,
-0.1253204494714737,
0.0752510130405426,
0.04406600818037987,
0.06204751878976822,
-0.09086711704730988,
0.11077942699193954,
-0.004191640764474869,
0.04098658263683319,
-0.019376371055841446,
0.025203431025147438,
-0.10398153215646744,
-0.009974459186196327,
-0.14763803780078888,
0.0006360822590067983,
-0.004302028100937605,
0.012902798131108284,
-0.02079079858958721,
-0.05298911780118942,
-0.025604525581002235,
0.026728233322501183,
-0.09251075237989426,
-0.049286819994449615,
0.023219963535666466,
0.03779395669698715,
-0.14622491598129272,
-0.015514451079070568,
0.021634621545672417,
-0.09245671331882477,
0.08073122054338455,
0.06102411076426506,
0.011363355442881584,
0.02899681217968464,
-0.10782759636640549,
-0.04873686283826828,
0.002344829961657524,
0.024977250024676323,
0.08839347958564758,
-0.09311279654502869,
-0.021014703437685966,
-0.03181493654847145,
0.04587918147444725,
0.018823668360710144,
0.098236083984375,
-0.11537358909845352,
0.00928814709186554,
-0.040822584182024,
-0.044964004307985306,
-0.061966534703969955,
0.04033929109573364,
0.10980159044265747,
0.046052683144807816,
0.15972214937210083,
-0.07338661700487137,
0.035341981798410416,
-0.18450580537319183,
-0.04367530718445778,
0.0003663946408778429,
-0.048282768577337265,
-0.0895569920539856,
-0.047687072306871414,
0.09862666577100754,
-0.05200786516070366,
0.10945181548595428,
-0.0009637218317948282,
0.09858562797307968,
0.03257559984922409,
-0.011342004872858524,
-0.054694466292858124,
0.008728488348424435,
0.15498965978622437,
0.04944273829460144,
-0.01401426363736391,
0.11064618080854416,
0.0011140224523842335,
0.04448043555021286,
0.06674664467573166,
0.21395337581634521,
0.1540045440196991,
0.011907080188393593,
0.048776015639305115,
0.0600285679101944,
-0.11658184975385666,
-0.13583293557167053,
0.1334049552679062,
-0.04784710332751274,
0.12334340810775757,
-0.07252279669046402,
0.21688662469387054,
0.022912077605724335,
-0.18475738167762756,
0.06931470334529877,
-0.061510976403951645,
-0.12211037427186966,
-0.11742118000984192,
-0.02373385801911354,
-0.06983550637960434,
-0.10965414345264435,
0.02017885632812977,
-0.12052419781684875,
0.06331679970026016,
0.12192335724830627,
0.016041019931435585,
0.023897167295217514,
0.16344982385635376,
-0.03766026720404625,
0.01990189030766487,
0.06818497180938721,
0.01730329543352127,
-0.00666010519489646,
-0.0628027617931366,
-0.06496390700340271,
0.0436147004365921,
0.023579226806759834,
0.07172048836946487,
-0.038954272866249084,
0.0015274947509169579,
0.019647032022476196,
-0.015322387218475342,
-0.07343123853206635,
0.01695678010582924,
0.024317052215337753,
0.045862678438425064,
0.059295397251844406,
0.05255059152841568,
0.009544871747493744,
-0.0376615896821022,
0.2891842722892761,
-0.07505754381418228,
-0.09953559935092926,
-0.1297561526298523,
0.2276890128850937,
0.009867405518889427,
-0.02278393879532814,
0.07351142168045044,
-0.09483209252357483,
-0.030131202191114426,
0.179368257522583,
0.14740799367427826,
-0.10891900956630707,
-0.022131823003292084,
-0.020022552460432053,
-0.011204109527170658,
-0.04383106157183647,
0.13433164358139038,
0.10864589363336563,
-0.01452154666185379,
-0.07840445637702942,
-0.01989828236401081,
-0.018355591222643852,
-0.04711015895009041,
-0.07318542152643204,
0.06285858154296875,
0.025318972766399384,
-0.0032820154447108507,
-0.045719943940639496,
0.05803734436631203,
-0.0070793782360851765,
-0.23785732686519623,
0.032578665763139725,
-0.1637982726097107,
-0.17749379575252533,
-0.038292501121759415,
0.059056662023067474,
-0.009002498351037502,
0.041949860751628876,
-0.009001998230814934,
0.011179889552295208,
0.14039769768714905,
-0.03262845799326897,
-0.02623562701046467,
-0.12545153498649597,
0.11636772751808167,
-0.10966740548610687,
0.20140688121318817,
0.002935885451734066,
0.07263243943452835,
0.09601543098688126,
0.02013484761118889,
-0.13392828404903412,
0.04429512843489647,
0.07339420914649963,
-0.10551624745130539,
0.009271197952330112,
0.14424198865890503,
-0.05039425566792488,
0.07192499935626984,
0.024766936898231506,
-0.10914565622806549,
-0.007897553965449333,
-0.04630986601114273,
-0.039937231689691544,
-0.07425841689109802,
-0.015739811584353447,
-0.06449826806783676,
0.15917611122131348,
0.22226101160049438,
-0.023910047486424446,
0.017328480258584023,
-0.09329815208911896,
0.015090567991137505,
0.042166680097579956,
0.04433589428663254,
-0.05374787747859955,
-0.20015905797481537,
0.0289969090372324,
0.04350961744785309,
0.018857643008232117,
-0.20812095701694489,
-0.07995837926864624,
0.04541206359863281,
-0.030272196978330612,
-0.04939708858728409,
0.10211266577243805,
0.028684012591838837,
0.04514503479003906,
-0.03621067479252815,
-0.10716307908296585,
-0.0366402193903923,
0.1482405811548233,
-0.15668803453445435,
-0.040837738662958145
] |
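
Each record closes with a 768-dimensional embedding vector like the one above (per the embeddings column in the schema). A minimal sketch of how two such rows could be compared with cosine similarity; the three-value vectors below are just the first few entries of two rows, used for illustration.

```python
# Minimal sketch: cosine similarity between two records' embedding vectors.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a), np.asarray(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative only: real rows carry the full 768-float "embeddings" lists.
print(cosine([0.116, -0.013, 0.096], [-0.100, 0.125, -0.002]))
```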
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
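
For completeness, loading the published checkpoint for inference follows the standard question-answering pipeline API. This is a usage sketch, not part of the original card; the question and context are made up for illustration.

```python
# Usage sketch: extractive QA with the published checkpoint.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10",
)
result = qa(
    question="What was the model fine-tuned on?",
    context="This model is a fine-tuned version of bert-base-uncased on the squad dataset.",
)
print(result["answer"], result["score"])  # best span and its confidence
```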
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10

This model is a fine-tuned version of bert-base-uncased on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10132835805416107,
0.12458344548940659,
-0.0021574092097580433,
0.0890110582113266,
0.13878482580184937,
0.031667277216911316,
0.08935161679983139,
0.13395492732524872,
-0.08412817120552063,
0.05071624368429184,
0.06987441331148148,
0.07127468287944794,
0.045695710927248,
0.13184264302253723,
-0.042506564408540726,
-0.213563472032547,
0.00665510818362236,
-0.011435789056122303,
-0.0638638585805893,
0.10369480401277542,
0.08897271007299423,
-0.11530616879463196,
0.06721305847167969,
-0.02100408263504505,
-0.15459582209587097,
0.007394534535706043,
-0.031799543648958206,
-0.026701511815190315,
0.10625502467155457,
-0.018330741673707962,
0.09398000687360764,
0.01789012737572193,
0.1392814964056015,
-0.21549373865127563,
0.0012384153669700027,
0.07623425871133804,
0.046984460204839706,
0.08715429157018661,
0.039340682327747345,
0.022037377581000328,
0.038771651685237885,
-0.15011830627918243,
0.09719998389482498,
0.024227993562817574,
-0.08781401813030243,
-0.1375901848077774,
-0.09142738580703735,
0.021935563534498215,
0.08293929696083069,
0.07835306972265244,
0.0072207385674119,
0.13946190476417542,
-0.1035175696015358,
0.08070071041584015,
0.20075176656246185,
-0.28691762685775757,
-0.06546946614980698,
0.04484635964035988,
0.05601440742611885,
0.072527676820755,
-0.11626515537500381,
-0.015445365570485592,
0.02017187885940075,
0.031349338591098785,
0.11268608272075653,
-0.02114764228463173,
-0.12613193690776825,
0.006113090086728334,
-0.12795376777648926,
-0.011114610359072685,
0.10405877232551575,
0.03674309700727463,
-0.046743590384721756,
-0.07963448762893677,
-0.06095699593424797,
-0.08754264563322067,
-0.035884443670511246,
-0.01791507378220558,
0.05489587411284447,
-0.05718766897916794,
-0.06452761590480804,
-0.04556628689169884,
-0.05528724566102028,
-0.08801458775997162,
0.0006322974222712219,
0.13721852004528046,
0.040298331528902054,
0.02001022733747959,
-0.03909299895167351,
0.11292418837547302,
-0.0015109532978385687,
-0.12942038476467133,
-0.003644263371825218,
0.0022838134318590164,
-0.11310628801584244,
-0.05535423755645752,
-0.03655232489109039,
0.011042536236345768,
0.015684448182582855,
0.1513611227273941,
-0.040708914399147034,
0.07557471096515656,
0.017024854198098183,
-0.017740946263074875,
-0.017060983926057816,
0.15283793210983276,
-0.034664735198020935,
-0.04512929543852806,
-0.0012755542993545532,
0.10039705038070679,
0.0005283342907205224,
-0.006619873456656933,
-0.0808417871594429,
-0.016171999275684357,
0.07447225600481033,
0.06823836266994476,
-0.054295558482408524,
0.037263110280036926,
-0.044247597455978394,
-0.024068545550107956,
0.028729408979415894,
-0.12523595988750458,
0.0391349121928215,
0.0058163912035524845,
-0.08580204099416733,
-0.05989926680922508,
0.0060431393794715405,
-0.008263684809207916,
-0.026784105226397514,
0.08224128186702728,
-0.07331883162260056,
-0.0008458488155156374,
-0.08638835698366165,
-0.08144640922546387,
0.0012382857967168093,
-0.14912725985050201,
-0.010856139473617077,
-0.04645729064941406,
-0.19569595158100128,
-0.03324921801686287,
0.05100838467478752,
-0.07739522308111191,
-0.038195934146642685,
-0.05256337672472,
-0.0765872672200203,
0.009656568989157677,
-0.00680591631680727,
0.18968094885349274,
-0.060340382158756256,
0.07876451313495636,
-0.014966094866394997,
0.05025837942957878,
0.01568741351366043,
0.04903920739889145,
-0.08494779467582703,
0.0256735198199749,
-0.1498330682516098,
0.08169061690568924,
-0.0994563102722168,
0.022509517148137093,
-0.12330735474824905,
-0.0885300263762474,
0.05101311206817627,
-0.021440397948026657,
0.07877876609563828,
0.13806255161762238,
-0.20154520869255066,
0.0007959635113365948,
0.11774516105651855,
-0.040354616940021515,
-0.055210813879966736,
0.07503936439752579,
-0.05900484696030617,
0.04527449980378151,
0.055414605885744095,
0.1889594942331314,
0.07232828438282013,
-0.1464075893163681,
0.01670340821146965,
0.03281958028674126,
0.05635856091976166,
0.0047180261462926865,
0.03608406335115433,
-0.000603828055318445,
0.028409570455551147,
0.007990719750523567,
-0.08245151489973068,
-0.023431768640875816,
-0.09201636910438538,
-0.07369782030582428,
-0.05077110230922699,
-0.09806565940380096,
0.03160906955599785,
0.014345179311931133,
0.02321358397603035,
-0.0723518431186676,
-0.09970968216657639,
0.10536117106676102,
0.11885202676057816,
-0.05145662650465965,
0.015873800963163376,
-0.07758311182260513,
0.02632902003824711,
-0.015935659408569336,
-0.029011359438300133,
-0.20038805902004242,
-0.11792796850204468,
0.05102921649813652,
-0.048969995230436325,
0.02623981423676014,
0.02225540019571781,
0.06712406873703003,
0.05519931763410568,
-0.04296904057264328,
-0.018176371231675148,
-0.0675092488527298,
0.003732638666406274,
-0.11580098420381546,
-0.20506128668785095,
-0.06745410710573196,
-0.03769649192690849,
0.133278951048851,
-0.19263307750225067,
0.0037538546603173018,
-0.02879600040614605,
0.11214712262153625,
0.017168231308460236,
-0.051306817680597305,
0.012685082852840424,
0.03828013315796852,
0.019354848191142082,
-0.09586408734321594,
0.05658840388059616,
0.010764379985630512,
-0.07019956409931183,
-0.04498480260372162,
-0.11196862161159515,
-0.004354502540081739,
0.0620720200240612,
0.07213548570871353,
-0.0978614091873169,
-0.015816034749150276,
-0.053376454859972,
-0.035639286041259766,
-0.07344835996627808,
0.023609453812241554,
0.20192378759384155,
0.030119238421320915,
0.12384279817342758,
-0.06592443585395813,
-0.0807831734418869,
-0.0022021743934601545,
0.017425457015633583,
0.033868759870529175,
0.09992532432079315,
0.07703699171543121,
-0.07840795069932938,
0.07907265424728394,
0.09273314476013184,
-0.03646787256002426,
0.11675021052360535,
-0.05314510688185692,
-0.07629872113466263,
-0.016879215836524963,
-0.00411068182438612,
-0.029530903324484825,
0.1502518653869629,
-0.08450894057750702,
0.003100711153820157,
0.03908054158091545,
0.024213017895817757,
0.012776289135217667,
-0.1791526973247528,
-0.007056625094264746,
0.013846960850059986,
-0.06178116425871849,
-0.0596686452627182,
-0.026321597397327423,
0.03602689877152443,
0.0973752811551094,
0.031106144189834595,
-0.030920112505555153,
0.019891606643795967,
-0.012653982266783714,
-0.05941493436694145,
0.1905335634946823,
-0.11990929394960403,
-0.08995035290718079,
-0.08410874009132385,
0.022143971174955368,
-0.03722618147730827,
-0.035603493452072144,
0.007447250187397003,
-0.09760969877243042,
-0.05363484099507332,
-0.0850653275847435,
-0.02506631799042225,
-0.005703311879187822,
-0.0058668931014835835,
0.0312042273581028,
-0.012469632551074028,
0.07694291323423386,
-0.1332920491695404,
0.0015831389464437962,
-0.036297135055065155,
-0.09345091134309769,
0.017675179988145828,
0.06657660752534866,
0.08250618726015091,
0.10339201986789703,
-0.014215844683349133,
0.016030311584472656,
-0.02821963094174862,
0.23743818700313568,
-0.06689657270908356,
0.017095457762479782,
0.08977670222520828,
-0.00012243467790540308,
0.053866706788539886,
0.13763421773910522,
0.037018489092588425,
-0.11365052312612534,
0.026548661291599274,
0.0772540494799614,
-0.018030647188425064,
-0.24926990270614624,
-0.03071584738790989,
-0.019718579947948456,
-0.08245664834976196,
0.0825699046254158,
0.03509792685508728,
-0.05120323225855827,
0.034890852868556976,
0.019439062103629112,
-0.00392361031845212,
-0.03352254629135132,
0.06594979017972946,
0.08369550108909607,
0.041384633630514145,
0.10064412653446198,
-0.025339575484395027,
-0.011746044270694256,
0.06337094306945801,
0.019642800092697144,
0.2624169588088989,
-0.03801003843545914,
0.12417570501565933,
0.03634238243103027,
0.14332211017608643,
-0.028266731649637222,
0.05355988070368767,
0.009897262789309025,
-0.006373748183250427,
-0.005295357201248407,
-0.049630701541900635,
-0.017588507384061813,
-0.0012361232656985521,
-0.03393326699733734,
0.022585181519389153,
-0.0756942629814148,
0.026978233829140663,
0.027395330369472504,
0.3032633662223816,
0.041357215493917465,
-0.2671400308609009,
-0.06801126897335052,
0.0032656360417604446,
-0.042716577649116516,
-0.07714519649744034,
0.005056730937212706,
0.13738968968391418,
-0.12494499236345291,
0.0369110144674778,
-0.05109865590929985,
0.09313791245222092,
-0.035021234303712845,
0.011693421751260757,
0.06659785658121109,
0.1454167515039444,
-0.01179014053195715,
0.07012306898832321,
-0.21533474326133728,
0.23314157128334045,
0.02645392343401909,
0.11202280223369598,
-0.06512241065502167,
0.009480022825300694,
0.004313252866268158,
0.04679339379072189,
0.111415334045887,
0.00647112587466836,
-0.00018836036906577647,
-0.17495055496692657,
-0.09720481187105179,
0.05559045076370239,
0.11892964690923691,
-0.0222105011343956,
0.08794785290956497,
-0.03604412451386452,
-0.0030806942377239466,
0.033966466784477234,
-0.0821072980761528,
-0.12565849721431732,
-0.08894503116607666,
-0.005617708899080753,
-0.0003902398166246712,
-0.034646473824977875,
-0.057118970900774,
-0.0903443917632103,
-0.030153581872582436,
0.13298733532428741,
0.021045515313744545,
-0.05444478243589401,
-0.13594195246696472,
0.0429450161755085,
0.1389230191707611,
-0.04546139016747475,
0.01814720220863819,
0.006929636001586914,
0.09955114126205444,
0.04551997780799866,
-0.07672487199306488,
0.06585708260536194,
-0.07399356365203857,
-0.1656465232372284,
-0.05835495889186859,
0.12098552286624908,
0.08285540342330933,
0.056186869740486145,
0.0026389090344309807,
0.031282179057598114,
-0.005553848575800657,
-0.0831323191523552,
0.018032051622867584,
0.043416861444711685,
0.09124965220689774,
0.0344034805893898,
-0.09566521644592285,
0.06613951921463013,
-0.04035336151719093,
-0.009675773791968822,
0.12402312457561493,
0.21539804339408875,
-0.09445731341838837,
0.10595779120922089,
0.07683411240577698,
-0.07963337004184723,
-0.1838047057390213,
0.07170554250478745,
0.12113934010267258,
0.015982167795300484,
0.03817592188715935,
-0.20686151087284088,
0.13019244372844696,
0.10237027704715729,
-0.014251845888793468,
0.042972300201654434,
-0.302143394947052,
-0.1305028200149536,
0.07693895697593689,
0.10640066117048264,
0.04040975123643875,
-0.1254531741142273,
-0.021811794489622116,
-0.014254768379032612,
-0.13084997236728668,
0.14149701595306396,
-0.08519478142261505,
0.1153578981757164,
-0.006036830134689808,
0.11478279531002045,
0.02714291214942932,
-0.039299868047237396,
0.13377645611763,
0.0694870799779892,
0.0953257605433464,
-0.0433817021548748,
0.00991878379136324,
0.05692707747220993,
-0.06405951827764511,
0.038293007761240005,
-0.03164009004831314,
0.06902054697275162,
-0.16033178567886353,
-0.000494977633934468,
-0.09337697923183441,
0.035416122525930405,
-0.0506463460624218,
-0.05628685653209686,
-0.019688019528985023,
0.054353587329387665,
0.0690116435289383,
-0.039716511964797974,
0.02954985201358795,
0.005746330600231886,
0.07586632668972015,
0.08364122360944748,
0.10804343968629837,
-0.035736631602048874,
-0.10002963989973068,
0.0202436912804842,
-0.009060999378561974,
0.05467301979660988,
-0.11152848601341248,
0.024084312841296196,
0.13061784207820892,
0.057969607412815094,
0.11563778668642044,
0.02846471779048443,
-0.03106791153550148,
-0.017742855474352837,
0.016152886673808098,
-0.12144075334072113,
-0.11544656753540039,
0.04493983834981918,
-0.040482889860868454,
-0.1377372443675995,
0.009920748881995678,
0.09257030487060547,
-0.029621107503771782,
-0.020436817780137062,
-0.011301729828119278,
0.020048856735229492,
-0.01992414891719818,
0.20121008157730103,
0.04444239288568497,
0.07016391307115555,
-0.10808015614748001,
0.12585847079753876,
0.04923073202371597,
-0.057867325842380524,
0.05026974529027939,
0.06745075434446335,
-0.10529910027980804,
-0.012408333830535412,
0.12039706110954285,
0.1655173897743225,
-0.027497923001646996,
-0.01933954283595085,
-0.08226368576288223,
-0.09207988530397415,
0.062495652586221695,
0.15551072359085083,
0.05112891271710396,
-0.01692471094429493,
-0.04831080883741379,
0.02793026901781559,
-0.1257791966199875,
0.07526695728302002,
0.044720783829689026,
0.061960604041814804,
-0.09100816398859024,
0.11192825436592102,
-0.00428962055593729,
0.04148690775036812,
-0.018906259909272194,
0.026328472420573235,
-0.1035294383764267,
-0.010238622315227985,
-0.1457090675830841,
-0.0013989738654345274,
-0.006395521108061075,
0.012531265616416931,
-0.021343056112527847,
-0.052241355180740356,
-0.025237776339054108,
0.02585107460618019,
-0.09255389124155045,
-0.04971202462911606,
0.022676939144730568,
0.03642809018492699,
-0.1451815515756607,
-0.015406048856675625,
0.020470160990953445,
-0.09235400706529617,
0.08038278669118881,
0.059974320232868195,
0.011829180642962456,
0.02967204339802265,
-0.10735717415809631,
-0.04810646176338196,
0.0034547767136245966,
0.02460418827831745,
0.0880739763379097,
-0.09275468438863754,
-0.02061404660344124,
-0.03194967657327652,
0.047286372631788254,
0.01819010265171528,
0.09492120146751404,
-0.11422724276781082,
0.0087400758638978,
-0.042089659720659256,
-0.044905439019203186,
-0.06263010948896408,
0.04104316234588623,
0.10883355885744095,
0.044617004692554474,
0.16055577993392944,
-0.07144724577665329,
0.035712696611881256,
-0.18525825440883636,
-0.04414448142051697,
-0.00020867185958195478,
-0.04855659231543541,
-0.08860132098197937,
-0.048418156802654266,
0.0989641472697258,
-0.05197523906826973,
0.1098828837275505,
-0.001600058632902801,
0.0998515859246254,
0.03157533332705498,
-0.00851092766970396,
-0.054982740432024,
0.009889316745102406,
0.15552634000778198,
0.049356527626514435,
-0.01460033468902111,
0.10911153256893158,
0.002018939470872283,
0.04436694085597992,
0.06529805809259415,
0.21181349456310272,
0.15426480770111084,
0.011277761310338974,
0.049108635634183884,
0.060696739703416824,
-0.1172838807106018,
-0.134285107254982,
0.13554780185222626,
-0.04831384867429733,
0.12210482358932495,
-0.0726703628897667,
0.21949832141399384,
0.022931518033146858,
-0.18483874201774597,
0.06965145468711853,
-0.060546256601810455,
-0.12289123982191086,
-0.11607840657234192,
-0.02532340958714485,
-0.0700843408703804,
-0.10851476341485977,
0.020832760259509087,
-0.12052111327648163,
0.06234677881002426,
0.12224740535020828,
0.016241231933236122,
0.023466767743229866,
0.16430692374706268,
-0.037663206458091736,
0.020411446690559387,
0.06859614700078964,
0.017154410481452942,
-0.0059095886535942554,
-0.06281294673681259,
-0.06385856121778488,
0.044196102768182755,
0.022458821535110474,
0.07241233438253403,
-0.041076984256505966,
0.0004440177872311324,
0.019800176844000816,
-0.014399099163711071,
-0.07315292954444885,
0.01734786480665207,
0.023852000012993813,
0.04636675491929054,
0.05819619074463844,
0.05270417034626007,
0.008941124193370342,
-0.03808837756514549,
0.28732672333717346,
-0.07549859583377838,
-0.10098765045404434,
-0.1302826851606369,
0.22485561668872833,
0.011327306739985943,
-0.02275458723306656,
0.07407686114311218,
-0.09523385763168335,
-0.027505533769726753,
0.1801246851682663,
0.1477097123861313,
-0.10591261088848114,
-0.022899825125932693,
-0.019056443125009537,
-0.011652173474431038,
-0.044792383909225464,
0.13338550925254822,
0.10896875709295273,
-0.016681158915162086,
-0.0781116634607315,
-0.019431861117482185,
-0.01691090501844883,
-0.04871279001235962,
-0.0726761668920517,
0.06235935539007187,
0.026774410158395767,
-0.0036487991455942392,
-0.04390724003314972,
0.06025349348783493,
-0.004889302887022495,
-0.23736369609832764,
0.03188881650567055,
-0.16279473900794983,
-0.17753276228904724,
-0.039010148495435715,
0.05871415510773659,
-0.007081418298184872,
0.04210394248366356,
-0.008702186867594719,
0.01087475847452879,
0.13958846032619476,
-0.03181983157992363,
-0.026905614882707596,
-0.12571386992931366,
0.11658044904470444,
-0.1111331582069397,
0.20011115074157715,
0.002592296339571476,
0.0733925923705101,
0.09628886729478836,
0.01816360279917717,
-0.13329631090164185,
0.044477321207523346,
0.07363799959421158,
-0.10342350602149963,
0.010887157171964645,
0.14498397707939148,
-0.05004919692873955,
0.06928997486829758,
0.023674294352531433,
-0.11032514274120331,
-0.007189863361418247,
-0.04523565247654915,
-0.0399288684129715,
-0.07529555261135101,
-0.013096913695335388,
-0.06335283070802689,
0.1596534103155136,
0.22231046855449677,
-0.02441261149942875,
0.018140677362680435,
-0.09369905292987823,
0.01445895154029131,
0.042266011238098145,
0.04479487985372543,
-0.053733229637145996,
-0.20076636970043182,
0.02821139059960842,
0.039887685328722,
0.019424496218562126,
-0.20625095069408417,
-0.07894267141819,
0.044475361704826355,
-0.03174674138426781,
-0.050333354622125626,
0.10110683739185333,
0.030435306951403618,
0.044791191816329956,
-0.03595082834362984,
-0.10715276002883911,
-0.03681391850113869,
0.14935427904129028,
-0.15763118863105774,
-0.04016726836562157
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-2

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
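
A warmup_ratio of 0.1 over 200 training_steps implies 20 linear warmup steps followed by linear decay. The following is a small, self-contained sketch of that implied schedule, using a dummy parameter so it runs standalone.

```python
# Sketch of the linear-warmup/linear-decay schedule implied by the card.
import torch
from transformers import get_linear_schedule_with_warmup

optimizer = torch.optim.AdamW(
    [torch.nn.Parameter(torch.zeros(1))], lr=3e-5, betas=(0.9, 0.999), eps=1e-8
)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=20, num_training_steps=200  # 0.1 * 200 = 20
)
lrs = []
for _ in range(200):
    optimizer.step()
    scheduler.step()
    lrs.append(scheduler.get_last_lr()[0])
# lrs climbs linearly to 3e-5 by step 20, then decays linearly toward 0.
```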
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-32-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-2

This model is a fine-tuned version of bert-base-uncased on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10009641945362091,
0.1256990283727646,
-0.0022202918771654367,
0.08957543969154358,
0.1384621560573578,
0.03164946660399437,
0.08793316036462784,
0.13433648645877838,
-0.08519602566957474,
0.051037292927503586,
0.0708543211221695,
0.06947190314531326,
0.044836271554231644,
0.1309758722782135,
-0.0425129272043705,
-0.21348276734352112,
0.006172251887619495,
-0.013221216388046741,
-0.06567734479904175,
0.10354319959878922,
0.08982706069946289,
-0.11418122053146362,
0.0676511898636818,
-0.020447660237550735,
-0.1564837396144867,
0.007333934307098389,
-0.03059842437505722,
-0.025207631289958954,
0.10625182837247849,
-0.01716078817844391,
0.09434441477060318,
0.018817514181137085,
0.13889716565608978,
-0.2158467322587967,
0.0010120620718225837,
0.07554474472999573,
0.04773286357522011,
0.0870995819568634,
0.039683181792497635,
0.02224470116198063,
0.038087259978055954,
-0.14887826144695282,
0.09557252377271652,
0.024481942877173424,
-0.08782567828893661,
-0.13786448538303375,
-0.09129400551319122,
0.02177400141954422,
0.08249371498823166,
0.07721274346113205,
0.008436211384832859,
0.13775132596492767,
-0.10293698310852051,
0.08029644191265106,
0.19952169060707092,
-0.28699833154678345,
-0.06591468304395676,
0.043566543608903885,
0.054407160729169846,
0.07387355715036392,
-0.11468134820461273,
-0.01465236023068428,
0.020541967824101448,
0.03151588514447212,
0.11203480511903763,
-0.021009348332881927,
-0.12634830176830292,
0.00599315669387579,
-0.12819376587867737,
-0.013189462944865227,
0.10499583929777145,
0.0360863171517849,
-0.04778354614973068,
-0.07660960406064987,
-0.06164253503084183,
-0.08494316786527634,
-0.034486640244722366,
-0.018032286316156387,
0.05529655143618584,
-0.055173035711050034,
-0.06386242061853409,
-0.0481497161090374,
-0.05711254104971886,
-0.0882980152964592,
-0.0008832342573441565,
0.13944149017333984,
0.040166132152080536,
0.021512819454073906,
-0.03922472149133682,
0.1144421175122261,
0.000526605814229697,
-0.12905658781528473,
-0.004150371998548508,
0.0030090156942605972,
-0.11194458603858948,
-0.05470393970608711,
-0.03697570040822029,
0.012039699591696262,
0.016809646040201187,
0.14966662228107452,
-0.04304021596908569,
0.07449400424957275,
0.015721693634986877,
-0.016140539199113846,
-0.016865810379385948,
0.1532638669013977,
-0.03223979100584984,
-0.04187830537557602,
-0.0001599624374648556,
0.09939082711935043,
0.0010744162136688828,
-0.006520627532154322,
-0.0811205580830574,
-0.01681351102888584,
0.07461122423410416,
0.06939172744750977,
-0.05378948897123337,
0.03505450114607811,
-0.04643914848566055,
-0.023940669372677803,
0.026437096297740936,
-0.12529875338077545,
0.03839971870183945,
0.005910319276154041,
-0.08400501310825348,
-0.060758281499147415,
0.007028933148831129,
-0.008225430734455585,
-0.025065073743462563,
0.08058623224496841,
-0.0720924437046051,
0.00025423281476832926,
-0.08533116430044174,
-0.07885154336690903,
0.0006763481069356203,
-0.14554215967655182,
-0.011522897519171238,
-0.047778695821762085,
-0.19312100112438202,
-0.03156780079007149,
0.05162077769637108,
-0.07851609587669373,
-0.04117579758167267,
-0.05134261026978493,
-0.07511702179908752,
0.010530360974371433,
-0.007428572978824377,
0.1893375664949417,
-0.060527950525283813,
0.07979770749807358,
-0.01511481124907732,
0.04998304694890976,
0.01566019468009472,
0.04815953969955444,
-0.08399732410907745,
0.026926040649414062,
-0.1507912576198578,
0.08288194239139557,
-0.09855280816555023,
0.019090060144662857,
-0.12468133121728897,
-0.08754651993513107,
0.05240710824728012,
-0.02223820611834526,
0.07759786397218704,
0.1379394233226776,
-0.2003541737794876,
0.0012528698425740004,
0.11694104224443436,
-0.041856084018945694,
-0.05460650473833084,
0.07698836922645569,
-0.058997273445129395,
0.04760301858186722,
0.05331519618630409,
0.18933971226215363,
0.07171838730573654,
-0.1463119238615036,
0.01984562538564205,
0.03544537350535393,
0.05555017292499542,
0.006288336589932442,
0.03757495805621147,
-0.0022609566804021597,
0.023892221972346306,
0.007564742583781481,
-0.08571929484605789,
-0.02324472740292549,
-0.09292072802782059,
-0.07426901906728745,
-0.0511929988861084,
-0.0977947860956192,
0.03249606862664223,
0.011641091667115688,
0.023380421102046967,
-0.07139216363430023,
-0.09779158234596252,
0.10460738837718964,
0.11968865990638733,
-0.050767652690410614,
0.017691558226943016,
-0.07767672836780548,
0.026326721534132957,
-0.01678594946861267,
-0.029956022277474403,
-0.20002669095993042,
-0.11863351613283157,
0.05207568407058716,
-0.04868987575173378,
0.02663702704012394,
0.0222602728754282,
0.0655464455485344,
0.05506949499249458,
-0.04261299967765808,
-0.01805245317518711,
-0.06759008020162582,
0.003143914043903351,
-0.11542850732803345,
-0.20602700114250183,
-0.06601178646087646,
-0.037369098514318466,
0.13426285982131958,
-0.19171802699565887,
0.00225398363545537,
-0.028461402282118797,
0.11156661808490753,
0.01718558557331562,
-0.05055459961295128,
0.0111332843080163,
0.037356093525886536,
0.01864214614033699,
-0.09520181268453598,
0.056563954800367355,
0.012573068030178547,
-0.07124949991703033,
-0.04211296886205673,
-0.10921728610992432,
-0.002879437990486622,
0.06021576374769211,
0.07121247053146362,
-0.0984601303935051,
-0.01619950868189335,
-0.05343033745884895,
-0.03671104833483696,
-0.07499115914106369,
0.023887986317276955,
0.20309992134571075,
0.02943660318851471,
0.1245371401309967,
-0.0664711445569992,
-0.079588383436203,
-0.0029869850259274244,
0.014612858183681965,
0.033195171505212784,
0.09952492266893387,
0.07706950604915619,
-0.08073140680789948,
0.07771335542201996,
0.0929347574710846,
-0.0371805801987648,
0.11610382795333862,
-0.0531219020485878,
-0.07560604810714722,
-0.01870039477944374,
-0.0015297944191843271,
-0.029135316610336304,
0.14932076632976532,
-0.08308592438697815,
0.005408702418208122,
0.03884296119213104,
0.025305788964033127,
0.012809223495423794,
-0.1797635555267334,
-0.0074373106472194195,
0.014586894772946835,
-0.06372798979282379,
-0.056298401206731796,
-0.026578858494758606,
0.0368107371032238,
0.09793411195278168,
0.031609516590833664,
-0.030763784423470497,
0.021490788087248802,
-0.012390240095555782,
-0.06053312495350838,
0.19014789164066315,
-0.12027613818645477,
-0.09113277494907379,
-0.08646462857723236,
0.02288532629609108,
-0.03767131641507149,
-0.0350770428776741,
0.007990594953298569,
-0.09594374895095825,
-0.05318070203065872,
-0.08569919317960739,
-0.023466888815164566,
-0.007402350194752216,
-0.005116356071084738,
0.031118838116526604,
-0.012795712798833847,
0.07965879887342453,
-0.13265489041805267,
0.001259024953469634,
-0.03559283912181854,
-0.09516685456037521,
0.018559547141194344,
0.06716232746839523,
0.08208496868610382,
0.10251671075820923,
-0.013397909700870514,
0.015459778718650341,
-0.027232864871621132,
0.23860488831996918,
-0.06538571417331696,
0.016184136271476746,
0.09056106954813004,
-0.0020497851073741913,
0.05482897907495499,
0.1370699107646942,
0.0362609401345253,
-0.11377136409282684,
0.027301115915179253,
0.07621083408594131,
-0.01867121271789074,
-0.2488802820444107,
-0.03171946853399277,
-0.0190589539706707,
-0.08254019170999527,
0.08187432587146759,
0.03532128036022186,
-0.05601406469941139,
0.03373905271291733,
0.02034243568778038,
-0.005086430348455906,
-0.035093698650598526,
0.06582296639680862,
0.08572327345609665,
0.041047003120183945,
0.10035405308008194,
-0.024779023602604866,
-0.011577068828046322,
0.06541457027196884,
0.020029112696647644,
0.2619852125644684,
-0.03817179426550865,
0.12371517717838287,
0.03749895468354225,
0.1455680876970291,
-0.027627797797322273,
0.05053269863128662,
0.010302959941327572,
-0.005603025667369366,
-0.0062355827540159225,
-0.049322668462991714,
-0.018982447683811188,
-0.0008374185999855399,
-0.03307576850056648,
0.02289816550910473,
-0.0766705647110939,
0.03007943369448185,
0.026555072516202927,
0.3049170970916748,
0.04063345119357109,
-0.26469337940216064,
-0.0675782635807991,
0.003702172078192234,
-0.042052797973155975,
-0.07683566957712173,
0.0054117487743496895,
0.13990603387355804,
-0.12634439766407013,
0.03399163484573364,
-0.04998062178492546,
0.092300184071064,
-0.03601976856589317,
0.01088815275579691,
0.06508845835924149,
0.14490585029125214,
-0.010894747450947762,
0.07044024765491486,
-0.21323415637016296,
0.23379169404506683,
0.02622380666434765,
0.10991063714027405,
-0.06265577673912048,
0.009594585746526718,
0.004086569882929325,
0.04784107580780983,
0.11157114803791046,
0.007876136340200901,
-0.0013480990892276168,
-0.17504729330539703,
-0.09876994043588638,
0.055433135479688644,
0.11936359107494354,
-0.02440183237195015,
0.08683151751756668,
-0.037116315215826035,
-0.0016396011924371123,
0.03347976505756378,
-0.08043002337217331,
-0.12491929531097412,
-0.08813092857599258,
-0.006273162551224232,
-0.0012629040284082294,
-0.034957405179739,
-0.058102793991565704,
-0.0901416465640068,
-0.025987813249230385,
0.13512678444385529,
0.020706869661808014,
-0.05465316027402878,
-0.1352880299091339,
0.042002156376838684,
0.13786859810352325,
-0.04704613238573074,
0.01681896112859249,
0.006184613797813654,
0.10010534524917603,
0.0459693968296051,
-0.07689476013183594,
0.06564117968082428,
-0.07409001886844635,
-0.16672039031982422,
-0.05854608491063118,
0.12105856835842133,
0.08301594853401184,
0.05637793242931366,
0.0027320068329572678,
0.030737759545445442,
-0.004731012042611837,
-0.08329266309738159,
0.018829867243766785,
0.04458122327923775,
0.09102712571620941,
0.03481481224298477,
-0.09572497010231018,
0.06967223435640335,
-0.0400235578417778,
-0.01206823531538248,
0.1261163204908371,
0.2191571593284607,
-0.09529802203178406,
0.10758817940950394,
0.07615401595830917,
-0.0810616984963417,
-0.18430189788341522,
0.06983577460050583,
0.1230049803853035,
0.015163625590503216,
0.04043407365679741,
-0.20735929906368256,
0.13012491166591644,
0.10247843712568283,
-0.015754014253616333,
0.041468750685453415,
-0.3034607172012329,
-0.13068260252475739,
0.07566377520561218,
0.1062498539686203,
0.04392644762992859,
-0.12452329695224762,
-0.0222803745418787,
-0.012570064514875412,
-0.13027003407478333,
0.1400810182094574,
-0.08257870376110077,
0.11565925925970078,
-0.006113535258919001,
0.11590281128883362,
0.02714209258556366,
-0.038057226687669754,
0.1364019215106964,
0.06807302683591843,
0.09384085237979889,
-0.04293927550315857,
0.011191777884960175,
0.053997233510017395,
-0.06489524245262146,
0.03682633489370346,
-0.031124765053391457,
0.07043210417032242,
-0.16145238280296326,
-0.00047615476069040596,
-0.09265404939651489,
0.03553445264697075,
-0.05140627920627594,
-0.05594324320554733,
-0.02013472281396389,
0.053712502121925354,
0.06946435570716858,
-0.03925511986017227,
0.028139617294073105,
0.006172127090394497,
0.07351104170084,
0.08868281543254852,
0.10675223916769028,
-0.03428105264902115,
-0.10113218426704407,
0.01864217221736908,
-0.009226902388036251,
0.054789379239082336,
-0.1103542149066925,
0.025623463094234467,
0.12941919267177582,
0.05691250041127205,
0.11560487002134323,
0.028268972411751747,
-0.031263068318367004,
-0.016346652060747147,
0.01575295254588127,
-0.12281082570552826,
-0.11710875481367111,
0.04402993246912956,
-0.039517659693956375,
-0.1378355622291565,
0.006924228277057409,
0.09521577507257462,
-0.02897953800857067,
-0.020004786550998688,
-0.011363296769559383,
0.020626740530133247,
-0.020508071407675743,
0.19918213784694672,
0.04329778626561165,
0.07005398720502853,
-0.10676795989274979,
0.12590374052524567,
0.04918823763728142,
-0.05601511150598526,
0.050215017050504684,
0.06593455374240875,
-0.10418938845396042,
-0.011616673320531845,
0.1211290955543518,
0.1654648333787918,
-0.030720124021172523,
-0.018772421404719353,
-0.08163052797317505,
-0.09227854758501053,
0.06147613748908043,
0.15395092964172363,
0.052278727293014526,
-0.01580904610455036,
-0.04840558022260666,
0.027247361838817596,
-0.12449229508638382,
0.07528195530176163,
0.045803502202034,
0.062098413705825806,
-0.09170711040496826,
0.11187612265348434,
-0.004760186653584242,
0.04325992614030838,
-0.01926409639418125,
0.02488505281507969,
-0.10333438962697983,
-0.01053887140005827,
-0.14763475954532623,
0.0016126936534419656,
-0.003742740023881197,
0.012540261261165142,
-0.02020549215376377,
-0.05310852453112602,
-0.02503214217722416,
0.02697741985321045,
-0.09258003532886505,
-0.04967428371310234,
0.022274626418948174,
0.03773655369877815,
-0.14462488889694214,
-0.01627049781382084,
0.02214314416050911,
-0.09278017282485962,
0.08176348358392715,
0.060719527304172516,
0.012304472737014294,
0.0287665706127882,
-0.10539714246988297,
-0.04832151159644127,
0.0034685328137129545,
0.024852922186255455,
0.0873304083943367,
-0.09378416836261749,
-0.02192421443760395,
-0.03165977820754051,
0.04650527611374855,
0.018174903467297554,
0.09850052744150162,
-0.11509837955236435,
0.009686687961220741,
-0.04234512522816658,
-0.0456572026014328,
-0.06222290173172951,
0.03936471790075302,
0.10767614841461182,
0.047369588166475296,
0.16022427380084991,
-0.0730581283569336,
0.03564288839697838,
-0.18490520119667053,
-0.04377223551273346,
-0.0002203913318226114,
-0.04642295464873314,
-0.09014840424060822,
-0.04888510704040527,
0.09753859043121338,
-0.051897529512643814,
0.10854669660329819,
-0.0014801181387156248,
0.09758450090885162,
0.03171166032552719,
-0.010900492779910564,
-0.05235831066966057,
0.008877802640199661,
0.15386012196540833,
0.049229320138692856,
-0.013737601228058338,
0.11003366112709045,
0.0002806582779157907,
0.04546027258038521,
0.06462743878364563,
0.21381622552871704,
0.1540675312280655,
0.011402761563658714,
0.04894164949655533,
0.06008307263255119,
-0.11591758579015732,
-0.13667209446430206,
0.13394859433174133,
-0.04772587865591049,
0.12195468693971634,
-0.0708659291267395,
0.21715284883975983,
0.02350655011832714,
-0.18558500707149506,
0.06806249916553497,
-0.06028072163462639,
-0.12283246964216232,
-0.11794131249189377,
-0.02418007329106331,
-0.07126043736934662,
-0.10904195159673691,
0.02028462663292885,
-0.12055277824401855,
0.063176728785038,
0.12203221768140793,
0.01520331297069788,
0.024020100012421608,
0.16145628690719604,
-0.03691909834742546,
0.01994401216506958,
0.06773065030574799,
0.01659717597067356,
-0.004988509230315685,
-0.06134134903550148,
-0.06547975540161133,
0.04404482617974281,
0.024512575939297676,
0.07167353481054306,
-0.039650946855545044,
0.0020038855727761984,
0.01862630806863308,
-0.01591823250055313,
-0.07333824783563614,
0.016626210883259773,
0.024359511211514473,
0.045937102288007736,
0.05709584057331085,
0.05331180617213249,
0.00898209773004055,
-0.03732350468635559,
0.2889796197414398,
-0.07463882863521576,
-0.10028880089521408,
-0.129481241106987,
0.22777394950389862,
0.009035698138177395,
-0.02201039344072342,
0.07478569447994232,
-0.09419705718755722,
-0.029663901776075363,
0.17855483293533325,
0.14631105959415436,
-0.10966060310602188,
-0.022454623132944107,
-0.02002057060599327,
-0.011589515022933483,
-0.04377225786447525,
0.1356692612171173,
0.1084570363163948,
-0.015344085171818733,
-0.07928808033466339,
-0.020457709208130836,
-0.01899465173482895,
-0.046907972544431686,
-0.07408825308084488,
0.06250522285699844,
0.02509288117289543,
-0.002220905851572752,
-0.04506519436836243,
0.05857930704951286,
-0.005876694805920124,
-0.23662596940994263,
0.03117128647863865,
-0.16178786754608154,
-0.1778365522623062,
-0.03790871053934097,
0.05956155061721802,
-0.009176055900752544,
0.04180826619267464,
-0.010586831718683243,
0.010982966981828213,
0.14048831164836884,
-0.03278452530503273,
-0.026790615171194077,
-0.12372810393571854,
0.11744919419288635,
-0.11110932379961014,
0.2014789581298828,
0.003988184966146946,
0.07411840558052063,
0.09525182843208313,
0.019371148198843002,
-0.13442900776863098,
0.04363350570201874,
0.07314509898424149,
-0.10353510081768036,
0.010262295603752136,
0.14523202180862427,
-0.05003729462623596,
0.0708436518907547,
0.025685453787446022,
-0.10925144702196121,
-0.008665111847221851,
-0.04515756294131279,
-0.03981144353747368,
-0.07457026839256287,
-0.01618810184299946,
-0.06576497852802277,
0.15862980484962463,
0.22139032185077667,
-0.02484690397977829,
0.018389040604233742,
-0.09283095598220825,
0.014935212209820747,
0.04250815138220787,
0.04702004790306091,
-0.05274408310651779,
-0.20048479735851288,
0.027798008173704147,
0.04186687618494034,
0.019187694415450096,
-0.20708279311656952,
-0.08119030296802521,
0.044541675597429276,
-0.03035042993724346,
-0.04951048269867897,
0.10279091447591782,
0.028734315186738968,
0.04429551959037781,
-0.03526781499385834,
-0.10892102122306824,
-0.0376041941344738,
0.14856423437595367,
-0.15664267539978027,
-0.040619585663080215
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
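
As a reading aid, here is a minimal sketch of how the hyperparameters above might be expressed with the Hugging Face `TrainingArguments` API. The `output_dir` value is a hypothetical placeholder, and the exact Trainer invocation used by the authors is not documented in this card.

```python
from transformers import TrainingArguments

# Hedged sketch: mirrors the hyperparameters listed above.
# "output_dir" is a placeholder, not the authors' actual path.
training_args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4",
    learning_rate=3e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```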
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10059002041816711,
0.12500929832458496,
-0.002197258174419403,
0.09042049944400787,
0.13953039050102234,
0.03228902816772461,
0.08883154392242432,
0.1336555778980255,
-0.08569686114788055,
0.050440333783626556,
0.07063678652048111,
0.06907809525728226,
0.04437291994690895,
0.13049477338790894,
-0.04212666302919388,
-0.21349219977855682,
0.006061105988919735,
-0.012758232653141022,
-0.0651671290397644,
0.10350096225738525,
0.0885697528719902,
-0.11483732610940933,
0.06804893910884857,
-0.02110867016017437,
-0.15671740472316742,
0.008090401068329811,
-0.03130769357085228,
-0.025589151307940483,
0.10617996752262115,
-0.017329039052128792,
0.0944000855088234,
0.018466001376509666,
0.13890497386455536,
-0.21437756717205048,
0.00118600286077708,
0.07565196603536606,
0.04686957970261574,
0.08644416928291321,
0.039501726627349854,
0.02191937156021595,
0.037651177495718,
-0.14907366037368774,
0.09597848355770111,
0.024237025529146194,
-0.08784418553113937,
-0.13886979222297668,
-0.09087604284286499,
0.020889880135655403,
0.08224021643400192,
0.07844825088977814,
0.007735928054898977,
0.13666488230228424,
-0.1037486344575882,
0.08045489341020584,
0.19679051637649536,
-0.2882062494754791,
-0.06621301174163818,
0.0435970276594162,
0.05426023155450821,
0.07397052645683289,
-0.11543918401002884,
-0.01408100500702858,
0.020511049777269363,
0.0327659547328949,
0.11227644234895706,
-0.021461354568600655,
-0.12640666961669922,
0.006443609017878771,
-0.12774136662483215,
-0.011568627320230007,
0.10578562319278717,
0.03616495802998543,
-0.047660164535045624,
-0.07779178023338318,
-0.0600939504802227,
-0.08611724525690079,
-0.034759413450956345,
-0.017147010192275047,
0.05545349046587944,
-0.05663880705833435,
-0.06462231278419495,
-0.04602796211838722,
-0.05686662346124649,
-0.08715031296014786,
-0.00029771108529530466,
0.13768641650676727,
0.04011295735836029,
0.021340545266866684,
-0.038465715944767,
0.1136484295129776,
0.0005070972256362438,
-0.128448024392128,
-0.003114357590675354,
0.002195216016843915,
-0.11180981248617172,
-0.05471282824873924,
-0.037329040467739105,
0.013995685614645481,
0.016950085759162903,
0.15015386044979095,
-0.04321683198213577,
0.0754149854183197,
0.016563843935728073,
-0.01745067909359932,
-0.01679408550262451,
0.15103566646575928,
-0.033823300153017044,
-0.04356583580374718,
-0.0008652389515191317,
0.09994656592607498,
0.0002988513733725995,
-0.006862661801278591,
-0.08097968250513077,
-0.015603844076395035,
0.07343853265047073,
0.06896094232797623,
-0.05521318316459656,
0.037001751363277435,
-0.04501679912209511,
-0.023703761398792267,
0.02690150775015354,
-0.1253965049982071,
0.03822030872106552,
0.006059171166270971,
-0.08427499979734421,
-0.05992744490504265,
0.006128645036369562,
-0.009171806275844574,
-0.026052413508296013,
0.08097204566001892,
-0.07293684780597687,
-0.0006283841794356704,
-0.08564480394124985,
-0.0798962339758873,
0.00033404745045118034,
-0.1453145295381546,
-0.010561134666204453,
-0.04722427576780319,
-0.19332480430603027,
-0.03297758847475052,
0.05158323422074318,
-0.0786992534995079,
-0.03989722952246666,
-0.05125467851758003,
-0.07612862437963486,
0.010098005644977093,
-0.007531976327300072,
0.19012829661369324,
-0.061054445803165436,
0.07863897830247879,
-0.013580071739852428,
0.04995693638920784,
0.015374035574495792,
0.04820200428366661,
-0.08343925327062607,
0.026073234155774117,
-0.15094171464443207,
0.08174706995487213,
-0.09934064745903015,
0.02069098874926567,
-0.12329398095607758,
-0.08919830620288849,
0.05345814675092697,
-0.021135292947292328,
0.07693721354007721,
0.13771240413188934,
-0.19954609870910645,
0.0007951041334308684,
0.11604355275630951,
-0.04104923456907272,
-0.054673343896865845,
0.07659357041120529,
-0.05901799723505974,
0.04573315009474754,
0.05389771983027458,
0.18921341001987457,
0.07184973359107971,
-0.14588770270347595,
0.01811019890010357,
0.033845189958810806,
0.05611693859100342,
0.005813088733702898,
0.03623415529727936,
-0.001017163973301649,
0.024667110294103622,
0.008358730003237724,
-0.08412498980760574,
-0.023593150079250336,
-0.09209905564785004,
-0.07386372238397598,
-0.050012536346912384,
-0.09763714671134949,
0.032398857176303864,
0.012383123859763145,
0.02341064251959324,
-0.07266122847795486,
-0.0985964983701706,
0.10388436168432236,
0.11933119595050812,
-0.05141669139266014,
0.015959059819579124,
-0.07831346243619919,
0.026911601424217224,
-0.01685437746345997,
-0.029302846640348434,
-0.19932310283184052,
-0.12006473541259766,
0.05083897337317467,
-0.04622279480099678,
0.026359284296631813,
0.022695045918226242,
0.0665488988161087,
0.055747050791978836,
-0.043356433510780334,
-0.018644552677869797,
-0.06670605391263962,
0.0036521118599921465,
-0.11482170224189758,
-0.20640161633491516,
-0.06686285883188248,
-0.03715154528617859,
0.1350163370370865,
-0.19306471943855286,
0.002785990247502923,
-0.027459831908345222,
0.11110692471265793,
0.016779251396656036,
-0.050203435122966766,
0.01216411218047142,
0.038749340921640396,
0.01977268047630787,
-0.09491634368896484,
0.05731670930981636,
0.012087896466255188,
-0.06957029551267624,
-0.043737463653087616,
-0.10977467894554138,
-0.0010817560832947493,
0.06162510812282562,
0.07037795335054398,
-0.0987081527709961,
-0.0160350538790226,
-0.053242027759552,
-0.03663801774382591,
-0.07334788143634796,
0.02360227331519127,
0.2040616124868393,
0.028868762776255608,
0.12411990016698837,
-0.06534848362207413,
-0.07923030853271484,
-0.002441795077174902,
0.01640789397060871,
0.0341460295021534,
0.09916742146015167,
0.0762101411819458,
-0.07803335785865784,
0.07817701995372772,
0.09188788384199142,
-0.03722091391682625,
0.11694423854351044,
-0.053368669003248215,
-0.07530059665441513,
-0.017920074984431267,
-0.002702275989577174,
-0.02961082011461258,
0.15039660036563873,
-0.08396673202514648,
0.003953934647142887,
0.03853581100702286,
0.024729806929826736,
0.013177486136555672,
-0.1788855791091919,
-0.007344071287661791,
0.013397546485066414,
-0.0627397745847702,
-0.057695407420396805,
-0.026343178004026413,
0.03624516725540161,
0.09757034480571747,
0.0315389446914196,
-0.03204445540904999,
0.021617382764816284,
-0.012425060383975506,
-0.060066238045692444,
0.19031471014022827,
-0.12040869146585464,
-0.09029687941074371,
-0.08615200966596603,
0.021045932546257973,
-0.03885277360677719,
-0.035623833537101746,
0.00764549570158124,
-0.09776926040649414,
-0.053593654185533524,
-0.0855085477232933,
-0.024683931842446327,
-0.007120931055396795,
-0.005334971006959677,
0.030769506469368935,
-0.012786397710442543,
0.0791582465171814,
-0.13220520317554474,
0.0014096508966758847,
-0.03590104728937149,
-0.09551571309566498,
0.01928553357720375,
0.06787048280239105,
0.08222377300262451,
0.10201729089021683,
-0.012950857169926167,
0.015666475519537926,
-0.02746817283332348,
0.23895584046840668,
-0.06562316417694092,
0.01646055094897747,
0.09032239019870758,
-0.0009865104220807552,
0.05338729918003082,
0.13724108040332794,
0.03737277165055275,
-0.11420917510986328,
0.026858719065785408,
0.07707708328962326,
-0.01808900758624077,
-0.24893538653850555,
-0.03102324716746807,
-0.0195135697722435,
-0.0822531059384346,
0.08191178739070892,
0.03527218848466873,
-0.055124811828136444,
0.03426475077867508,
0.021352095529437065,
-0.004073701798915863,
-0.03466039150953293,
0.06529715657234192,
0.08739478141069412,
0.04050678387284279,
0.10119279474020004,
-0.024982217699289322,
-0.01180213876068592,
0.06463414430618286,
0.020470328629016876,
0.2628067433834076,
-0.03855793550610542,
0.12316989153623581,
0.0379178486764431,
0.1448792964220047,
-0.0274514053016901,
0.05176374316215515,
0.009865088388323784,
-0.006273450329899788,
-0.005757268518209457,
-0.04934978485107422,
-0.017538461834192276,
-0.0015055579133331776,
-0.03412019461393356,
0.02197783812880516,
-0.077144093811512,
0.028117772191762924,
0.026902101933956146,
0.3044404089450836,
0.04084156081080437,
-0.26719310879707336,
-0.06848116219043732,
0.0031137941405177116,
-0.04191473498940468,
-0.07613120228052139,
0.005637133494019508,
0.1395753175020218,
-0.12570707499980927,
0.03544003888964653,
-0.05025891214609146,
0.09167872369289398,
-0.036050956696271896,
0.011921809986233711,
0.0675676017999649,
0.1454712450504303,
-0.011541318148374557,
0.07004246860742569,
-0.2133527398109436,
0.23230589926242828,
0.026380019262433052,
0.11097454279661179,
-0.06311068683862686,
0.009814193472266197,
0.004738363903015852,
0.048850856721401215,
0.11136776953935623,
0.007109807804226875,
-0.0013290437636896968,
-0.17514818906784058,
-0.0976463332772255,
0.05671287328004837,
0.11867125332355499,
-0.023060210049152374,
0.0879461020231247,
-0.03605348616838455,
-0.0022766755428165197,
0.03296336159110069,
-0.08182254433631897,
-0.12530481815338135,
-0.08915411680936813,
-0.006556086707860231,
-0.00040460473974235356,
-0.03462563455104828,
-0.05760754272341728,
-0.09086567163467407,
-0.027679787948727608,
0.13464365899562836,
0.021704046055674553,
-0.054947126656770706,
-0.13587480783462524,
0.041206154972314835,
0.13804025948047638,
-0.04549798741936684,
0.01690909080207348,
0.007219226099550724,
0.09912148118019104,
0.046990588307380676,
-0.07657688111066818,
0.065713070333004,
-0.07478193193674088,
-0.165762796998024,
-0.05848056450486183,
0.12060777097940445,
0.08231721818447113,
0.05565350130200386,
0.0023292924743145704,
0.030954798683524132,
-0.004963403567671776,
-0.08386486768722534,
0.019814081490039825,
0.04248499870300293,
0.09250757843255997,
0.034227386116981506,
-0.09660565108060837,
0.06909237056970596,
-0.03931543976068497,
-0.01094623003154993,
0.12383905053138733,
0.21649160981178284,
-0.09431768208742142,
0.10589268803596497,
0.07672321051359177,
-0.08034033328294754,
-0.1837453842163086,
0.07079703360795975,
0.12145741283893585,
0.015621431171894073,
0.03870200738310814,
-0.20848245918750763,
0.13135120272636414,
0.10207812488079071,
-0.014541069976985455,
0.04366525635123253,
-0.30036741495132446,
-0.1299130916595459,
0.07562733441591263,
0.1070866659283638,
0.04395608603954315,
-0.1248866617679596,
-0.021580073982477188,
-0.012645455077290535,
-0.13005468249320984,
0.1393153816461563,
-0.08599383383989334,
0.11571727693080902,
-0.006379265803843737,
0.11619896441698074,
0.02633262239396572,
-0.038383662700653076,
0.13539676368236542,
0.0688379630446434,
0.09456286579370499,
-0.043299976736307144,
0.011811107397079468,
0.05412176251411438,
-0.06412417441606522,
0.03695206716656685,
-0.03213416039943695,
0.06940558552742004,
-0.16198821365833282,
-0.0005229134694673121,
-0.09357430040836334,
0.0350411981344223,
-0.05127795785665512,
-0.05567916855216026,
-0.018774287775158882,
0.05424462631344795,
0.06817369163036346,
-0.03949281945824623,
0.02670103870332241,
0.005377188324928284,
0.0745009258389473,
0.08761276304721832,
0.10783840715885162,
-0.03560694307088852,
-0.10187094658613205,
0.01970287412405014,
-0.009754345752298832,
0.05456022918224335,
-0.10974452644586563,
0.024309126660227776,
0.13035696744918823,
0.05606812238693237,
0.11557363718748093,
0.029147444292902946,
-0.030287200585007668,
-0.01686304435133934,
0.01707196608185768,
-0.12320024520158768,
-0.11573381721973419,
0.04438409209251404,
-0.04347489774227142,
-0.1376296877861023,
0.008441089652478695,
0.09518033266067505,
-0.028820529580116272,
-0.019894013181328773,
-0.011291293427348137,
0.02044529654085636,
-0.020787661895155907,
0.20045213401317596,
0.04321393370628357,
0.06985367089509964,
-0.10791021585464478,
0.1256292760372162,
0.04915747791528702,
-0.0574176162481308,
0.05017548054456711,
0.06745672971010208,
-0.1053123027086258,
-0.012912656180560589,
0.1195116862654686,
0.1667962223291397,
-0.028805116191506386,
-0.019788509234786034,
-0.08307371288537979,
-0.0928945392370224,
0.06157722696661949,
0.1522766351699829,
0.05160580575466156,
-0.016850238665938377,
-0.048531945794820786,
0.027022553607821465,
-0.12513351440429688,
0.07502993941307068,
0.0444684699177742,
0.06219126656651497,
-0.09173046052455902,
0.1113363653421402,
-0.004466901998966932,
0.04248248413205147,
-0.019271669909358025,
0.025516917929053307,
-0.10403763502836227,
-0.010619746521115303,
-0.14783123135566711,
0.00033544356119818985,
-0.004569123964756727,
0.013111601583659649,
-0.020429100841283798,
-0.05163111165165901,
-0.025901777669787407,
0.02709846757352352,
-0.09278769791126251,
-0.04916081950068474,
0.023943590000271797,
0.03825509548187256,
-0.14409571886062622,
-0.01569647341966629,
0.020736785605549812,
-0.09303688257932663,
0.0818050280213356,
0.06061074137687683,
0.012035947293043137,
0.02946208044886589,
-0.1032865047454834,
-0.04892808943986893,
0.002930042101070285,
0.02395152859389782,
0.08807691186666489,
-0.0934276208281517,
-0.021259423345327377,
-0.03167952597141266,
0.046848785132169724,
0.01859136112034321,
0.09737221151590347,
-0.11421448737382889,
0.009941262193024158,
-0.04149554669857025,
-0.04489023983478546,
-0.06292599439620972,
0.040196459740400314,
0.10869429260492325,
0.046233005821704865,
0.16073332726955414,
-0.07325703650712967,
0.0349792018532753,
-0.1849963665008545,
-0.04421268031001091,
-0.0004729267966467887,
-0.047729119658470154,
-0.08907470107078552,
-0.04841586947441101,
0.09840953350067139,
-0.052619460970163345,
0.1101486012339592,
-0.0018340593669563532,
0.09807279706001282,
0.031395070254802704,
-0.011236178688704967,
-0.054001014679670334,
0.009934942238032818,
0.15323258936405182,
0.04877496883273125,
-0.014654099009931087,
0.10909543186426163,
0.0012013352243229747,
0.04431403428316116,
0.06634540110826492,
0.21220122277736664,
0.15322214365005493,
0.012441147118806839,
0.04904724657535553,
0.06096780672669411,
-0.11700411140918732,
-0.1355261355638504,
0.1351110339164734,
-0.04747864603996277,
0.12284516543149948,
-0.07217071205377579,
0.21565546095371246,
0.02280685119330883,
-0.18475081026554108,
0.06843054294586182,
-0.06157422065734863,
-0.1230175569653511,
-0.11645405739545822,
-0.023146454244852066,
-0.07031761854887009,
-0.10972028970718384,
0.020541295409202576,
-0.12109676748514175,
0.06191620975732803,
0.1231137216091156,
0.01621624082326889,
0.02353762835264206,
0.16299641132354736,
-0.03585348278284073,
0.020970480516552925,
0.06787502020597458,
0.01634332910180092,
-0.005368741229176521,
-0.060646891593933105,
-0.06424536556005478,
0.04313809052109718,
0.022826872766017914,
0.07138644903898239,
-0.040452416986227036,
0.0011931824265047908,
0.0193207785487175,
-0.0149683291092515,
-0.072864830493927,
0.016684850677847862,
0.02437201701104641,
0.045844294130802155,
0.0584569051861763,
0.052805155515670776,
0.008148454129695892,
-0.037509188055992126,
0.28768619894981384,
-0.07484156638383865,
-0.09890732169151306,
-0.13073121011257172,
0.22731511294841766,
0.00886810664087534,
-0.02254064939916134,
0.0739494264125824,
-0.09364302456378937,
-0.028447313234210014,
0.17987631261348724,
0.14876329898834229,
-0.10883186757564545,
-0.022800078615546227,
-0.01931845210492611,
-0.01158073265105486,
-0.04384583607316017,
0.1357635259628296,
0.10834918171167374,
-0.015397615730762482,
-0.07919089496135712,
-0.020258266478776932,
-0.018091732636094093,
-0.04697507992386818,
-0.07359025627374649,
0.062287211418151855,
0.026271862909197807,
-0.0024520198348909616,
-0.04445517435669899,
0.058853041380643845,
-0.005754734389483929,
-0.23672671616077423,
0.03262536600232124,
-0.1612883359193802,
-0.17839346826076508,
-0.03887354955077171,
0.05889199674129486,
-0.008353560231626034,
0.04275421053171158,
-0.010211905464529991,
0.010410856455564499,
0.14143167436122894,
-0.03277568146586418,
-0.026100926101207733,
-0.12477956712245941,
0.11771537363529205,
-0.11070110648870468,
0.2002178430557251,
0.003147836308926344,
0.07410947978496552,
0.0959596186876297,
0.018856368958950043,
-0.1342310756444931,
0.04423155635595322,
0.07260098308324814,
-0.10411285609006882,
0.01074159238487482,
0.14485566318035126,
-0.05012129247188568,
0.07017149776220322,
0.024347344413399696,
-0.10838636755943298,
-0.007940764538943768,
-0.04470551386475563,
-0.039688341319561005,
-0.07434465736150742,
-0.015584820881485939,
-0.06530167907476425,
0.15892496705055237,
0.2230718433856964,
-0.024653425440192223,
0.018315467983484268,
-0.09279049187898636,
0.0152655728161335,
0.04288771003484726,
0.04548504576086998,
-0.0534263476729393,
-0.20074622333049774,
0.02804131992161274,
0.04202011600136757,
0.018890999257564545,
-0.2085210084915161,
-0.07980505377054214,
0.04474843665957451,
-0.030072828754782677,
-0.05013113468885422,
0.10270582884550095,
0.03008767031133175,
0.04532745108008385,
-0.03606759011745453,
-0.10601568967103958,
-0.03753466159105301,
0.14819596707820892,
-0.1569346785545349,
-0.04114297032356262
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
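
Pending details from the authors, a minimal usage sketch follows, assuming the standard `transformers` question-answering pipeline; the question and context below are illustrative only, not drawn from the SQuAD data.

```python
from transformers import pipeline

# Hedged sketch: load this checkpoint with the standard QA pipeline.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6",
)

# Illustrative question/context pair.
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of bert-base-uncased "
            "on the squad dataset.",
)
print(result["answer"], result["score"])
```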
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10052882134914398,
0.12482006102800369,
-0.002168304054066539,
0.09001113474369049,
0.13981685042381287,
0.031687140464782715,
0.08840201050043106,
0.13435040414333344,
-0.08455505967140198,
0.05097340792417526,
0.06992325931787491,
0.0704675242304802,
0.044862050563097,
0.13028568029403687,
-0.042196307331323624,
-0.21311312913894653,
0.00594720896333456,
-0.012349163182079792,
-0.06517915427684784,
0.10357366502285004,
0.08876092731952667,
-0.11517057567834854,
0.06715679168701172,
-0.021483544260263443,
-0.15699198842048645,
0.008225343190133572,
-0.03179548308253288,
-0.025159308686852455,
0.1058829128742218,
-0.01785506308078766,
0.0937228575348854,
0.01850675791501999,
0.13859103620052338,
-0.21529816091060638,
0.0011870310408994555,
0.0761040449142456,
0.04714938998222351,
0.08669096231460571,
0.040445804595947266,
0.022105658426880836,
0.039046842604875565,
-0.1489865928888321,
0.09580399096012115,
0.02484075166285038,
-0.0877557247877121,
-0.1372213512659073,
-0.09138479828834534,
0.02031531184911728,
0.08255931735038757,
0.07905066013336182,
0.007288611959666014,
0.13817010819911957,
-0.10391683876514435,
0.08083291351795197,
0.19921453297138214,
-0.28635022044181824,
-0.06618794798851013,
0.04458383098244667,
0.05434524640440941,
0.07286668568849564,
-0.11659448593854904,
-0.015042552724480629,
0.020509466528892517,
0.03222865238785744,
0.11115869134664536,
-0.020746150985360146,
-0.12798281013965607,
0.006053626071661711,
-0.1281643807888031,
-0.01178035605698824,
0.1042771115899086,
0.036412451416254044,
-0.046855878084897995,
-0.07776842266321182,
-0.06067395955324173,
-0.08545485138893127,
-0.03472509980201721,
-0.017063766717910767,
0.05551072955131531,
-0.05670673027634621,
-0.06395301222801208,
-0.045877598226070404,
-0.05646991729736328,
-0.08818500488996506,
0.00016560613585170358,
0.13732069730758667,
0.04017366096377373,
0.02081179805099964,
-0.039383359253406525,
0.1134718656539917,
0.000925944244954735,
-0.12913289666175842,
-0.0042073666118085384,
0.0031237483490258455,
-0.11200065165758133,
-0.05482826754450798,
-0.036665551364421844,
0.011835361830890179,
0.01660437323153019,
0.1480293720960617,
-0.043222248554229736,
0.07596567273139954,
0.01563335955142975,
-0.017349159345030785,
-0.017311332747340202,
0.15170063078403473,
-0.032734181731939316,
-0.041755203157663345,
-0.0015547216171398759,
0.09992774575948715,
-0.0002932910865638405,
-0.006426219362765551,
-0.08026974648237228,
-0.015705421566963196,
0.07430172711610794,
0.06888432055711746,
-0.054218001663684845,
0.03643179312348366,
-0.04561012238264084,
-0.023887719959020615,
0.027485225349664688,
-0.12529323995113373,
0.03852760046720505,
0.005583297926932573,
-0.08464492112398148,
-0.0611928254365921,
0.006348788272589445,
-0.009420379996299744,
-0.026285342872142792,
0.08142966032028198,
-0.07322689890861511,
-0.00038841081550344825,
-0.08642522990703583,
-0.07945765554904938,
0.0009232826996594667,
-0.14682643115520477,
-0.010841057635843754,
-0.04628153517842293,
-0.1936890035867691,
-0.03320391848683357,
0.050844211131334305,
-0.07875827699899673,
-0.039432816207408905,
-0.05192267894744873,
-0.07664280384778976,
0.01054983027279377,
-0.006978638935834169,
0.1910642683506012,
-0.06054680794477463,
0.07949158549308777,
-0.013884227722883224,
0.04964805766940117,
0.016393104568123817,
0.04897176846861839,
-0.0847279503941536,
0.02575090155005455,
-0.15029948949813843,
0.08208886533975601,
-0.10045262426137924,
0.020799992606043816,
-0.12432678043842316,
-0.08822702616453171,
0.052256882190704346,
-0.0218997523188591,
0.07782424986362457,
0.13862885534763336,
-0.19981102645397186,
0.0011202751193195581,
0.11670256406068802,
-0.04153421148657799,
-0.05516046658158302,
0.07541333884000778,
-0.05868598446249962,
0.045533791184425354,
0.053351666778326035,
0.1896381378173828,
0.07145638018846512,
-0.14562183618545532,
0.01719255931675434,
0.0333406999707222,
0.05696616694331169,
0.005573966074734926,
0.03586360812187195,
-0.0012728030560538173,
0.026134680956602097,
0.008151483722031116,
-0.08404062688350677,
-0.023656565696001053,
-0.092459537088871,
-0.0734093189239502,
-0.050956904888153076,
-0.09770942479372025,
0.03163081780076027,
0.014134228229522705,
0.023027412593364716,
-0.07218720763921738,
-0.09795243293046951,
0.1038554310798645,
0.11917141079902649,
-0.05078161507844925,
0.01631655916571617,
-0.07744324207305908,
0.025804219767451286,
-0.017153024673461914,
-0.029164789244532585,
-0.2009994089603424,
-0.1206146776676178,
0.05149133875966072,
-0.04723415523767471,
0.026644377037882805,
0.021781770512461662,
0.06636754423379898,
0.054615605622529984,
-0.04342431575059891,
-0.01870853640139103,
-0.06737672537565231,
0.0031432954128831625,
-0.11525386571884155,
-0.20656952261924744,
-0.06679800152778625,
-0.0379166379570961,
0.13233260810375214,
-0.19241079688072205,
0.0025133811868727207,
-0.02762620709836483,
0.1119857132434845,
0.01695009134709835,
-0.05087978020310402,
0.012476041913032532,
0.038373932242393494,
0.019409233704209328,
-0.09536393731832504,
0.05697004497051239,
0.011215250007808208,
-0.06943294405937195,
-0.04357457160949707,
-0.10979991406202316,
-0.003158355364575982,
0.06105959415435791,
0.07189111411571503,
-0.09897095710039139,
-0.016254890710115433,
-0.05320193991065025,
-0.03691331669688225,
-0.07440966367721558,
0.02466360479593277,
0.20339997112751007,
0.029300935566425323,
0.12351255118846893,
-0.06575346738100052,
-0.07983063906431198,
-0.0020161173306405544,
0.016500653699040413,
0.03338823840022087,
0.10077334195375443,
0.07781217992305756,
-0.0803699865937233,
0.07931370288133621,
0.092558354139328,
-0.036155376583337784,
0.11738145351409912,
-0.05385135859251022,
-0.07572189718484879,
-0.016389425843954086,
-0.002158159390091896,
-0.029381901025772095,
0.1499292552471161,
-0.08362564444541931,
0.00440572015941143,
0.038485098630189896,
0.024894315749406815,
0.01290617324411869,
-0.17890627682209015,
-0.007413068786263466,
0.013358657248318195,
-0.06260371953248978,
-0.0578756108880043,
-0.026111191138625145,
0.03633474186062813,
0.09786642342805862,
0.031100064516067505,
-0.0311429500579834,
0.020846180617809296,
-0.01256600022315979,
-0.05990620702505112,
0.1911919116973877,
-0.12016955763101578,
-0.08937789499759674,
-0.08468269556760788,
0.02210530824959278,
-0.038312822580337524,
-0.03590568155050278,
0.007837467826902866,
-0.09892436861991882,
-0.05348935350775719,
-0.08512061834335327,
-0.02431858517229557,
-0.006626178044825792,
-0.006325094029307365,
0.030078008770942688,
-0.0125938281416893,
0.07857754826545715,
-0.1330949068069458,
0.001713455538265407,
-0.036229874938726425,
-0.09545407444238663,
0.018731387332081795,
0.06744891405105591,
0.08241043984889984,
0.10243667662143707,
-0.013402124866843224,
0.015593516640365124,
-0.027366938069462776,
0.23811036348342896,
-0.0663028359413147,
0.01593272015452385,
0.09011347591876984,
0.00010873340215766802,
0.054121389985084534,
0.13715241849422455,
0.03668646141886711,
-0.11423806846141815,
0.02726663462817669,
0.07771565020084381,
-0.01836000569164753,
-0.25010961294174194,
-0.03129742667078972,
-0.01972191222012043,
-0.08266647160053253,
0.08196520805358887,
0.03575573116540909,
-0.05528688058257103,
0.03411319479346275,
0.02025528810918331,
-0.005827226210385561,
-0.03480224311351776,
0.06579393893480301,
0.08521094918251038,
0.04145519435405731,
0.1008218452334404,
-0.025085724890232086,
-0.011169438250362873,
0.06380270421504974,
0.020508142188191414,
0.26362523436546326,
-0.03886685520410538,
0.12327172607183456,
0.03754895552992821,
0.14467118680477142,
-0.02811155654489994,
0.05177543684840202,
0.009385199286043644,
-0.0063384538516402245,
-0.005505652166903019,
-0.04919564723968506,
-0.01773093082010746,
-0.0016986490227282047,
-0.03498135879635811,
0.02245279774069786,
-0.07680689543485641,
0.028358273208141327,
0.02645021304488182,
0.3048763871192932,
0.040694158524274826,
-0.2657751739025116,
-0.06759385764598846,
0.003025342943146825,
-0.04221002757549286,
-0.07685813307762146,
0.005243822932243347,
0.13865001499652863,
-0.12541426718235016,
0.03519643843173981,
-0.05057038739323616,
0.0928003117442131,
-0.034561093896627426,
0.011142855510115623,
0.0667862594127655,
0.14554670453071594,
-0.011474440805613995,
0.07061544805765152,
-0.2145359367132187,
0.23429498076438904,
0.026422660797834396,
0.11063725501298904,
-0.06368646025657654,
0.009573977440595627,
0.004028507973998785,
0.04694920405745506,
0.11177174746990204,
0.007066782098263502,
-0.0015955656999722123,
-0.1745804101228714,
-0.09686177223920822,
0.05618797242641449,
0.11950275301933289,
-0.02344701811671257,
0.08702192455530167,
-0.036034177988767624,
-0.00222861603833735,
0.033891379833221436,
-0.08212029188871384,
-0.1254127323627472,
-0.08919387310743332,
-0.006175142712891102,
-0.001035733730532229,
-0.035287175327539444,
-0.057271558791399,
-0.09059362858533859,
-0.026545029133558273,
0.13421045243740082,
0.021765301004052162,
-0.05449074134230614,
-0.1358126699924469,
0.042165324091911316,
0.1382753551006317,
-0.04615769535303116,
0.017041632905602455,
0.006664033513516188,
0.09895168244838715,
0.04623327776789665,
-0.07705361396074295,
0.06658434867858887,
-0.07468004524707794,
-0.16644848883152008,
-0.058484382927417755,
0.12035899609327316,
0.08305013179779053,
0.056065648794174194,
0.0020989165641367435,
0.03125719726085663,
-0.004959739279001951,
-0.08351418375968933,
0.019943319261074066,
0.042665671557188034,
0.09226073324680328,
0.03482582047581673,
-0.09624581784009933,
0.06798028200864792,
-0.0402526818215847,
-0.011745660565793514,
0.12371168285608292,
0.21733316779136658,
-0.0942799374461174,
0.10632205009460449,
0.07713823020458221,
-0.08080295473337173,
-0.18424302339553833,
0.07164408266544342,
0.12200429290533066,
0.014949592761695385,
0.03964357078075409,
-0.20823802053928375,
0.13128061592578888,
0.10167764127254486,
-0.014983035624027252,
0.04297228902578354,
-0.30148425698280334,
-0.12997093796730042,
0.07578987628221512,
0.10690674185752869,
0.041472747921943665,
-0.12527844309806824,
-0.02177148126065731,
-0.01212741993367672,
-0.12953799962997437,
0.14020897448062897,
-0.08378084748983383,
0.11583811789751053,
-0.006858828943222761,
0.11641643196344376,
0.02668282389640808,
-0.03883525729179382,
0.13395710289478302,
0.06905478239059448,
0.09486362338066101,
-0.043139759451150894,
0.011799671687185764,
0.05425090342760086,
-0.06412547826766968,
0.03703903406858444,
-0.03217121958732605,
0.07008347660303116,
-0.16083800792694092,
-0.00010164255945710465,
-0.09432265162467957,
0.03574617952108383,
-0.051387056708335876,
-0.055664874613285065,
-0.01905347965657711,
0.05456669256091118,
0.06868711113929749,
-0.03982272744178772,
0.028888460248708725,
0.005465441849082708,
0.07650412619113922,
0.08755620568990707,
0.10819634050130844,
-0.03511077165603638,
-0.10107497870922089,
0.018883148208260536,
-0.008981687016785145,
0.05461050197482109,
-0.11083237826824188,
0.02409370429813862,
0.1302688866853714,
0.057620819658041,
0.11528819799423218,
0.029357310384511948,
-0.031121378764510155,
-0.016983186826109886,
0.016040358692407608,
-0.12249204516410828,
-0.11653953045606613,
0.04454527050256729,
-0.03915033116936684,
-0.13810181617736816,
0.008853649720549583,
0.09327717870473862,
-0.029582707211375237,
-0.020046401768922806,
-0.011108380742371082,
0.020898491144180298,
-0.01965218596160412,
0.20108135044574738,
0.04373932629823685,
0.0705147534608841,
-0.10775580257177353,
0.12600497901439667,
0.0488567091524601,
-0.058256588876247406,
0.05046100169420242,
0.06773344427347183,
-0.10440769791603088,
-0.012832462787628174,
0.1213042214512825,
0.16537529230117798,
-0.029032673686742783,
-0.018209589645266533,
-0.08172894269227982,
-0.09241382777690887,
0.06188327446579933,
0.15413798391819,
0.051525238901376724,
-0.016997359693050385,
-0.048040151596069336,
0.027392253279685974,
-0.1258145272731781,
0.07560430467128754,
0.04428136721253395,
0.06255347281694412,
-0.09133567661046982,
0.10935232788324356,
-0.004434647969901562,
0.04272494465112686,
-0.01907859370112419,
0.025945434346795082,
-0.10376594960689545,
-0.010511591099202633,
-0.14518921077251434,
0.000042950727220159024,
-0.0043404595926404,
0.012622218579053879,
-0.02113916166126728,
-0.0525958351790905,
-0.024694157764315605,
0.027265120297670364,
-0.0929407849907875,
-0.049825672060251236,
0.023378755897283554,
0.03795185312628746,
-0.1447650045156479,
-0.015825282782316208,
0.021214600652456284,
-0.0923682153224945,
0.08060538023710251,
0.05981620028614998,
0.012051002122461796,
0.029736993834376335,
-0.10640648007392883,
-0.048418089747428894,
0.003254768205806613,
0.024167295545339584,
0.08787989616394043,
-0.09272709488868713,
-0.021565429866313934,
-0.03202430158853531,
0.04697103053331375,
0.018289344385266304,
0.09650254249572754,
-0.1150139570236206,
0.009252077899873257,
-0.04296417161822319,
-0.045659296214580536,
-0.06217687577009201,
0.04070952534675598,
0.10894962400197983,
0.04622641205787659,
0.16005998849868774,
-0.07283162325620651,
0.035942867398262024,
-0.18515123426914215,
-0.04409317299723625,
-0.00019783545576501638,
-0.04781045764684677,
-0.089623361825943,
-0.047513458877801895,
0.09869877994060516,
-0.05271601676940918,
0.10765702277421951,
-0.0015367576852440834,
0.09891273081302643,
0.031856656074523926,
-0.01053925883024931,
-0.05418418347835541,
0.009674035012722015,
0.15307840704917908,
0.048523444682359695,
-0.01401597447693348,
0.1112157478928566,
0.0016373813850805163,
0.04392760992050171,
0.06717141717672348,
0.21383866667747498,
0.15467025339603424,
0.010985077358782291,
0.04878098890185356,
0.06075999140739441,
-0.1175997406244278,
-0.13538049161434174,
0.13490580022335052,
-0.046894073486328125,
0.12280464917421341,
-0.0725661888718605,
0.21656443178653717,
0.02274814061820507,
-0.18448223173618317,
0.06910232454538345,
-0.06200496479868889,
-0.12283884733915329,
-0.11662614345550537,
-0.02302098274230957,
-0.07031121850013733,
-0.10940223187208176,
0.020973892882466316,
-0.12126318365335464,
0.06290881335735321,
0.12312200665473938,
0.015895798802375793,
0.023632632568478584,
0.16393902897834778,
-0.03521121293306351,
0.020867539569735527,
0.06849190592765808,
0.016614653170108795,
-0.005778952967375517,
-0.061432018876075745,
-0.06385502964258194,
0.044358935207128525,
0.023148665204644203,
0.07124035060405731,
-0.040060579776763916,
0.00010579427180346102,
0.0189600121229887,
-0.015126382932066917,
-0.07324496656656265,
0.0169069766998291,
0.024721600115299225,
0.04577667638659477,
0.058709800243377686,
0.0526089146733284,
0.008934026584029198,
-0.03773978352546692,
0.2896776795387268,
-0.07511822134256363,
-0.09868109971284866,
-0.12927359342575073,
0.22767192125320435,
0.010161683894693851,
-0.022604970261454582,
0.07394198328256607,
-0.09461423754692078,
-0.028720855712890625,
0.1780601441860199,
0.1465608924627304,
-0.10773784667253494,
-0.022445568814873695,
-0.019761577248573303,
-0.011395973153412342,
-0.04372050240635872,
0.13530859351158142,
0.10897859185934067,
-0.015058238990604877,
-0.07941168546676636,
-0.019773365929722786,
-0.018231568858027458,
-0.04789292812347412,
-0.07289446890354156,
0.06225254386663437,
0.026706209406256676,
-0.002863576402887702,
-0.044550493359565735,
0.059154462069272995,
-0.00613563135266304,
-0.23631106317043304,
0.03150159865617752,
-0.1627092957496643,
-0.17792098224163055,
-0.0389614999294281,
0.05879536271095276,
-0.008110597729682922,
0.042267125099897385,
-0.009767197072505951,
0.011196336708962917,
0.13902072608470917,
-0.03257486969232559,
-0.026306506246328354,
-0.1258254200220108,
0.11687542498111725,
-0.11169283837080002,
0.20098569989204407,
0.003090954851359129,
0.07296524196863174,
0.09581317007541656,
0.018854107707738876,
-0.1349237710237503,
0.043626632541418076,
0.07309430092573166,
-0.10456984490156174,
0.01071825623512268,
0.14588652551174164,
-0.04983410984277725,
0.07045318186283112,
0.024248166009783745,
-0.1098906546831131,
-0.007698412984609604,
-0.045944374054670334,
-0.038987670093774796,
-0.07492556422948837,
-0.013607712462544441,
-0.0649360939860344,
0.15914343297481537,
0.22319228947162628,
-0.02451784908771515,
0.017865199595689774,
-0.09329322725534439,
0.014869133941829205,
0.04185793921351433,
0.046432171016931534,
-0.05313287302851677,
-0.2007606327533722,
0.02849491871893406,
0.04201098904013634,
0.018858134746551514,
-0.20820800960063934,
-0.07960961014032364,
0.045181382447481155,
-0.031121762469410896,
-0.05006258562207222,
0.10201449692249298,
0.030038243159651756,
0.044476576149463654,
-0.036008287221193314,
-0.10831131041049957,
-0.037354979664087296,
0.14892543852329254,
-0.15713347494602203,
-0.04044506326317787
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
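A minimal sketch (not the authors' script) of how these values map onto `transformers.TrainingArguments` and `Trainer`. The 32-example sampling seed (taken here from the `seed-8` in the model name), the simplified SQuAD preprocessing, and the output directory are illustrative assumptions; the Adam betas/epsilon listed above are the `TrainingArguments` defaults.
```python
# Sketch only: reproduces the listed hyperparameters, not the exact run.
from datasets import load_dataset
from transformers import (AutoModelForQuestionAnswering, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

# Assumption: the k=32 few-shot subset is drawn with the seed from the name.
train = load_dataset("squad", split="train").shuffle(seed=8).select(range(32))

def preprocess(examples):
    # Simplified SQuAD answer-span alignment (no document stride).
    enc = tokenizer(examples["question"], examples["context"],
                    truncation="only_second", max_length=384,
                    padding="max_length", return_offsets_mapping=True)
    starts, ends = [], []
    for i, offsets in enumerate(enc["offset_mapping"]):
        ans = examples["answers"][i]
        a_start = ans["answer_start"][0]
        a_end = a_start + len(ans["text"][0])
        seq_ids = enc.sequence_ids(i)
        start_tok = end_tok = 0  # fall back to [CLS] if the answer is truncated away
        for t, (o_start, o_end) in enumerate(offsets):
            if seq_ids[t] != 1:  # keep only context tokens
                continue
            if o_start <= a_start < o_end:
                start_tok = t
            if o_start < a_end <= o_end:
                end_tok = t
        starts.append(start_tok)
        ends.append(end_tok)
    enc["start_positions"] = starts
    enc["end_positions"] = ends
    enc.pop("offset_mapping")
    return enc

args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train.map(preprocess, batched=True,
                            remove_columns=train.column_names),
)
trainer.train()
```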
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
53,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10144220292568207,
0.12606041133403778,
-0.0021792296320199966,
0.09042196720838547,
0.13975268602371216,
0.03231623396277428,
0.08792364597320557,
0.1342141330242157,
-0.08417509496212006,
0.05072282999753952,
0.07032017409801483,
0.06917747110128403,
0.044641971588134766,
0.12941628694534302,
-0.042488183826208115,
-0.21258507668972015,
0.005634497385472059,
-0.01165358629077673,
-0.06302978098392487,
0.10300841182470322,
0.08866897225379944,
-0.11523731052875519,
0.06705552339553833,
-0.02150716260075569,
-0.1560717225074768,
0.00805183220654726,
-0.031247219070792198,
-0.02571709081530571,
0.10626817494630814,
-0.017034368589520454,
0.09398388862609863,
0.017710385844111443,
0.1387321501970291,
-0.2153979390859604,
0.0009262315579690039,
0.07631865888834,
0.04711303859949112,
0.08680234104394913,
0.039203617721796036,
0.023387055844068527,
0.03784406930208206,
-0.14924375712871552,
0.09586330503225327,
0.024274593219161034,
-0.08739788085222244,
-0.13703574240207672,
-0.09061586856842041,
0.021379824727773666,
0.08190151304006577,
0.07849331200122833,
0.008305384777486324,
0.13969972729682922,
-0.10267264395952225,
0.08122430741786957,
0.19919154047966003,
-0.28609174489974976,
-0.06539701670408249,
0.04352840781211853,
0.0542963370680809,
0.074124276638031,
-0.11544991284608841,
-0.015145765617489815,
0.020465347915887833,
0.0319938138127327,
0.11128140985965729,
-0.021435635164380074,
-0.129635289311409,
0.005983553361147642,
-0.12777575850486755,
-0.01229735929518938,
0.10483256727457047,
0.036496877670288086,
-0.04733365401625633,
-0.07733747363090515,
-0.061737142503261566,
-0.0868062973022461,
-0.03503180667757988,
-0.017398888245224953,
0.05490000545978546,
-0.05621665343642235,
-0.06259816884994507,
-0.04594334214925766,
-0.055986978113651276,
-0.0873631089925766,
0.00031043426133692265,
0.13756921887397766,
0.04032488912343979,
0.020773127675056458,
-0.038586828857660294,
0.11317190527915955,
-0.001393242389895022,
-0.12884771823883057,
-0.003635964123532176,
0.003580667544156313,
-0.11199186742305756,
-0.05493725463747978,
-0.037083808332681656,
0.010686913505196571,
0.01596982218325138,
0.14973172545433044,
-0.04280282184481621,
0.07562194019556046,
0.01610334776341915,
-0.016684573143720627,
-0.017175476998090744,
0.15275627374649048,
-0.03363868594169617,
-0.04264698177576065,
-0.0005156880361028016,
0.0999607965350151,
0.0005494164652191103,
-0.007277257740497589,
-0.08132638037204742,
-0.01681698113679886,
0.07489843666553497,
0.0679742842912674,
-0.0544218011200428,
0.03597719594836235,
-0.045633744448423386,
-0.02400612272322178,
0.02858186513185501,
-0.12508192658424377,
0.03851472586393356,
0.005350345280021429,
-0.08438272029161453,
-0.06022319197654724,
0.007451850920915604,
-0.008799034170806408,
-0.02650444209575653,
0.0805865004658699,
-0.07260987162590027,
-0.00012524799967650324,
-0.08619198203086853,
-0.0797046422958374,
0.0010542678646743298,
-0.14678217470645905,
-0.010221533477306366,
-0.04756306856870651,
-0.19387675821781158,
-0.032778285443782806,
0.050916437059640884,
-0.07795768231153488,
-0.0392071008682251,
-0.05095621198415756,
-0.07537565380334854,
0.010342170484364033,
-0.007582666352391243,
0.18885333836078644,
-0.060729995369911194,
0.07914135605096817,
-0.013800949789583683,
0.04930158331990242,
0.01456707064062357,
0.04912497475743294,
-0.08422920852899551,
0.02573929913341999,
-0.15004605054855347,
0.0817481055855751,
-0.09955727308988571,
0.02071448229253292,
-0.12281974405050278,
-0.08807928115129471,
0.05345727130770683,
-0.021211616694927216,
0.0788729190826416,
0.13773390650749207,
-0.19966678321361542,
0.0015927014173939824,
0.11563048511743546,
-0.04094218090176582,
-0.05508369952440262,
0.07732005417346954,
-0.05876002088189125,
0.045167434960603714,
0.05366334319114685,
0.18929705023765564,
0.07290519028902054,
-0.14572502672672272,
0.01807500049471855,
0.034878287464380264,
0.05764520540833473,
0.004174611996859312,
0.035763587802648544,
-0.0008892834885045886,
0.0261766966432333,
0.008355237543582916,
-0.08308733999729156,
-0.022934675216674805,
-0.09249495714902878,
-0.07368648797273636,
-0.05117533355951309,
-0.09755897521972656,
0.031320732086896896,
0.014575808309018612,
0.023190833628177643,
-0.07195554673671722,
-0.098538838326931,
0.10521690547466278,
0.11914368718862534,
-0.05109883472323418,
0.01641741581261158,
-0.07733632624149323,
0.02638240158557892,
-0.016517942771315575,
-0.029400555416941643,
-0.20040133595466614,
-0.11891689151525497,
0.05163784325122833,
-0.04673410579562187,
0.02593207359313965,
0.022319797426462173,
0.06557941436767578,
0.05535608530044556,
-0.04273689165711403,
-0.017935408279299736,
-0.06670712679624557,
0.0032851265277713537,
-0.11646552383899689,
-0.20520621538162231,
-0.0669328048825264,
-0.03730078414082527,
0.13227525353431702,
-0.1933799386024475,
0.002955083269625902,
-0.028447117656469345,
0.11126432567834854,
0.016478465870022774,
-0.05074397101998329,
0.01219271682202816,
0.03929126262664795,
0.019024373963475227,
-0.09550510346889496,
0.057064566761255264,
0.011980051174759865,
-0.07020583003759384,
-0.04485823214054108,
-0.11012689769268036,
-0.002157654380425811,
0.06071493774652481,
0.07096756994724274,
-0.09886484593153,
-0.016635209321975708,
-0.052638228982686996,
-0.03688379377126694,
-0.07367295771837234,
0.02374311350286007,
0.20434455573558807,
0.02950003184378147,
0.12479043751955032,
-0.06559792906045914,
-0.07906051725149155,
-0.0020145494490861893,
0.016367480158805847,
0.034146592020988464,
0.0998350977897644,
0.07765866070985794,
-0.07779119163751602,
0.07831171154975891,
0.09144659340381622,
-0.037705015391111374,
0.1167430430650711,
-0.053396303206682205,
-0.0753694400191307,
-0.01662517711520195,
-0.0022665669675916433,
-0.029132559895515442,
0.14987175166606903,
-0.0855148434638977,
0.0037615185137838125,
0.03870344161987305,
0.024418668821454048,
0.0133367283269763,
-0.17912766337394714,
-0.007586727850139141,
0.014232931658625603,
-0.06199098005890846,
-0.05698513239622116,
-0.026883583515882492,
0.03599561005830765,
0.09716164320707321,
0.030643247067928314,
-0.031878761947155,
0.021437257528305054,
-0.012166737578809261,
-0.06020653247833252,
0.19038385152816772,
-0.12003299593925476,
-0.09040739387273788,
-0.08649247884750366,
0.022564830258488655,
-0.03830002248287201,
-0.035832423716783524,
0.00812172144651413,
-0.09706118702888489,
-0.05332287400960922,
-0.08558542281389236,
-0.02654339000582695,
-0.006282552611082792,
-0.006164464633911848,
0.030560748651623726,
-0.013299237936735153,
0.07924652099609375,
-0.13292016088962555,
0.0014750491827726364,
-0.03556758910417557,
-0.09447227418422699,
0.01922507956624031,
0.06723641604185104,
0.08323656022548676,
0.10292240232229233,
-0.01438967790454626,
0.01494626235216856,
-0.026997990906238556,
0.23769454658031464,
-0.06563108414411545,
0.015915628522634506,
0.09050846844911575,
-0.0011315308511257172,
0.05426228791475296,
0.13644561171531677,
0.03685199096798897,
-0.11428641527891159,
0.026787899434566498,
0.07651852071285248,
-0.01884118653833866,
-0.249144047498703,
-0.031249037012457848,
-0.019850263372063637,
-0.08210175484418869,
0.08215905725955963,
0.035533156245946884,
-0.0549081414937973,
0.03455895930528641,
0.02045363187789917,
-0.003628107253462076,
-0.036136776208877563,
0.06544788926839828,
0.08563458919525146,
0.04118536412715912,
0.10068085044622421,
-0.024535031989216805,
-0.011243246495723724,
0.06412559747695923,
0.019715040922164917,
0.261646568775177,
-0.03920561075210571,
0.12407214194536209,
0.03614617884159088,
0.1456894725561142,
-0.027851279824972153,
0.05209507793188095,
0.00899876095354557,
-0.00658459821715951,
-0.005698167718946934,
-0.04923057556152344,
-0.019504457712173462,
-0.0010865264339372516,
-0.035450562834739685,
0.022830281406641006,
-0.07704319059848785,
0.029363706707954407,
0.025898030027747154,
0.30465152859687805,
0.0404060035943985,
-0.26631906628608704,
-0.06809592992067337,
0.002362680621445179,
-0.04200637340545654,
-0.07761115580797195,
0.005409987177699804,
0.14011961221694946,
-0.12484662979841232,
0.034263238310813904,
-0.04991942271590233,
0.092899851500988,
-0.03573162481188774,
0.011524160392582417,
0.06624175608158112,
0.1455024629831314,
-0.01121499389410019,
0.0707356184720993,
-0.2149169147014618,
0.23286092281341553,
0.02673470973968506,
0.11106042563915253,
-0.06382182240486145,
0.010030310600996017,
0.0037412643432617188,
0.048682864755392075,
0.11093386262655258,
0.007597099989652634,
-0.0002785289252642542,
-0.17640508711338043,
-0.09740886837244034,
0.05628668889403343,
0.11813314259052277,
-0.02151484042406082,
0.08685072511434555,
-0.036165934056043625,
-0.0019726608879864216,
0.03415529802441597,
-0.08132398873567581,
-0.1250895857810974,
-0.08952074497938156,
-0.006575750187039375,
0.0005041866097599268,
-0.035066165030002594,
-0.05715266987681389,
-0.09023251384496689,
-0.028174065053462982,
0.1342485249042511,
0.022291146218776703,
-0.05450151860713959,
-0.13563106954097748,
0.04266609624028206,
0.13749517500400543,
-0.04622825235128403,
0.016875578090548515,
0.006748109124600887,
0.09876014292240143,
0.04587845504283905,
-0.0762089267373085,
0.06656120717525482,
-0.07450494170188904,
-0.1655513495206833,
-0.059046000242233276,
0.1191653311252594,
0.08283092081546783,
0.05615686625242233,
0.0022745300084352493,
0.03086644411087036,
-0.005109649617224932,
-0.08337388932704926,
0.018604852259159088,
0.04363839700818062,
0.09134052693843842,
0.03475618362426758,
-0.09614256024360657,
0.06881260126829147,
-0.039681948721408844,
-0.011617782525718212,
0.12468784302473068,
0.2171528935432434,
-0.09453247487545013,
0.10509498417377472,
0.0780361071228981,
-0.08067847788333893,
-0.1838492751121521,
0.07124356925487518,
0.12154386937618256,
0.015347760170698166,
0.039434514939785004,
-0.20744797587394714,
0.13122418522834778,
0.1029878556728363,
-0.014515637420117855,
0.04290315508842468,
-0.3013918101787567,
-0.1301630288362503,
0.07670971751213074,
0.10672008991241455,
0.04427943378686905,
-0.12570489943027496,
-0.0216707531362772,
-0.012732942588627338,
-0.13099779188632965,
0.1389435976743698,
-0.08297130465507507,
0.115740567445755,
-0.006807747762650251,
0.11495044082403183,
0.02687271498143673,
-0.03906890004873276,
0.13448773324489594,
0.06960180401802063,
0.09481970220804214,
-0.043173111975193024,
0.011844843626022339,
0.05387372151017189,
-0.06424225121736526,
0.037780750542879105,
-0.03173457086086273,
0.07007192075252533,
-0.1630251556634903,
-0.0003366177552379668,
-0.0929078459739685,
0.03597741574048996,
-0.050949472934007645,
-0.05600006505846977,
-0.01899442821741104,
0.05345989018678665,
0.06842677295207977,
-0.03968806192278862,
0.029444660991430283,
0.005951773840934038,
0.0753851905465126,
0.08932168036699295,
0.10662665218114853,
-0.03839932754635811,
-0.10095266252756119,
0.01902780309319496,
-0.00910947285592556,
0.055296894162893295,
-0.11018925905227661,
0.024399952962994576,
0.13073652982711792,
0.0577264241874218,
0.11555159837007523,
0.028278889134526253,
-0.030483892187476158,
-0.016642646864056587,
0.015444890595972538,
-0.12251659482717514,
-0.11546587198972702,
0.0439193993806839,
-0.039507776498794556,
-0.1375783085823059,
0.007798520848155022,
0.09404310584068298,
-0.030631065368652344,
-0.019389541819691658,
-0.011535892263054848,
0.01950918510556221,
-0.019612479954957962,
0.20072060823440552,
0.04417601600289345,
0.07027208805084229,
-0.10745446383953094,
0.12542597949504852,
0.049184415489435196,
-0.05752343684434891,
0.05083918571472168,
0.06695841997861862,
-0.1048089861869812,
-0.012789416126906872,
0.12098808586597443,
0.16535814106464386,
-0.030159451067447662,
-0.019344503059983253,
-0.08245856314897537,
-0.0910029485821724,
0.06161094084382057,
0.15298691391944885,
0.0519549660384655,
-0.01726013422012329,
-0.04847455397248268,
0.0266439076513052,
-0.12588663399219513,
0.07493061572313309,
0.044247545301914215,
0.06283465772867203,
-0.09158855676651001,
0.11148276180028915,
-0.00404019933193922,
0.04265740141272545,
-0.019065016880631447,
0.025738123804330826,
-0.10354434698820114,
-0.010251100175082684,
-0.1472371220588684,
0.0004354696429800242,
-0.0041802506893873215,
0.01299790944904089,
-0.021101055666804314,
-0.052159011363983154,
-0.024831127375364304,
0.027083972468972206,
-0.09216298162937164,
-0.049425143748521805,
0.023720595985651016,
0.03742413595318794,
-0.14400693774223328,
-0.016033995896577835,
0.020811114460229874,
-0.09225837886333466,
0.08075933158397675,
0.05934755131602287,
0.011646587401628494,
0.029182234779000282,
-0.10637663304805756,
-0.04834016412496567,
0.00328949186950922,
0.025141729041934013,
0.08793123811483383,
-0.09070585668087006,
-0.020699143409729004,
-0.0315885990858078,
0.046916376799345016,
0.018044445663690567,
0.09697521477937698,
-0.11522269994020462,
0.008895781822502613,
-0.04234340414404869,
-0.045283157378435135,
-0.06283619999885559,
0.040432512760162354,
0.10873615741729736,
0.045940008014440536,
0.15987905859947205,
-0.07289019227027893,
0.035603590309619904,
-0.18521495163440704,
-0.04432275891304016,
0.00007320403528865427,
-0.04736955091357231,
-0.08928364515304565,
-0.04834206402301788,
0.09852497279644012,
-0.05207255110144615,
0.10930557548999786,
-0.0016846321523189545,
0.09931956231594086,
0.031285371631383896,
-0.011675124987959862,
-0.054069634526968,
0.00885610282421112,
0.15405219793319702,
0.04955892264842987,
-0.013983708806335926,
0.11008714884519577,
0.001828257692977786,
0.04525415971875191,
0.06691353768110275,
0.21136240661144257,
0.15455754101276398,
0.01039635855704546,
0.04914455488324165,
0.06029805541038513,
-0.1171756237745285,
-0.1356806606054306,
0.1342892348766327,
-0.04725823178887367,
0.12343063950538635,
-0.0723663866519928,
0.2167656570672989,
0.022664014250040054,
-0.18406042456626892,
0.06870043277740479,
-0.06147871911525726,
-0.12308130413293839,
-0.1164097711443901,
-0.022349918261170387,
-0.07031869143247604,
-0.10934575647115707,
0.020518064498901367,
-0.12074104696512222,
0.06263981759548187,
0.12379920482635498,
0.015602867119014263,
0.02338976226747036,
0.16315403580665588,
-0.03549828380346298,
0.021229729056358337,
0.06830858439207077,
0.016554495319724083,
-0.005335576832294464,
-0.06261680275201797,
-0.06514368951320648,
0.044622745364904404,
0.02317485399544239,
0.07187381386756897,
-0.04029706493020058,
0.0020946937147527933,
0.019490865990519524,
-0.014548500068485737,
-0.07356590777635574,
0.016769016161561012,
0.02403051033616066,
0.04564681276679039,
0.057375963777303696,
0.05295007303357124,
0.009290207177400589,
-0.037872955203056335,
0.2878393828868866,
-0.07457678765058517,
-0.09910968691110611,
-0.12942112982273102,
0.22642560303211212,
0.010419580154120922,
-0.022220991551876068,
0.07395975291728973,
-0.09458575397729874,
-0.0280169490724802,
0.17844964563846588,
0.14537928998470306,
-0.10868000984191895,
-0.022581415250897408,
-0.019373612478375435,
-0.01158006489276886,
-0.04490998759865761,
0.13574674725532532,
0.10917732119560242,
-0.016204969957470894,
-0.07854851335287094,
-0.02040555700659752,
-0.018513042479753494,
-0.04772417992353439,
-0.0737454891204834,
0.061208900064229965,
0.026890767738223076,
-0.002101150108501315,
-0.04431559145450592,
0.058220282196998596,
-0.005126279313117266,
-0.23694656789302826,
0.031865134835243225,
-0.1627446413040161,
-0.17782677710056305,
-0.038580521941185,
0.058969564735889435,
-0.007935626432299614,
0.041825518012046814,
-0.010013271123170853,
0.01171934511512518,
0.13988587260246277,
-0.033096302300691605,
-0.02673536166548729,
-0.1246478334069252,
0.11592483520507812,
-0.11013280600309372,
0.20073041319847107,
0.003261303063482046,
0.07366795092821121,
0.09561944752931595,
0.019454840570688248,
-0.1340591013431549,
0.04401428624987602,
0.07276837527751923,
-0.10357986390590668,
0.010314544662833214,
0.1443014144897461,
-0.04980545863509178,
0.07053174823522568,
0.02444741502404213,
-0.10975360125303268,
-0.008410735055804253,
-0.04692478105425835,
-0.0390673466026783,
-0.07467864453792572,
-0.015338019467890263,
-0.064491406083107,
0.15937989950180054,
0.22236473858356476,
-0.02425331622362137,
0.0173280518501997,
-0.09331300109624863,
0.01469774916768074,
0.04241566359996796,
0.04627872630953789,
-0.05319893732666969,
-0.20066474378108978,
0.029041381552815437,
0.04153512790799141,
0.019002944231033325,
-0.20775164663791656,
-0.0800153911113739,
0.0448782742023468,
-0.03148364648222923,
-0.04973485320806503,
0.10211692005395889,
0.029963910579681396,
0.04484405368566513,
-0.03607482090592384,
-0.1083478033542633,
-0.037855375558137894,
0.1488010436296463,
-0.15696027874946594,
-0.040981438010931015
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
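The card itself gives no usage guidance. If the checkpoint is published under the repository id shown in this row, the standard `question-answering` pipeline should load it; a minimal sketch (the question/context strings are illustrative):
```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0",
)
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of bert-base-uncased "
            "on the squad dataset.",
)
print(result["answer"], result["score"])
```
Note that a model fine-tuned on only 512 examples is a few-shot baseline, so answer quality should be expected to fall well below a fully fine-tuned SQuAD model.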
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09941526502370834,
0.15223988890647888,
-0.002179159317165613,
0.0913267731666565,
0.13749800622463226,
0.039959866553545,
0.08920580893754959,
0.13614344596862793,
-0.08005421608686447,
0.0744950994849205,
0.07856578379869461,
0.054563313722610474,
0.05285842716693878,
0.12614287436008453,
-0.04793182387948036,
-0.20833860337734222,
0.013592449948191643,
-0.01901666633784771,
-0.05571480467915535,
0.09817498177289963,
0.08209393173456192,
-0.10510200262069702,
0.08067986369132996,
-0.019440677016973495,
-0.1519656628370285,
0.009139960631728172,
-0.03854852169752121,
-0.023866156116127968,
0.09475354105234146,
-0.0067938040010631084,
0.09739191830158234,
0.00978887639939785,
0.13858206570148468,
-0.21218597888946533,
-0.00016220154066104442,
0.07932575047016144,
0.03623782470822334,
0.08994295448064804,
0.04709337279200554,
0.024312609806656837,
0.050366710871458054,
-0.15324346721172333,
0.09039080888032913,
0.0267360620200634,
-0.07740745693445206,
-0.09147169440984726,
-0.09209047257900238,
0.019949806854128838,
0.07795243710279465,
0.08355816453695297,
0.009128999896347523,
0.13451142609119415,
-0.09057335555553436,
0.08258146047592163,
0.20028585195541382,
-0.2727622091770172,
-0.06624267995357513,
0.046740829944610596,
0.05240851268172264,
0.06618179380893707,
-0.118401437997818,
-0.027475759387016296,
0.028994567692279816,
0.03173396363854408,
0.09394770860671997,
-0.0189531147480011,
-0.13864149153232574,
0.0015350790927186608,
-0.13196930289268494,
-0.01410258561372757,
0.10965077579021454,
0.04825140908360481,
-0.04425780102610588,
-0.06489051133394241,
-0.07664613425731659,
-0.10878723114728928,
-0.023735007271170616,
-0.020724797621369362,
0.05280468612909317,
-0.05465317517518997,
-0.056020475924015045,
-0.038888461887836456,
-0.05890020728111267,
-0.08740998804569244,
-0.004366336390376091,
0.11902789026498795,
0.043194036930799484,
0.024955395609140396,
-0.037026822566986084,
0.10275159776210785,
0.000565594295039773,
-0.13360615074634552,
-0.01773567497730255,
0.008619713596999645,
-0.12248922139406204,
-0.05770624428987503,
-0.026435382664203644,
0.010228652507066727,
0.016360916197299957,
0.14936314523220062,
-0.04004120081663132,
0.08189994096755981,
0.015311949886381626,
-0.020280521363019943,
-0.010904679074883461,
0.14446544647216797,
-0.039370328187942505,
-0.05467728525400162,
0.0054671564139425755,
0.09619999676942825,
0.0006609840202145278,
-0.005753365810960531,
-0.07368934899568558,
-0.01906224526464939,
0.09251516312360764,
0.055833350867033005,
-0.052260104566812515,
0.033479444682598114,
-0.03214659169316292,
-0.025708982720971107,
0.0237593911588192,
-0.1236988976597786,
0.03721006214618683,
0.010143492370843887,
-0.08360117673873901,
-0.0366365946829319,
0.012699163518846035,
-0.017106207087635994,
-0.021078329533338547,
0.09125426411628723,
-0.08751461654901505,
-0.005739325191825628,
-0.07780822366476059,
-0.07385499775409698,
0.015713674947619438,
-0.14134502410888672,
-0.01526591181755066,
-0.04901353642344475,
-0.2031131237745285,
-0.037963852286338806,
0.04391082376241684,
-0.07630263268947601,
-0.039979368448257446,
-0.0572483204305172,
-0.08397504687309265,
0.01596413366496563,
-0.0043383934535086155,
0.16661451756954193,
-0.06071765720844269,
0.07706713676452637,
-0.014844200573861599,
0.04104131460189819,
0.01005041878670454,
0.04885638505220413,
-0.08767454326152802,
0.02274273708462715,
-0.14269480109214783,
0.07786483317613602,
-0.09597216546535492,
0.011090160347521305,
-0.12406419962644577,
-0.08793526142835617,
0.039905209094285965,
-0.028189605101943016,
0.07043255120515823,
0.14391876757144928,
-0.19383759796619415,
0.0012386677553877234,
0.11529127508401871,
-0.04891609027981758,
-0.04816652461886406,
0.07755699753761292,
-0.056014999747276306,
0.029082704335451126,
0.05156761780381203,
0.1668732911348343,
0.07399710267782211,
-0.15183687210083008,
-0.018096331506967545,
0.024311523884534836,
0.04920940101146698,
0.005230691749602556,
0.04471135884523392,
0.004695075564086437,
0.02692885510623455,
0.003572773654013872,
-0.07689802348613739,
-0.025728052482008934,
-0.08998031914234161,
-0.07102695107460022,
-0.0527958869934082,
-0.0884854793548584,
0.021457387134432793,
0.01228523999452591,
0.02074521966278553,
-0.05789691582322121,
-0.10211042314767838,
0.10316547006368637,
0.1281958371400833,
-0.05184432491660118,
0.007495982572436333,
-0.06903542578220367,
0.020665299147367477,
-0.029699811711907387,
-0.03598834201693535,
-0.19500651955604553,
-0.13589021563529968,
0.05296456813812256,
-0.054121021181344986,
0.034362152218818665,
0.032803263515233994,
0.0642322227358818,
0.062262002378702164,
-0.03330491483211517,
-0.01844719611108303,
-0.06993676722049713,
-0.0028302189894020557,
-0.11340732127428055,
-0.1994028091430664,
-0.06365232914686203,
-0.032741133123636246,
0.1406167447566986,
-0.20180781185626984,
-0.00026725840871222317,
-0.024663645774126053,
0.12200839817523956,
0.017523325979709625,
-0.058645013719797134,
0.00883437693119049,
0.029871473088860512,
0.005619209725409746,
-0.09966167062520981,
0.04050171375274658,
0.007900271564722061,
-0.05484101548790932,
-0.06059255823493004,
-0.10984697192907333,
-0.003702482208609581,
0.05904163047671318,
0.07939061522483826,
-0.10472561419010162,
-0.00738937221467495,
-0.05029013380408287,
-0.045521631836891174,
-0.08808885514736176,
0.00926819909363985,
0.19678261876106262,
0.03376780077815056,
0.12076833844184875,
-0.06078872084617615,
-0.07119730859994888,
0.0012951225508004427,
0.024156639352440834,
0.02233974263072014,
0.08987540751695633,
0.10671192407608032,
-0.0981888398528099,
0.08918775618076324,
0.06980106234550476,
-0.04739608243107796,
0.12056245654821396,
-0.044490691274404526,
-0.08205025643110275,
-0.019656119868159294,
0.004588006995618343,
-0.03332435339689255,
0.1416999101638794,
-0.08396051824092865,
0.011868826113641262,
0.0344419926404953,
0.025088191032409668,
0.01786101795732975,
-0.16877421736717224,
-0.00031992720323614776,
0.011811871081590652,
-0.0638483315706253,
-0.03944023326039314,
-0.024453165009617805,
0.03716946020722389,
0.09420370310544968,
0.026485148817300797,
-0.04275044798851013,
0.01621195673942566,
-0.012096312828361988,
-0.06960111856460571,
0.1916259229183197,
-0.1094459593296051,
-0.09162057191133499,
-0.10890022665262222,
0.027790110558271408,
-0.04536056146025658,
-0.036671724170446396,
0.005062581971287727,
-0.08252938091754913,
-0.05623335391283035,
-0.08891036361455917,
-0.02407977730035782,
-0.01610434055328369,
-0.0023581325076520443,
0.023899849504232407,
-0.015664147213101387,
0.0720226988196373,
-0.1325749009847641,
0.004169304855167866,
-0.029444003477692604,
-0.10232721269130707,
0.007739291992038488,
0.06367358565330505,
0.08526872098445892,
0.09946291893720627,
-0.013454955071210861,
0.013192190788686275,
-0.027945393696427345,
0.23901596665382385,
-0.056046873331069946,
0.016058575361967087,
0.08308052271604538,
-0.00634074816480279,
0.06049178168177605,
0.14343209564685822,
0.03182994946837425,
-0.10415538400411606,
0.025661388412117958,
0.08296659588813782,
-0.011130424216389656,
-0.24839958548545837,
-0.028727704659104347,
-0.022096484899520874,
-0.0705813467502594,
0.0880584865808487,
0.039721038192510605,
-0.044129878282547,
0.037695400416851044,
0.009433873929083347,
0.012834975495934486,
-0.05198964849114418,
0.07377691566944122,
0.08532468229532242,
0.04485032334923744,
0.09435269236564636,
-0.02216145023703575,
-0.010854261927306652,
0.06617498397827148,
0.015684377402067184,
0.26909056305885315,
-0.028944212943315506,
0.1266472190618515,
0.02575601264834404,
0.14713822305202484,
-0.02993549406528473,
0.04742380231618881,
0.015977943316102028,
0.005423222202807665,
-0.005773315206170082,
-0.05335817486047745,
-0.029925426468253136,
0.01000084076076746,
-0.03339475765824318,
0.0330144464969635,
-0.07241883873939514,
0.04466143622994423,
0.014299692586064339,
0.30001410841941833,
0.04490236937999725,
-0.28221219778060913,
-0.06140453368425369,
-0.0026035059709101915,
-0.04526810720562935,
-0.07499547302722931,
0.0032095059286803007,
0.14422057569026947,
-0.1255587786436081,
0.04000641033053398,
-0.05444671958684921,
0.08886795490980148,
-0.04728942736983299,
0.002040897263213992,
0.06494122743606567,
0.1454940140247345,
-0.010563316754996777,
0.06780153512954712,
-0.18990816175937653,
0.22614197432994843,
0.02907489240169525,
0.11094757914543152,
-0.06180788576602936,
0.013831517659127712,
0.010352159850299358,
0.027748046442866325,
0.10719691216945648,
-0.000029966131478431635,
-0.023834455758333206,
-0.16916853189468384,
-0.11569492518901825,
0.058259524405002594,
0.11615569144487381,
-0.016060905531048775,
0.09577427059412003,
-0.04158448055386543,
-0.0025687848683446646,
0.036550600081682205,
-0.07406723499298096,
-0.12877163290977478,
-0.0859781801700592,
0.0033267727121710777,
0.019347095862030983,
-0.03800387308001518,
-0.048496443778276443,
-0.09165467321872711,
-0.024200987070798874,
0.13571399450302124,
-0.003875290509313345,
-0.04341036453843117,
-0.13558891415596008,
0.0532117523252964,
0.14309155941009521,
-0.05778728798031807,
0.02013981342315674,
0.0039054309017956257,
0.10195297002792358,
0.05151650309562683,
-0.08343532681465149,
0.05232126638293266,
-0.06701157987117767,
-0.1604679375886917,
-0.062000419944524765,
0.11576095223426819,
0.07928947359323502,
0.05342135578393936,
-0.002489128615707159,
0.03279673680663109,
0.000682310201227665,
-0.086336649954319,
0.007497534155845642,
0.056726448237895966,
0.09000878781080246,
0.05053718760609627,
-0.09322959929704666,
0.04535873606801033,
-0.03425710275769234,
-0.003172459313645959,
0.12612970173358917,
0.21531735360622406,
-0.08576030284166336,
0.08893122524023056,
0.0692865177989006,
-0.07931508868932724,
-0.17650744318962097,
0.07089624553918839,
0.13167931139469147,
0.015451831743121147,
0.035745780915021896,
-0.20389613509178162,
0.13628043234348297,
0.1135968565940857,
-0.015439056791365147,
0.05631833150982857,
-0.29663801193237305,
-0.12411943078041077,
0.0700254812836647,
0.10379864275455475,
0.04295116290450096,
-0.12949256598949432,
-0.027690280228853226,
-0.010651328600943089,
-0.14884324371814728,
0.1461799144744873,
-0.07203696668148041,
0.12186673283576965,
-0.008354616351425648,
0.12346691638231277,
0.022797631099820137,
-0.04155924171209335,
0.1318701207637787,
0.0747295692563057,
0.08760105818510056,
-0.039506345987319946,
-0.002743842313066125,
0.04913035035133362,
-0.0671236515045166,
0.048201315104961395,
-0.04124125838279724,
0.06582514941692352,
-0.16524776816368103,
0.0009824370499700308,
-0.08322323113679886,
0.04433681443333626,
-0.04826401174068451,
-0.04740411043167114,
-0.02903127670288086,
0.04718456417322159,
0.06728606671094894,
-0.03493841364979744,
0.03204341232776642,
0.02095925807952881,
0.05718466639518738,
0.09228786081075668,
0.08919750899076462,
-0.019504409283399582,
-0.10968121886253357,
0.011408698745071888,
-0.006649184972047806,
0.05351962149143219,
-0.10034997016191483,
0.016232114285230637,
0.1352083832025528,
0.05833347514271736,
0.12674906849861145,
0.025768470019102097,
-0.033144500106573105,
-0.015598490834236145,
0.01639687828719616,
-0.1256825029850006,
-0.11671841144561768,
0.03816775232553482,
-0.043587032705545425,
-0.1554497927427292,
0.006262763869017363,
0.10219528526067734,
-0.03699866682291031,
-0.01261820737272501,
-0.01000900287181139,
0.024703029543161392,
-0.012336711399257183,
0.20148424804210663,
0.04084612429141998,
0.06214427202939987,
-0.10241727530956268,
0.1258758157491684,
0.057465825229883194,
-0.045717377215623856,
0.0567864365875721,
0.06752996146678925,
-0.09309926629066467,
-0.0052770632319152355,
0.10884062200784683,
0.1724693775177002,
-0.046054158359766006,
-0.01963474601507187,
-0.07210318744182587,
-0.07506605237722397,
0.05754825472831726,
0.16294114291667938,
0.04896555840969086,
-0.006311656907200813,
-0.04355774074792862,
0.02587798610329628,
-0.1253710687160492,
0.07346475124359131,
0.04761701449751854,
0.06779567152261734,
-0.1063326746225357,
0.11072100698947906,
-0.008983266539871693,
0.03359153866767883,
-0.01564883068203926,
0.028582919389009476,
-0.09803533554077148,
-0.02549617365002632,
-0.12318349629640579,
0.019891338422894478,
-0.010816339403390884,
0.007428344339132309,
-0.010592043399810791,
-0.054824523627758026,
-0.03818647190928459,
0.02568547986447811,
-0.0810096263885498,
-0.054864924401044846,
0.016550278291106224,
0.04516798257827759,
-0.15625837445259094,
-0.012202178128063679,
0.023797370493412018,
-0.09331075102090836,
0.0764167383313179,
0.06577790528535843,
0.01717323623597622,
0.02720281295478344,
-0.10933323949575424,
-0.04561813175678253,
0.012874769978225231,
0.02825341932475567,
0.08435852080583572,
-0.08797674626111984,
-0.014849362894892693,
-0.03369668126106262,
0.048735857009887695,
0.015322345308959484,
0.09012243896722794,
-0.1157318577170372,
-0.0034617844503372908,
-0.05588137358427048,
-0.033916864544153214,
-0.05924026295542717,
0.034244269132614136,
0.11657381057739258,
0.0352935828268528,
0.16833819448947906,
-0.06778517365455627,
0.038980498909950256,
-0.19380603730678558,
-0.0333947017788887,
0.002046803943812847,
-0.04093324765563011,
-0.08412955701351166,
-0.04184312745928764,
0.09247542172670364,
-0.051570963114500046,
0.09301196038722992,
-0.007256359327584505,
0.08915497362613678,
0.032221175730228424,
-0.0062035066075623035,
-0.057732321321964264,
0.0016461829654872417,
0.14755843579769135,
0.05939650535583496,
-0.018801482394337654,
0.10239774733781815,
-0.007233870681375265,
0.0551518015563488,
0.04248862713575363,
0.22735385596752167,
0.147431418299675,
-0.021512677893042564,
0.0640985295176506,
0.06981911510229111,
-0.12177982181310654,
-0.1251949965953827,
0.13959258794784546,
-0.04720662906765938,
0.12290347367525101,
-0.055129583925008774,
0.22081837058067322,
0.02375810220837593,
-0.1766694337129593,
0.05227532610297203,
-0.05584150552749634,
-0.119635671377182,
-0.12020082026720047,
-0.015080612152814865,
-0.08003398776054382,
-0.09958119690418243,
0.02528679184615612,
-0.12121669948101044,
0.06570269167423248,
0.117780402302742,
0.016317203640937805,
0.021486766636371613,
0.15373152494430542,
-0.042795468121767044,
0.017678100615739822,
0.0600510835647583,
0.024538759142160416,
-0.006632994394749403,
-0.06212194636464119,
-0.06458031386137009,
0.049871835857629776,
0.03490598872303963,
0.08546072989702225,
-0.04537927359342575,
0.017812810838222504,
0.032692801207304,
-0.014122534543275833,
-0.07415901869535446,
0.0125349722802639,
0.02567332051694393,
0.03873903304338455,
0.05680897831916809,
0.05341317132115364,
0.017893942072987556,
-0.035580553114414215,
0.26743966341018677,
-0.07260844111442566,
-0.08425816148519516,
-0.131089985370636,
0.19929231703281403,
0.016856128349900246,
-0.019063347950577736,
0.07480539381504059,
-0.1044238805770874,
-0.023717602714896202,
0.16788426041603088,
0.12468919903039932,
-0.10188432037830353,
-0.02912547066807747,
-0.01644952967762947,
-0.011629467830061913,
-0.03764792159199715,
0.11606327444314957,
0.09298653155565262,
0.009439806453883648,
-0.07607261836528778,
-0.026773883029818535,
-0.01689523458480835,
-0.04459168016910553,
-0.06373999267816544,
0.03718294948339462,
0.014482411555945873,
0.001099799876101315,
-0.03934725746512413,
0.0522322840988636,
-0.012157535180449486,
-0.24338872730731964,
0.03200064226984978,
-0.15807349979877472,
-0.18101777136325836,
-0.031049834564328194,
0.0649448111653328,
-0.006498979404568672,
0.03885664790868759,
-0.017859967425465584,
0.0037804553285241127,
0.14918670058250427,
-0.03507878631353378,
-0.048150353133678436,
-0.12355633825063705,
0.10188248753547668,
-0.1007438376545906,
0.206479012966156,
0.007357841823250055,
0.08239957690238953,
0.09776555746793747,
0.02323235757648945,
-0.13490726053714752,
0.034034911543130875,
0.07413248717784882,
-0.11122461408376694,
0.013489846140146255,
0.1514429748058319,
-0.055817555636167526,
0.08087686449289322,
0.02604234218597412,
-0.10372836887836456,
-0.018681542947888374,
-0.02701829932630062,
-0.03204270452260971,
-0.07916325330734253,
-0.015188250690698624,
-0.06326913833618164,
0.16548769176006317,
0.21996691823005676,
-0.024544507265090942,
0.015869038179516792,
-0.08630608767271042,
0.01713457889854908,
0.04446308687329292,
0.05087520182132721,
-0.04366718605160713,
-0.20603680610656738,
0.030927561223506927,
0.021806834265589714,
0.023999901488423347,
-0.19568918645381927,
-0.08341196179389954,
0.046098947525024414,
-0.028614023700356483,
-0.05232914164662361,
0.09963012486696243,
0.02350926212966442,
0.04371105879545212,
-0.03461574390530586,
-0.09475787729024887,
-0.04384653642773628,
0.1439928114414215,
-0.16190694272518158,
-0.05404648184776306
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-10
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
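The data section is empty, but the repository name (`k-512`, `seed-10`) suggests training on a 512-example subset of the SQuAD training split. One plausible way such a subset could be drawn with `datasets` (an assumption, not documented in the card):
```python
from datasets import load_dataset

# Assumption: "k-512" / "seed-10" name a 512-example few-shot subset
# sampled from the SQuAD train split with seed 10.
squad_train = load_dataset("squad", split="train")
few_shot = squad_train.shuffle(seed=10).select(range(512))
print(few_shot.num_rows)  # 512
```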
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-512-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-10
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10019687563180923,
0.15129895508289337,
-0.0021241654176265,
0.09084924310445786,
0.13746048510074615,
0.03990688920021057,
0.09047441184520721,
0.134832501411438,
-0.07892819494009018,
0.07376674562692642,
0.07802602648735046,
0.05600525811314583,
0.053315211087465286,
0.12744149565696716,
-0.04702063277363777,
-0.20960117876529694,
0.013626058585941792,
-0.018175499513745308,
-0.05454424396157265,
0.09833437204360962,
0.08221855759620667,
-0.10535737127065659,
0.07963892817497253,
-0.020019711926579475,
-0.15146176517009735,
0.008889822289347649,
-0.03912218660116196,
-0.023638715967535973,
0.09487303346395493,
-0.008424975909292698,
0.0969843715429306,
0.009815801866352558,
0.14080817997455597,
-0.211683452129364,
-0.00022599393560085446,
0.07822569459676743,
0.03600586950778961,
0.08965414762496948,
0.04704590141773224,
0.026093848049640656,
0.050091929733753204,
-0.15422643721103668,
0.09146052598953247,
0.02597680129110813,
-0.07742177695035934,
-0.08969228714704514,
-0.09275902062654495,
0.020961681380867958,
0.08114556223154068,
0.08261577785015106,
0.008216612972319126,
0.1337934285402298,
-0.09122228622436523,
0.0832943245768547,
0.2010432630777359,
-0.2742118537425995,
-0.0664127767086029,
0.04904039576649666,
0.05540548264980316,
0.06673333793878555,
-0.11854526400566101,
-0.02841019444167614,
0.029708899557590485,
0.031101534143090248,
0.09436846524477005,
-0.01974511705338955,
-0.1373053789138794,
0.0010028990218415856,
-0.13337677717208862,
-0.013758435845375061,
0.10904722660779953,
0.049762971699237823,
-0.04453388974070549,
-0.06604193150997162,
-0.076706662774086,
-0.11023081094026566,
-0.025076974183321,
-0.022447943687438965,
0.05256951227784157,
-0.055649496614933014,
-0.05547945573925972,
-0.039242684841156006,
-0.0579695999622345,
-0.08795081824064255,
-0.0018113980768248439,
0.11682265251874924,
0.043737445026636124,
0.023552540689706802,
-0.03606127202510834,
0.10230212658643723,
-0.0011342588113620877,
-0.1330031156539917,
-0.01651899889111519,
0.008846603333950043,
-0.1243402436375618,
-0.05906865373253822,
-0.026175236329436302,
0.011385873891413212,
0.015066278167068958,
0.1506841629743576,
-0.03702642023563385,
0.08223515003919601,
0.01725117675960064,
-0.020696252584457397,
-0.011334184557199478,
0.14437951147556305,
-0.040426261723041534,
-0.05718621984124184,
0.004635311663150787,
0.0974171981215477,
-0.000018620616174302995,
-0.004570499062538147,
-0.07430092990398407,
-0.019570782780647278,
0.09345180541276932,
0.056035228073596954,
-0.05370664969086647,
0.03358052298426628,
-0.03118683397769928,
-0.02516765333712101,
0.024141447618603706,
-0.12411735206842422,
0.03749941289424896,
0.009209726005792618,
-0.0844818577170372,
-0.03764905780553818,
0.011162906885147095,
-0.015945332124829292,
-0.021191561594605446,
0.09016767144203186,
-0.0871545597910881,
-0.0059504383243620396,
-0.0780109092593193,
-0.07518968731164932,
0.016407005488872528,
-0.14588996767997742,
-0.014428720809519291,
-0.04756784811615944,
-0.2063450813293457,
-0.03929247707128525,
0.043267376720905304,
-0.07522878050804138,
-0.04055585712194443,
-0.059324879199266434,
-0.08435814082622528,
0.014165914617478848,
-0.003491276176646352,
0.1679762899875641,
-0.05938722565770149,
0.07634153962135315,
-0.015143983066082,
0.042617782950401306,
0.010644487105309963,
0.049038685858249664,
-0.08656149357557297,
0.02316044270992279,
-0.14156706631183624,
0.07773049175739288,
-0.09602906554937363,
0.013881259597837925,
-0.12420740723609924,
-0.0884738489985466,
0.03887776657938957,
-0.027978437021374702,
0.07074569165706635,
0.14394362270832062,
-0.19550777971744537,
0.0025859945453703403,
0.11639963090419769,
-0.04792841896414757,
-0.047502484172582626,
0.07679896801710129,
-0.05645640566945076,
0.030874885618686676,
0.05311177670955658,
0.1667034924030304,
0.0753304585814476,
-0.15048103034496307,
-0.018590040504932404,
0.023854995146393776,
0.0491664782166481,
0.003838528646156192,
0.04495390132069588,
0.005821886006742716,
0.02928897924721241,
0.0032597738318145275,
-0.07569364458322525,
-0.025972414761781693,
-0.08986540883779526,
-0.07147687673568726,
-0.052063949406147,
-0.08900527656078339,
0.021392954513430595,
0.012675397098064423,
0.020914072170853615,
-0.05805185064673424,
-0.1020120307803154,
0.1034969910979271,
0.12835951149463654,
-0.05230694264173508,
0.006632730830460787,
-0.06892356276512146,
0.02011171728372574,
-0.030240261927247047,
-0.03563373535871506,
-0.19497425854206085,
-0.13348208367824554,
0.05287666618824005,
-0.05457191914319992,
0.033845968544483185,
0.03523818776011467,
0.06490391492843628,
0.062052708119153976,
-0.032757945358753204,
-0.019442016258835793,
-0.07003083825111389,
-0.002998675685375929,
-0.11515389382839203,
-0.19838428497314453,
-0.06546779721975327,
-0.03343033418059349,
0.14081627130508423,
-0.20359699428081512,
0.0010145159903913736,
-0.0266937967389822,
0.12095719575881958,
0.01646166294813156,
-0.05884606018662453,
0.009394115768373013,
0.02949685975909233,
0.00634654751047492,
-0.10050264745950699,
0.040586698800325394,
0.006711015477776527,
-0.054600633680820465,
-0.06270571798086166,
-0.11140651255846024,
-0.006190374027937651,
0.05829406902194023,
0.08115940541028976,
-0.1045733168721199,
-0.007763539906591177,
-0.050357069820165634,
-0.044986240565776825,
-0.08624445647001266,
0.008962411433458328,
0.1945723593235016,
0.033086828887462616,
0.12009517848491669,
-0.061013028025627136,
-0.0722760483622551,
0.0016561512602493167,
0.02530590072274208,
0.023312129080295563,
0.09021250903606415,
0.10622508823871613,
-0.09823859483003616,
0.08864761143922806,
0.07084917277097702,
-0.04665190726518631,
0.11926672607660294,
-0.044542428106069565,
-0.08278964459896088,
-0.01952158659696579,
0.005148167721927166,
-0.033941689878702164,
0.1411692500114441,
-0.08501628786325455,
0.010107187554240227,
0.033907949924468994,
0.023769337683916092,
0.01738925091922283,
-0.16945555806159973,
0.00009246889385394752,
0.011553490534424782,
-0.06309166550636292,
-0.040780533105134964,
-0.02515890821814537,
0.035571616142988205,
0.09349363297224045,
0.02597811631858349,
-0.04383571818470955,
0.015709519386291504,
-0.012021574191749096,
-0.06937921792268753,
0.19218479096889496,
-0.10774527490139008,
-0.09048823267221451,
-0.10745788365602493,
0.02717406302690506,
-0.043015528470277786,
-0.03692420572042465,
0.004525115247815847,
-0.08321031183004379,
-0.055546294897794724,
-0.0882563665509224,
-0.02520587109029293,
-0.0158686526119709,
-0.002805408788844943,
0.025766082108020782,
-0.015055766329169273,
0.07022585719823837,
-0.13243040442466736,
0.0047114547342062,
-0.030181771144270897,
-0.10226219147443771,
0.007114263717085123,
0.06268820911645889,
0.08537525683641434,
0.10013396292924881,
-0.013848785310983658,
0.013466928154230118,
-0.028329212218523026,
0.23881936073303223,
-0.056530945003032684,
0.017448481172323227,
0.08260469138622284,
-0.006255842745304108,
0.06046459823846817,
0.1444827914237976,
0.0312538668513298,
-0.10419628769159317,
0.02527981624007225,
0.08233294636011124,
-0.010638820938766003,
-0.24952542781829834,
-0.027595985680818558,
-0.02136521227657795,
-0.07097925990819931,
0.08899673074483871,
0.039017289876937866,
-0.041923560202121735,
0.03926439583301544,
0.009016244672238827,
0.014857475645840168,
-0.05079445615410805,
0.07382238656282425,
0.08271732926368713,
0.04375742748379707,
0.09444060921669006,
-0.022459493950009346,
-0.011844892986118793,
0.0639265775680542,
0.01623408868908882,
0.27069923281669617,
-0.02707647532224655,
0.1263483464717865,
0.025087779387831688,
0.1459854394197464,
-0.030942223966121674,
0.05033278465270996,
0.01707746647298336,
0.004791476298123598,
-0.0057040853425860405,
-0.052707739174366,
-0.02819843590259552,
0.01057352777570486,
-0.03301520273089409,
0.03236820921301842,
-0.07197368144989014,
0.04444852098822594,
0.013829250819981098,
0.29907193779945374,
0.04662308096885681,
-0.28339165449142456,
-0.06054605543613434,
-0.0022723469883203506,
-0.047112323343753815,
-0.0751068964600563,
0.0026798381004482508,
0.14306901395320892,
-0.12584108114242554,
0.040605369955301285,
-0.05503389611840248,
0.08933725208044052,
-0.04647224768996239,
0.002401142381131649,
0.06374742090702057,
0.14466992020606995,
-0.010358905419707298,
0.06872400641441345,
-0.19142533838748932,
0.22581009566783905,
0.02866353467106819,
0.11221081763505936,
-0.06256759911775589,
0.0139816515147686,
0.009922860190272331,
0.02645069733262062,
0.10908842086791992,
-0.00027144912746734917,
-0.023911328986287117,
-0.17006650567054749,
-0.11553365737199783,
0.05805160850286484,
0.11672364175319672,
-0.015141577459871769,
0.09701205044984818,
-0.04037415608763695,
-0.003555664326995611,
0.036575403064489365,
-0.07503222674131393,
-0.12998662889003754,
-0.0854448601603508,
0.002822115318849683,
0.01763293333351612,
-0.03780870512127876,
-0.04814551770687103,
-0.09163586050271988,
-0.027015121653676033,
0.13424734771251678,
-0.003380316309630871,
-0.0431966632604599,
-0.13509337604045868,
0.05550302565097809,
0.14367784559726715,
-0.05723325163125992,
0.021518316119909286,
0.005221781320869923,
0.10169406235218048,
0.05051908642053604,
-0.08184905350208282,
0.05187879130244255,
-0.06699033826589584,
-0.1602734476327896,
-0.061186183243989944,
0.11738342046737671,
0.07955999672412872,
0.05361853912472725,
-0.0008918426465243101,
0.03204996511340141,
0.0009084322955459356,
-0.086133673787117,
0.006985269021242857,
0.055919043719768524,
0.08962317556142807,
0.04988207668066025,
-0.09371576458215714,
0.04454328119754791,
-0.035252925008535385,
-0.0012173937866464257,
0.12688301503658295,
0.2138698250055313,
-0.08538205176591873,
0.08775409311056137,
0.06916284561157227,
-0.07957430928945541,
-0.17578484117984772,
0.07158273458480835,
0.13231061398983002,
0.016169574111700058,
0.03570752963423729,
-0.20331402122974396,
0.13593575358390808,
0.11330677568912506,
-0.014625935815274715,
0.05502379313111305,
-0.29704976081848145,
-0.12378233671188354,
0.07089032977819443,
0.10421640425920486,
0.040797505527734756,
-0.12900184094905853,
-0.027692580595612526,
-0.011551332660019398,
-0.14877404272556305,
0.14554008841514587,
-0.07220599800348282,
0.12138494849205017,
-0.007631102576851845,
0.1217806488275528,
0.022922737523913383,
-0.04203878715634346,
0.1305117905139923,
0.07675313204526901,
0.08787854760885239,
-0.03939038887619972,
-0.004559756256639957,
0.05176360160112381,
-0.06670495867729187,
0.04957395792007446,
-0.04031131789088249,
0.064857117831707,
-0.16473932564258575,
0.00035879036295227706,
-0.08314228057861328,
0.043509021401405334,
-0.048718806356191635,
-0.04724888876080513,
-0.02826111763715744,
0.047184113413095474,
0.06707454472780228,
-0.03437446802854538,
0.03012627549469471,
0.022093607112765312,
0.056575994938611984,
0.0879833921790123,
0.09034118056297302,
-0.017985479906201363,
-0.10861939936876297,
0.011686976067721844,
-0.0063534509390592575,
0.05268535763025284,
-0.10191701352596283,
0.014966730959713459,
0.13557665050029755,
0.05866716429591179,
0.12724566459655762,
0.02555393986403942,
-0.033554211258888245,
-0.01600017584860325,
0.016099223867058754,
-0.1231655403971672,
-0.11829464137554169,
0.038693636655807495,
-0.04484152793884277,
-0.15643030405044556,
0.008080324158072472,
0.10002145171165466,
-0.03789978474378586,
-0.013643397018313408,
-0.011105850338935852,
0.024399397894740105,
-0.0124781783670187,
0.20311933755874634,
0.041604846715927124,
0.06319767981767654,
-0.10270840674638748,
0.12489877641201019,
0.057438209652900696,
-0.04649509862065315,
0.05626313015818596,
0.06817375868558884,
-0.0938427746295929,
-0.005545901134610176,
0.10948114097118378,
0.17182044684886932,
-0.042701270431280136,
-0.01963008940219879,
-0.07233321666717529,
-0.07466423511505127,
0.05812728404998779,
0.16331525146961212,
0.04908285662531853,
-0.007395837921649218,
-0.043038275092840195,
0.02662235125899315,
-0.12592551112174988,
0.0732303038239479,
0.04811522737145424,
0.06793369352817535,
-0.10635750740766525,
0.11200638115406036,
-0.009034289047122002,
0.03440055251121521,
-0.015118056908249855,
0.029561510309576988,
-0.09747111797332764,
-0.02567128837108612,
-0.12123104929924011,
0.01785317435860634,
-0.012783232145011425,
0.007155572529882193,
-0.011389074847102165,
-0.05386139824986458,
-0.03746401146054268,
0.02496485598385334,
-0.08087033033370972,
-0.05545056611299515,
0.01617506705224514,
0.043958671391010284,
-0.15492598712444305,
-0.012085330672562122,
0.022823316976428032,
-0.093108169734478,
0.0759011059999466,
0.06460611522197723,
0.0175778791308403,
0.02781246416270733,
-0.10884243994951248,
-0.04481605067849159,
0.01418033055961132,
0.028039321303367615,
0.08403347432613373,
-0.08752239495515823,
-0.014581505209207535,
-0.03391992673277855,
0.05027509853243828,
0.01470137108117342,
0.08701292425394058,
-0.11477163434028625,
-0.004027331247925758,
-0.05740079656243324,
-0.03366968780755997,
-0.06006299704313278,
0.03505392745137215,
0.11533981561660767,
0.033991504460573196,
0.16886869072914124,
-0.06580386310815811,
0.039282090961933136,
-0.19446779787540436,
-0.03403735160827637,
0.001392655074596405,
-0.04126802086830139,
-0.0834403932094574,
-0.04290527105331421,
0.09276629239320755,
-0.05130871757864952,
0.093236543238163,
-0.007965439930558205,
0.09039559960365295,
0.031055763363838196,
-0.003149725031107664,
-0.05787428468465805,
0.0024932848755270243,
0.14821818470954895,
0.05926106870174408,
-0.019437260925769806,
0.10101402550935745,
-0.006285306066274643,
0.05524181202054024,
0.041215844452381134,
0.2246885448694229,
0.14783047139644623,
-0.022251928225159645,
0.0644628182053566,
0.07031030207872391,
-0.12201344966888428,
-0.12367753684520721,
0.14161257445812225,
-0.04757937788963318,
0.12158846110105515,
-0.05531030148267746,
0.22338546812534332,
0.023776475340127945,
-0.17678594589233398,
0.05260229855775833,
-0.054771553725004196,
-0.12051401287317276,
-0.11880019307136536,
-0.016568684950470924,
-0.0803799033164978,
-0.09813118726015091,
0.025972213596105576,
-0.12123369425535202,
0.06472162157297134,
0.1181812658905983,
0.016320189461112022,
0.02100503258407116,
0.15485577285289764,
-0.04270196706056595,
0.01797603815793991,
0.06068425253033638,
0.024184217676520348,
-0.005885491147637367,
-0.062288105487823486,
-0.06371662020683289,
0.05066579952836037,
0.0337214358150959,
0.08617039024829865,
-0.047417737543582916,
0.01684059388935566,
0.032960083335638046,
-0.013346482068300247,
-0.07387632131576538,
0.012907768599689007,
0.025347160175442696,
0.039428070187568665,
0.05565603822469711,
0.05362715944647789,
0.017134834080934525,
-0.03602122515439987,
0.26551687717437744,
-0.07293473184108734,
-0.08574344962835312,
-0.1311889886856079,
0.1963656097650528,
0.01843108981847763,
-0.01909004896879196,
0.07534568011760712,
-0.10461162030696869,
-0.02093161828815937,
0.16872800886631012,
0.12438931316137314,
-0.09908626973628998,
-0.029848869889974594,
-0.015592330135405064,
-0.01210827101022005,
-0.03876782953739166,
0.1155838593840599,
0.09364912658929825,
0.0067258430644869804,
-0.07596851885318756,
-0.02637322060763836,
-0.015520981512963772,
-0.04644104838371277,
-0.06375417113304138,
0.03661554679274559,
0.01576758734881878,
0.0008839633665047586,
-0.037388838827610016,
0.054690033197402954,
-0.010005777701735497,
-0.24304543435573578,
0.03110678493976593,
-0.15675589442253113,
-0.18109852075576782,
-0.03169457986950874,
0.06455306708812714,
-0.0044567035511136055,
0.03891608864068985,
-0.017560584470629692,
0.0037822870071977377,
0.14812017977237701,
-0.03430190682411194,
-0.048464085906744,
-0.12363092601299286,
0.10198719799518585,
-0.10226491838693619,
0.20500825345516205,
0.007078561931848526,
0.08334916830062866,
0.09803303331136703,
0.021468650549650192,
-0.13438090682029724,
0.03431951254606247,
0.0744183138012886,
-0.10899709165096283,
0.015319837257266045,
0.15183040499687195,
-0.0553937591612339,
0.07784906774759293,
0.024993745610117912,
-0.10462933778762817,
-0.017936907708644867,
-0.025817018002271652,
-0.031578339636325836,
-0.08042700588703156,
-0.01246459037065506,
-0.06230755150318146,
0.1660284698009491,
0.21984991431236267,
-0.025114988908171654,
0.01656498573720455,
-0.08693571388721466,
0.016097744926810265,
0.04471290484070778,
0.05125144496560097,
-0.0436912439763546,
-0.20659610629081726,
0.030108805745840073,
0.017909293994307518,
0.024633008986711502,
-0.19392983615398407,
-0.08236671984195709,
0.04507921636104584,
-0.029970478266477585,
-0.05323033779859543,
0.09868516772985458,
0.025144053623080254,
0.04301299899816513,
-0.03415941447019577,
-0.09538908302783966,
-0.0441507026553154,
0.14530175924301147,
-0.1628185659646988,
-0.05344719812273979
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed. Judging by the repository name, this appears to be a few-shot run: bert-base-uncased fine-tuned for extractive question answering on a k = 512 example subsample of SQuAD drawn with seed 2 (the training seed below is 42).
## Intended uses & limitations
More information needed
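No intended uses are documented yet; as a minimal, unofficial sketch (not the author's recommended usage), the checkpoint should load for extractive question answering through the `transformers` pipeline API, using the repo id this card is published under:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for extractive QA.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2",
)

result = qa(
    question="What was the model fine-tuned on?",
    context="The model is a fine-tuned version of bert-base-uncased on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```

Being a few-shot checkpoint, it will likely trail a fully supervised SQuAD model, so treat its answers accordingly.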
## Training and evaluation data
More information needed; the card metadata lists SQuAD as the training dataset, and no separate evaluation data is reported.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (sketched as `TrainingArguments` after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
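A minimal sketch of how the list above maps onto `transformers.TrainingArguments`; the `output_dir` is an assumption, and in this Transformers version the Trainer's "Adam" is AdamW under the hood:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,             # betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,           # lr_scheduler_warmup_ratio
    num_train_epochs=10.0,
)
```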
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09896048903465271,
0.1525704562664032,
-0.0021925747860223055,
0.09122615307569504,
0.13698431849479675,
0.03995095193386078,
0.08917948603630066,
0.13551120460033417,
-0.08010555058717728,
0.07371117174625397,
0.07876118272542953,
0.05424270033836365,
0.052560340613126755,
0.12672977149486542,
-0.04701472073793411,
-0.20962285995483398,
0.013064729981124401,
-0.019701499491930008,
-0.05645009130239487,
0.09828635305166245,
0.08304326236248016,
-0.10441876947879791,
0.07994496077299118,
-0.019197331741452217,
-0.1532125174999237,
0.009033774957060814,
-0.03807871788740158,
-0.02231532707810402,
0.09485619515180588,
-0.007350957486778498,
0.09763666242361069,
0.010789645835757256,
0.14044766128063202,
-0.21227379143238068,
-0.000445468642283231,
0.07767176628112793,
0.03675922378897667,
0.0896172970533371,
0.04736916720867157,
0.026109768077731133,
0.0496474914252758,
-0.15297573804855347,
0.090103879570961,
0.026175888255238533,
-0.07726442813873291,
-0.09031935036182404,
-0.09237568080425262,
0.02088417112827301,
0.08075400441884995,
0.08152078837156296,
0.00926815252751112,
0.13225677609443665,
-0.09089789539575577,
0.08273801952600479,
0.19961819052696228,
-0.27431026101112366,
-0.06691908091306686,
0.04751875624060631,
0.05340488627552986,
0.0676480233669281,
-0.1171991154551506,
-0.027688898146152496,
0.029856927692890167,
0.03138769418001175,
0.09392158687114716,
-0.019564051181077957,
-0.13794517517089844,
0.0010715039679780602,
-0.13339921832084656,
-0.015589375980198383,
0.11000978946685791,
0.048779234290122986,
-0.04543110728263855,
-0.06334222108125687,
-0.0770830363035202,
-0.10775817930698395,
-0.0236525796353817,
-0.022781075909733772,
0.0529562309384346,
-0.053844649344682693,
-0.055104997009038925,
-0.04159623757004738,
-0.05993250757455826,
-0.08836961537599564,
-0.0032808477990329266,
0.11911329627037048,
0.0435529388487339,
0.02506379783153534,
-0.036260128021240234,
0.1039644330739975,
0.0010822969488799572,
-0.13251540064811707,
-0.01717977598309517,
0.009449547156691551,
-0.12337647378444672,
-0.05846192315220833,
-0.026622122153639793,
0.012127546593546867,
0.01591780036687851,
0.14911717176437378,
-0.03933969512581825,
0.0812777429819107,
0.015876363962888718,
-0.019247470423579216,
-0.011078838258981705,
0.14447633922100067,
-0.03819164261221886,
-0.05395497754216194,
0.005276311654597521,
0.0965859591960907,
0.00029064808040857315,
-0.004648847039788961,
-0.0745215117931366,
-0.020101971924304962,
0.09337262064218521,
0.056923672556877136,
-0.053328242152929306,
0.03158791735768318,
-0.03337996453046799,
-0.025233658030629158,
0.022044485434889793,
-0.12411117553710938,
0.03692558780312538,
0.009386599995195866,
-0.0829307958483696,
-0.03859105333685875,
0.012161683291196823,
-0.0158053208142519,
-0.019522493705153465,
0.08920355886220932,
-0.0860806480050087,
-0.004920339677482843,
-0.07716325670480728,
-0.07260001450777054,
0.015794461593031883,
-0.14214228093624115,
-0.01507199089974165,
-0.048649001866579056,
-0.20391662418842316,
-0.03758282586932182,
0.044045496731996536,
-0.07629308849573135,
-0.04317719489336014,
-0.05820600315928459,
-0.08299361169338226,
0.015102493576705456,
-0.004181237425655127,
0.16789257526397705,
-0.05959102138876915,
0.07734110951423645,
-0.015413504093885422,
0.04223814606666565,
0.010514986701309681,
0.04816918820142746,
-0.08578309416770935,
0.02413041703402996,
-0.14255835115909576,
0.07880230247974396,
-0.09518250823020935,
0.010286001488566399,
-0.1253300905227661,
-0.08771365135908127,
0.040000248700380325,
-0.028513891622424126,
0.06949727237224579,
0.14375026524066925,
-0.19437125325202942,
0.002729361644014716,
0.11577228456735611,
-0.04950908198952675,
-0.046826355159282684,
0.07856693863868713,
-0.05651593208312988,
0.03305338695645332,
0.0510454960167408,
0.16710086166858673,
0.07495984435081482,
-0.15042449533939362,
-0.01576787792146206,
0.026139264926314354,
0.048372283577919006,
0.005317282862961292,
0.046315327286720276,
0.004353204742074013,
0.025001192465424538,
0.0027893499936908484,
-0.0788947269320488,
-0.02593114599585533,
-0.09073152393102646,
-0.07201509922742844,
-0.052269887179136276,
-0.0887942835688591,
0.022529983893036842,
0.0103396475315094,
0.021260278299450874,
-0.057199325412511826,
-0.10044584423303604,
0.10325384140014648,
0.12923267483711243,
-0.05149124562740326,
0.008211418054997921,
-0.06936844438314438,
0.01987724006175995,
-0.030821766704320908,
-0.03653588145971298,
-0.19448243081569672,
-0.1341259926557541,
0.05359754338860512,
-0.054036110639572144,
0.03405112400650978,
0.03515071049332619,
0.06364285200834274,
0.061774659901857376,
-0.03272922709584236,
-0.019361823797225952,
-0.07022866606712341,
-0.003627241123467684,
-0.11459988355636597,
-0.1991611272096634,
-0.06412824243307114,
-0.0331999771296978,
0.14189817011356354,
-0.2025391310453415,
-0.00022379134315997362,
-0.02607038803398609,
0.12059290707111359,
0.016795342788100243,
-0.05822905898094177,
0.007992630824446678,
0.028843626379966736,
0.005820069927722216,
-0.09959658980369568,
0.040679868310689926,
0.008362533524632454,
-0.05546140298247337,
-0.05987454205751419,
-0.10857754200696945,
-0.004287936259061098,
0.056709662079811096,
0.08015243709087372,
-0.10516956448554993,
-0.008331558667123318,
-0.05046014115214348,
-0.046039748936891556,
-0.08778123557567596,
0.009300461038947105,
0.19594398140907288,
0.03254972770810127,
0.12067074328660965,
-0.06135949864983559,
-0.07119569927453995,
0.0007907753461040556,
0.02290605753660202,
0.022637614980340004,
0.08967417478561401,
0.10648936778306961,
-0.10022599250078201,
0.08745121955871582,
0.07096128165721893,
-0.04748564213514328,
0.11922593414783478,
-0.04437382519245148,
-0.08224190026521683,
-0.02118493989109993,
0.007308514788746834,
-0.03369244560599327,
0.14040891826152802,
-0.08365296572446823,
0.012228668667376041,
0.033664435148239136,
0.024698244407773018,
0.017515089362859726,
-0.17010009288787842,
-0.0003877005947288126,
0.012306326068937778,
-0.06491052359342575,
-0.03806307911872864,
-0.02539004199206829,
0.03647533059120178,
0.09402912855148315,
0.026335103437304497,
-0.043559953570365906,
0.017209649085998535,
-0.011756537482142448,
-0.07026827335357666,
0.19168798625469208,
-0.10816092789173126,
-0.09171050041913986,
-0.10977213084697723,
0.02816464938223362,
-0.04333861172199249,
-0.036469291895627975,
0.004900800995528698,
-0.08170995861291885,
-0.054913632571697235,
-0.08891789615154266,
-0.02368106134235859,
-0.017407754436135292,
-0.0020512868650257587,
0.025537066161632538,
-0.015272366814315319,
0.07269003242254257,
-0.13183444738388062,
0.004500729031860828,
-0.029547108337283134,
-0.10390126705169678,
0.0081814955919981,
0.06339468061923981,
0.08486423641443253,
0.09941355139017105,
-0.013275633566081524,
0.013034014962613583,
-0.02753310464322567,
0.23935729265213013,
-0.05523629114031792,
0.016445765271782875,
0.08309023827314377,
-0.008073221892118454,
0.061132289469242096,
0.14381898939609528,
0.030688559636473656,
-0.10431862622499466,
0.025953762233257294,
0.08163107186555862,
-0.011038008145987988,
-0.24897339940071106,
-0.0284225232899189,
-0.020724663510918617,
-0.0707658901810646,
0.08836710453033447,
0.039247170090675354,
-0.04642176628112793,
0.03808034583926201,
0.010029531084001064,
0.013381707482039928,
-0.0520780123770237,
0.07355021685361862,
0.0843670442700386,
0.04353627189993858,
0.09433440864086151,
-0.021902009844779968,
-0.011848367750644684,
0.06583738327026367,
0.01648692786693573,
0.27008160948753357,
-0.027658648788928986,
0.12561944127082825,
0.026215050369501114,
0.14808818697929382,
-0.030173281207680702,
0.04741824045777321,
0.01739512011408806,
0.005407729651778936,
-0.0065496391616761684,
-0.05233270674943924,
-0.029696788638830185,
0.011068694293498993,
-0.03195236995816231,
0.03250180929899216,
-0.07279933243989944,
0.047076787799596786,
0.013146075420081615,
0.3009730875492096,
0.045938827097415924,
-0.2810386121273041,
-0.06019127368927002,
-0.0018683809321373701,
-0.04651669040322304,
-0.07471029460430145,
0.003055026987567544,
0.14578765630722046,
-0.12702466547489166,
0.03798813000321388,
-0.05411168932914734,
0.08837953954935074,
-0.04741634428501129,
0.001610225299373269,
0.06261120736598969,
0.1440889835357666,
-0.009700875729322433,
0.06894052773714066,
-0.18947921693325043,
0.226405531167984,
0.02852742373943329,
0.1103932186961174,
-0.06035618111491203,
0.01408607978373766,
0.009845548309385777,
0.02755713276565075,
0.10907329618930817,
0.0009013321832753718,
-0.025028733536601067,
-0.17028950154781342,
-0.11681237816810608,
0.05810396000742912,
0.11713329702615738,
-0.01726331003010273,
0.09594956785440445,
-0.041293296962976456,
-0.0022203929256647825,
0.036234308034181595,
-0.07376648485660553,
-0.12914054095745087,
-0.08486288040876389,
0.002256471896544099,
0.016948333010077477,
-0.03809606656432152,
-0.04929003119468689,
-0.09163624048233032,
-0.022496649995446205,
0.13634897768497467,
-0.0037354726810008287,
-0.04340695962309837,
-0.13446609675884247,
0.0543861910700798,
0.14274707436561584,
-0.05839828774333,
0.020098760724067688,
0.004934961907565594,
0.10236785560846329,
0.05104810371994972,
-0.08193442225456238,
0.05160623416304588,
-0.06715316325426102,
-0.16111771762371063,
-0.06142124906182289,
0.11740478128194809,
0.0797603651881218,
0.053861405700445175,
-0.0009413051302544773,
0.03152628242969513,
0.0018293826142325997,
-0.08618604391813278,
0.007994327694177628,
0.056871525943279266,
0.08968495577573776,
0.05036772042512894,
-0.09363982826471329,
0.047783177345991135,
-0.034837059676647186,
-0.0034498630557209253,
0.12854638695716858,
0.217262402176857,
-0.08621593564748764,
0.08931028842926025,
0.06832049787044525,
-0.08079566061496735,
-0.17621539533138275,
0.06985089927911758,
0.13399696350097656,
0.015516777522861958,
0.037847407162189484,
-0.2041638344526291,
0.1356952041387558,
0.11329668015241623,
-0.016035865992307663,
0.05380372330546379,
-0.2982475459575653,
-0.12393644452095032,
0.06964975595474243,
0.10403447598218918,
0.043810710310935974,
-0.12780733406543732,
-0.028063546866178513,
-0.00981133058667183,
-0.14799900352954865,
0.14457851648330688,
-0.07002902030944824,
0.12146817892789841,
-0.007735132239758968,
0.1229071170091629,
0.023012535646557808,
-0.04086962342262268,
0.13293087482452393,
0.07511954009532928,
0.08652495592832565,
-0.03892330080270767,
-0.003283581929281354,
0.048816800117492676,
-0.06737486273050308,
0.04808460548520088,
-0.03994075581431389,
0.06594525277614594,
-0.16598033905029297,
0.00026189029449597,
-0.08269107341766357,
0.04359126091003418,
-0.04926004633307457,
-0.04705997556447983,
-0.028596950694918633,
0.046827781945466995,
0.06715118139982224,
-0.03401912748813629,
0.028724610805511475,
0.022254327312111855,
0.0546577014029026,
0.09296517074108124,
0.08947992324829102,
-0.016724711284041405,
-0.1092996671795845,
0.010240268893539906,
-0.006537417881190777,
0.05299123004078865,
-0.10064227133989334,
0.01652168482542038,
0.13452357053756714,
0.05766476318240166,
0.12735705077648163,
0.025512252002954483,
-0.03360888361930847,
-0.014683995395898819,
0.015953144058585167,
-0.12466071546077728,
-0.11977624148130417,
0.03773465380072594,
-0.04389285296201706,
-0.1565195769071579,
0.005159350577741861,
0.10248719155788422,
-0.03726267069578171,
-0.013239732012152672,
-0.010839066468179226,
0.0249004103243351,
-0.01309062447398901,
0.2012481391429901,
0.04038092866539955,
0.0630275160074234,
-0.10146450251340866,
0.12493161857128143,
0.05735764652490616,
-0.044854555279016495,
0.056237172335386276,
0.06641333550214767,
-0.09265051037073135,
-0.004882903303951025,
0.10986120998859406,
0.17187869548797607,
-0.04552818462252617,
-0.019290078431367874,
-0.07200140506029129,
-0.07478519529104233,
0.05727017670869827,
0.16154049336910248,
0.05008179694414139,
-0.006394519470632076,
-0.043263182044029236,
0.02613610401749611,
-0.12470318377017975,
0.07324833422899246,
0.049126241356134415,
0.06793814152479172,
-0.10697328299283981,
0.11242903769016266,
-0.009470007382333279,
0.035856179893016815,
-0.01549078244715929,
0.028165407478809357,
-0.09716138988733292,
-0.025803212076425552,
-0.12282852828502655,
0.020501302555203438,
-0.01015810389071703,
0.00708266394212842,
-0.01005313079804182,
-0.054664164781570435,
-0.03754379227757454,
0.025817450135946274,
-0.08113760501146317,
-0.05519051104784012,
0.015912577509880066,
0.045174259692430496,
-0.15462803840637207,
-0.01288582757115364,
0.024409176781773567,
-0.09351783990859985,
0.07714413851499557,
0.06534574925899506,
0.018063044175505638,
0.02707984298467636,
-0.10699813812971115,
-0.04510217532515526,
0.014067392796278,
0.028086388483643532,
0.0831775963306427,
-0.0883057564496994,
-0.015764541923999786,
-0.03361646458506584,
0.04938320443034172,
0.014720261096954346,
0.09039397537708282,
-0.11543577909469604,
-0.0031559898052364588,
-0.0574333481490612,
-0.03460747376084328,
-0.059620779007673264,
0.0335201732814312,
0.11450278013944626,
0.03663716837763786,
0.16878637671470642,
-0.06739930063486099,
0.03914467617869377,
-0.19416941702365875,
-0.03365321829915047,
0.00154290406499058,
-0.03906094282865524,
-0.08463627099990845,
-0.04329903423786163,
0.09141064435243607,
-0.051317308098077774,
0.0920187309384346,
-0.007850034162402153,
0.08814462274312973,
0.03133397921919823,
-0.006100561004132032,
-0.055543433874845505,
0.0016686638118699193,
0.1467006951570511,
0.05914546176791191,
-0.018710795789957047,
0.10201012343168259,
-0.007920267060399055,
0.05607221648097038,
0.04090427607297897,
0.2268306016921997,
0.1475781500339508,
-0.021926304325461388,
0.0642741248011589,
0.06969840824604034,
-0.12084785848855972,
-0.12609942257404327,
0.1401122808456421,
-0.04720032587647438,
0.12125783413648605,
-0.05369662493467331,
0.22151154279708862,
0.0242452472448349,
-0.17750556766986847,
0.05122889578342438,
-0.05462505295872688,
-0.12043353915214539,
-0.12047421932220459,
-0.015386846847832203,
-0.08134916424751282,
-0.0987972617149353,
0.025393836200237274,
-0.12125399708747864,
0.06550829857587814,
0.11803292483091354,
0.015571324154734612,
0.021553929895162582,
0.1522049456834793,
-0.04216644912958145,
0.01771533489227295,
0.0599549375474453,
0.023652493953704834,
-0.004968385212123394,
-0.06127346679568291,
-0.06522779911756516,
0.05053035914897919,
0.03573209419846535,
0.08548583090305328,
-0.04621020704507828,
0.018053852021694183,
0.031716689467430115,
-0.014629310928285122,
-0.07401849329471588,
0.01224654819816351,
0.02578665316104889,
0.03904418647289276,
0.05460406839847565,
0.05423756688833237,
0.01731692999601364,
-0.03525242581963539,
0.26723915338516235,
-0.07221081852912903,
-0.08480261266231537,
-0.13079752027988434,
0.19921894371509552,
0.016276516020298004,
-0.018230101093649864,
0.07604940235614777,
-0.10359815508127213,
-0.023131053894758224,
0.1668596714735031,
0.12329896539449692,
-0.1026468500494957,
-0.02935182861983776,
-0.016487494111061096,
-0.011973035521805286,
-0.037797246128320694,
0.1175597533583641,
0.09293139725923538,
0.008060707710683346,
-0.07695503532886505,
-0.027344753965735435,
-0.01740707829594612,
-0.04458590969443321,
-0.06454966962337494,
0.03670312464237213,
0.014352994970977306,
0.0021981324534863234,
-0.038628190755844116,
0.05302206426858902,
-0.010999244637787342,
-0.24243012070655823,
0.030739257112145424,
-0.1560664027929306,
-0.18140435218811035,
-0.030618911609053612,
0.06531023979187012,
-0.006675512995570898,
0.038689177483320236,
-0.019366944208741188,
0.003827463136985898,
0.14904356002807617,
-0.035210683941841125,
-0.04837603121995926,
-0.12206985801458359,
0.10299120098352432,
-0.10207558423280716,
0.2064913809299469,
0.008180767297744751,
0.08390315622091293,
0.09692217409610748,
0.02254580147564411,
-0.1351996660232544,
0.03335266932845116,
0.07381395250558853,
-0.10920532047748566,
0.014553550630807877,
0.15226738154888153,
-0.055476799607276917,
0.07967071980237961,
0.026822667568922043,
-0.10382917523384094,
-0.019237732514739037,
-0.026185588911175728,
-0.03163585811853409,
-0.07971859723329544,
-0.015361916273832321,
-0.06449729204177856,
0.16489432752132416,
0.21933604776859283,
-0.025346972048282623,
0.01690429449081421,
-0.08604616671800613,
0.01677265390753746,
0.04486215114593506,
0.05340868979692459,
-0.042892936617136,
-0.20623816549777985,
0.02972445823252201,
0.020090730860829353,
0.024226972833275795,
-0.19503293931484222,
-0.08434601128101349,
0.04530324414372444,
-0.02871193177998066,
-0.05232631042599678,
0.10025361180305481,
0.023637238889932632,
0.04284542426466942,
-0.03355925902724266,
-0.09701137989759445,
-0.04463896155357361,
0.14435286819934845,
-0.16182750463485718,
-0.05369797721505165
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed. As with the other seeds in this series, the repository name suggests bert-base-uncased fine-tuned for extractive question answering on a k = 512 example few-shot subsample of SQuAD, here drawn with seed 4.
## Intended uses & limitations
More information needed
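As with the other seeds, a quick unofficial usage sketch via the pipeline API (the repo id is taken from this card):

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4",
)
print(qa(question="Which dataset was used?", context="The model was fine-tuned on SQuAD."))
```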
## Training and evaluation data
More information needed; as with the other seeds, SQuAD is the only dataset named in the card metadata, and no evaluation split is reported.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (sketched as `TrainingArguments` after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
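The configuration matches the seed-2 run; a compact sketch with an assumed `output_dir` and `set_seed` mirroring the reported training seed:

```python
from transformers import TrainingArguments, set_seed

set_seed(42)  # reported training seed
training_args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```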
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09966001659631729,
0.15220943093299866,
-0.002165052341297269,
0.09215453267097473,
0.13804569840431213,
0.04054972156882286,
0.08960509300231934,
0.13466548919677734,
-0.08075296133756638,
0.07343575358390808,
0.0786828026175499,
0.05385884642601013,
0.052196282893419266,
0.126234233379364,
-0.04678627476096153,
-0.20935961604118347,
0.012704918161034584,
-0.0192357636988163,
-0.056068360805511475,
0.09804561734199524,
0.0820346549153328,
-0.10505608469247818,
0.08053681999444962,
-0.019672470167279243,
-0.15352873504161835,
0.00960222166031599,
-0.038497861474752426,
-0.022706039249897003,
0.09477122873067856,
-0.0074058836326003075,
0.0975036695599556,
0.010169883258640766,
0.14042021334171295,
-0.21041348576545715,
-0.00026746554067358375,
0.07774300873279572,
0.036039236932992935,
0.08926691114902496,
0.04716551676392555,
0.026001039892435074,
0.04875082150101662,
-0.15313035249710083,
0.09030001610517502,
0.025933530181646347,
-0.07735767215490341,
-0.09127272665500641,
-0.09212630242109299,
0.02008129470050335,
0.07996765524148941,
0.08263108134269714,
0.008864750154316425,
0.131179079413414,
-0.09113243222236633,
0.08294201642274857,
0.197071373462677,
-0.2752186357975006,
-0.06713058799505234,
0.04754006862640381,
0.05322122946381569,
0.06790133565664291,
-0.11811918020248413,
-0.027205146849155426,
0.030074773356318474,
0.032353196293115616,
0.09411422163248062,
-0.01986590027809143,
-0.13778512179851532,
0.0013736315304413438,
-0.13285449147224426,
-0.014064356684684753,
0.1111459955573082,
0.04903457313776016,
-0.04532456398010254,
-0.06411879509687424,
-0.0759543627500534,
-0.10884923487901688,
-0.024030182510614395,
-0.021734291687607765,
0.05320807546377182,
-0.05501750856637955,
-0.05559831112623215,
-0.03966689109802246,
-0.05957271158695221,
-0.08717699348926544,
-0.003048204816877842,
0.11760546267032623,
0.043406158685684204,
0.024852050468325615,
-0.03575218468904495,
0.10303238779306412,
0.0012051158118993044,
-0.13233454525470734,
-0.016187047585844994,
0.008980801329016685,
-0.12272998690605164,
-0.05845564231276512,
-0.027013063430786133,
0.014030718244612217,
0.016176780685782433,
0.1495962142944336,
-0.0393727570772171,
0.08214591443538666,
0.016575787216424942,
-0.0204132292419672,
-0.011112288571894169,
0.14284686744213104,
-0.03953510895371437,
-0.05557180941104889,
0.005002637393772602,
0.09698481112718582,
-0.00039002360426820815,
-0.004911903757601976,
-0.07433242350816727,
-0.01938813365995884,
0.09220698475837708,
0.05636081099510193,
-0.05456320196390152,
0.03339283913373947,
-0.032197125256061554,
-0.024928508326411247,
0.022647690027952194,
-0.12442096322774887,
0.03656330332159996,
0.00949500035494566,
-0.08315524458885193,
-0.03772253915667534,
0.011465087532997131,
-0.016762226819992065,
-0.02019207552075386,
0.08924215286970139,
-0.08705513924360275,
-0.005969824269413948,
-0.07743440568447113,
-0.07344028353691101,
0.015321916900575161,
-0.14180201292037964,
-0.014116620644927025,
-0.04824189096689224,
-0.20377400517463684,
-0.03892144188284874,
0.043949078768491745,
-0.0762372836470604,
-0.042043328285217285,
-0.05797605589032173,
-0.08368546515703201,
0.014769353903830051,
-0.004116421099752188,
0.16868270933628082,
-0.060010410845279694,
0.07636011391878128,
-0.013927878811955452,
0.04237302392721176,
0.010204142890870571,
0.0482030063867569,
-0.08522050827741623,
0.023359447717666626,
-0.14260464906692505,
0.07768642902374268,
-0.09586871415376663,
0.011908934451639652,
-0.1240435540676117,
-0.08906842768192291,
0.04084507375955582,
-0.02766989916563034,
0.06892213970422745,
0.14354756474494934,
-0.19336554408073425,
0.002385365078225732,
0.11482873558998108,
-0.048611391335725784,
-0.046989940106868744,
0.07802224159240723,
-0.05657326802611351,
0.03147205337882042,
0.05160949006676674,
0.16711470484733582,
0.07509761303663254,
-0.15020303428173065,
-0.017121881246566772,
0.024944312870502472,
0.04861777648329735,
0.005018238443881273,
0.04495386406779289,
0.0051955003291368484,
0.02539741061627865,
0.0035454381722956896,
-0.07739879190921783,
-0.026046667248010635,
-0.08993244171142578,
-0.07160119712352753,
-0.05150625854730606,
-0.08870459347963333,
0.02233649604022503,
0.010805407539010048,
0.021277068182826042,
-0.058218274265527725,
-0.10068118572235107,
0.10255446285009384,
0.12891080975532532,
-0.05198251083493233,
0.006634147837758064,
-0.06988196074962616,
0.020404282957315445,
-0.030817218124866486,
-0.036036621779203415,
-0.193924218416214,
-0.1352517455816269,
0.05255378782749176,
-0.0520625039935112,
0.03371346369385719,
0.03537900745868683,
0.06453011184930801,
0.062334220856428146,
-0.03317844122648239,
-0.019651249051094055,
-0.06934987753629684,
-0.00317585957236588,
-0.11426711082458496,
-0.19944466650485992,
-0.06457191705703735,
-0.03283195197582245,
0.14234529435634613,
-0.203607976436615,
0.0001841207267716527,
-0.025221632793545723,
0.12023179978132248,
0.016392210498452187,
-0.057909104973077774,
0.008831894025206566,
0.029860176146030426,
0.006860300898551941,
-0.09950869530439377,
0.04114740714430809,
0.008146759122610092,
-0.05401112884283066,
-0.06125500053167343,
-0.10917924344539642,
-0.002547134179621935,
0.058039698749780655,
0.07942721247673035,
-0.10546071827411652,
-0.008170080371201038,
-0.05017293989658356,
-0.045956388115882874,
-0.08627726137638092,
0.008929876610636711,
0.1970568597316742,
0.031860653311014175,
0.12055668979883194,
-0.06052636727690697,
-0.07086396217346191,
0.001350529957562685,
0.02441892959177494,
0.023600706830620766,
0.08940600603818893,
0.10541684925556183,
-0.09763853996992111,
0.08774875849485397,
0.07014866918325424,
-0.04737154394388199,
0.11989672482013702,
-0.044796258211135864,
-0.08185828477144241,
-0.020526494830846786,
0.006223208270967007,
-0.03404749929904938,
0.14133663475513458,
-0.08448724448680878,
0.01097305491566658,
0.033483877778053284,
0.024210063740611076,
0.01778339222073555,
-0.1691184937953949,
-0.000231937097851187,
0.011296060867607594,
-0.06396014243364334,
-0.03916468471288681,
-0.0250251442193985,
0.036011043936014175,
0.09376409649848938,
0.02645948901772499,
-0.04444193094968796,
0.01724371127784252,
-0.011772391386330128,
-0.06986042857170105,
0.19183386862277985,
-0.10847464948892593,
-0.09081920236349106,
-0.10973808169364929,
0.026527373120188713,
-0.04441426694393158,
-0.03687955439090729,
0.004396634642034769,
-0.08339981734752655,
-0.05544544756412506,
-0.08854059129953384,
-0.024801690131425858,
-0.017318399623036385,
-0.0022517507895827293,
0.02561561018228531,
-0.015422639437019825,
0.07247995585203171,
-0.1316290944814682,
0.004571620374917984,
-0.02978774718940258,
-0.10409142822027206,
0.008548418991267681,
0.06398473680019379,
0.08510599285364151,
0.09870053827762604,
-0.012977675534784794,
0.01307336613535881,
-0.0273914635181427,
0.24022479355335236,
-0.055464860051870346,
0.016777418553829193,
0.08318735659122467,
-0.007139922119677067,
0.05995578318834305,
0.14368131756782532,
0.03174428269267082,
-0.10489795356988907,
0.02565295435488224,
0.08218108117580414,
-0.010469658300280571,
-0.2489979863166809,
-0.027786821126937866,
-0.021290019154548645,
-0.07099706679582596,
0.08847323805093765,
0.03927362337708473,
-0.04569166526198387,
0.0385587140917778,
0.010914985090494156,
0.0147231575101614,
-0.05213556066155434,
0.07318290323019028,
0.08630907535552979,
0.04307786747813225,
0.09491388499736786,
-0.022168811410665512,
-0.011853022500872612,
0.06524203717708588,
0.016775010153651237,
0.2709018886089325,
-0.027774330228567123,
0.12553425133228302,
0.026496684178709984,
0.14792662858963013,
-0.02992030791938305,
0.04849877953529358,
0.01676376536488533,
0.004864733666181564,
-0.006110796704888344,
-0.052436672151088715,
-0.02851644903421402,
0.010231032967567444,
-0.03335561975836754,
0.031879983842372894,
-0.07305830717086792,
0.04520178586244583,
0.013250493444502354,
0.30029183626174927,
0.04561356082558632,
-0.2832711338996887,
-0.06112014502286911,
-0.0025319717824459076,
-0.04614933207631111,
-0.07427556067705154,
0.0032466037664562464,
0.14520485699176788,
-0.12652887403964996,
0.03917689993977547,
-0.054091066122055054,
0.0879734605550766,
-0.04762985557317734,
0.0025210806634277105,
0.06469283998012543,
0.1448581963777542,
-0.010452811606228352,
0.06840676069259644,
-0.1895495504140854,
0.22489696741104126,
0.028785832226276398,
0.11117812246084213,
-0.060671187937259674,
0.014277487993240356,
0.010537032037973404,
0.028604790568351746,
0.10864710807800293,
0.00040300097316503525,
-0.024580879136919975,
-0.17034049332141876,
-0.11578671634197235,
0.05920439586043358,
0.11649778485298157,
-0.01610378362238407,
0.09661505371332169,
-0.04044679179787636,
-0.0027193191926926374,
0.035854026675224304,
-0.07539072632789612,
-0.12945306301116943,
-0.08581603318452835,
0.0018252598820254207,
0.017781658098101616,
-0.037475332617759705,
-0.04873034730553627,
-0.09218013286590576,
-0.02420111559331417,
0.13567040860652924,
-0.002865725662559271,
-0.04361303523182869,
-0.13498662412166595,
0.05331111699342728,
0.14289774000644684,
-0.057163745164871216,
0.0204527098685503,
0.005629686638712883,
0.10155630856752396,
0.052106041461229324,
-0.08173253387212753,
0.051732465624809265,
-0.06760963052511215,
-0.16033931076526642,
-0.06145937368273735,
0.11688801646232605,
0.07919981330633163,
0.053035933524370193,
-0.001258792937733233,
0.03190884739160538,
0.0011750234989449382,
-0.08680959790945053,
0.008744730614125729,
0.05499257892370224,
0.09124116599559784,
0.049512334167957306,
-0.09459221363067627,
0.04724783077836037,
-0.034135330468416214,
-0.0025350037030875683,
0.12640784680843353,
0.21451468765735626,
-0.08537665754556656,
0.08761562407016754,
0.06888794153928757,
-0.08027439564466476,
-0.1755870133638382,
0.07095903158187866,
0.1325872391462326,
0.016017083078622818,
0.036179158836603165,
-0.20511756837368011,
0.13708263635635376,
0.11314617842435837,
-0.01503576897084713,
0.05594577640295029,
-0.29511794447898865,
-0.1234029084444046,
0.06945284456014633,
0.10479096323251724,
0.04396124556660652,
-0.1282176822423935,
-0.02746015414595604,
-0.009982014074921608,
-0.148260697722435,
0.14360396564006805,
-0.07279561460018158,
0.12141499668359756,
-0.008079038001596928,
0.12325935065746307,
0.022306568920612335,
-0.041117846965789795,
0.13212327659130096,
0.07608543336391449,
0.08721240609884262,
-0.039292171597480774,
-0.002833222271874547,
0.04890643432736397,
-0.06655426323413849,
0.048123713582754135,
-0.04065606743097305,
0.06534260511398315,
-0.16671797633171082,
0.00034680255339480937,
-0.08328348398208618,
0.043267957866191864,
-0.04921264201402664,
-0.046689342707395554,
-0.027583876624703407,
0.0470605194568634,
0.06607058644294739,
-0.03424680978059769,
0.027470670640468597,
0.021748879924416542,
0.05543002486228943,
0.09202290326356888,
0.09029625356197357,
-0.017954645678400993,
-0.11022771149873734,
0.011349568143486977,
-0.0070024821907281876,
0.05285274237394333,
-0.10028666257858276,
0.015385285951197147,
0.1353379786014557,
0.0570400096476078,
0.1271010935306549,
0.02612900920212269,
-0.0325230173766613,
-0.01516580767929554,
0.01711028441786766,
-0.1250055432319641,
-0.11830759048461914,
0.03802631050348282,
-0.04769841209053993,
-0.15635517239570618,
0.006389346439391375,
0.10220877081155777,
-0.037059396505355835,
-0.01299059484153986,
-0.010884345509111881,
0.024617578834295273,
-0.013165687210857868,
0.20202726125717163,
0.04069337621331215,
0.06267828494310379,
-0.1025850921869278,
0.12467391043901443,
0.05746536701917648,
-0.04587482288479805,
0.05610952898859978,
0.0676707923412323,
-0.09380531311035156,
-0.0057639568112790585,
0.1085953414440155,
0.1733095645904541,
-0.04393785446882248,
-0.020032957196235657,
-0.07308433204889297,
-0.07527095079421997,
0.05720575526356697,
0.16001231968402863,
0.04940120875835419,
-0.007240422070026398,
-0.04340599849820137,
0.025812799111008644,
-0.12525323033332825,
0.07290199398994446,
0.0478348471224308,
0.06807563453912735,
-0.10677378624677658,
0.1118406280875206,
-0.009196361526846886,
0.034962207078933716,
-0.015490993857383728,
0.028709614649415016,
-0.09766937047243118,
-0.026168406009674072,
-0.1228296235203743,
0.019555021077394485,
-0.010733161121606827,
0.007675709202885628,
-0.010533052496612072,
-0.05356600135564804,
-0.038215041160583496,
0.026105636730790138,
-0.08116709440946579,
-0.0547514408826828,
0.017602527514100075,
0.04564778506755829,
-0.15401820838451385,
-0.012269148603081703,
0.023315630853176117,
-0.09368530660867691,
0.07710755616426468,
0.06537647545337677,
0.017873041331768036,
0.02773183584213257,
-0.10484474152326584,
-0.04544182866811752,
0.013417410664260387,
0.0274751428514719,
0.08380039036273956,
-0.08779881149530411,
-0.015147981233894825,
-0.03377440944314003,
0.049741920083761215,
0.014961875975131989,
0.08958198875188828,
-0.1149221733212471,
-0.002684369683265686,
-0.056523457169532776,
-0.03358052670955658,
-0.0603463351726532,
0.034029021859169006,
0.11558514088392258,
0.03554188460111618,
0.16887690126895905,
-0.06751292198896408,
0.038329191505908966,
-0.19441814720630646,
-0.034069787710905075,
0.0013257261598482728,
-0.04051951318979263,
-0.08407057821750641,
-0.042942773550748825,
0.09219774603843689,
-0.051980793476104736,
0.09366534650325775,
-0.008188367821276188,
0.08814318478107452,
0.031133420765399933,
-0.006072328891605139,
-0.05719812959432602,
0.0022951632272452116,
0.14647114276885986,
0.05877910926938057,
-0.019422927871346474,
0.10122380405664444,
-0.006980814505368471,
0.05505213886499405,
0.04211560636758804,
0.2250090092420578,
0.14681094884872437,
-0.021185435354709625,
0.06406230479478836,
0.07047390937805176,
-0.12171704322099686,
-0.12512849271297455,
0.14133675396442413,
-0.04712661728262901,
0.12205696105957031,
-0.054729923605918884,
0.21988819539546967,
0.023698318749666214,
-0.17668965458869934,
0.051519040018320084,
-0.0557587668299675,
-0.1205553188920021,
-0.1191430538892746,
-0.014051279053092003,
-0.08057130128145218,
-0.09930123388767242,
0.02558492310345173,
-0.12183596193790436,
0.06456919759511948,
0.11914625018835068,
0.016252299770712852,
0.020987175405025482,
0.1537276953458786,
-0.04128335416316986,
0.018310507759451866,
0.06009745970368385,
0.02346956729888916,
-0.005267795640975237,
-0.06072842702269554,
-0.06441469490528107,
0.04980854317545891,
0.03421756252646446,
0.08519703894853592,
-0.046812523156404495,
0.017563773319125175,
0.032217010855674744,
-0.013715646229684353,
-0.07352891564369202,
0.012273364700376987,
0.02580481953918934,
0.0387406200170517,
0.055680010467767715,
0.05368656665086746,
0.016755135729908943,
-0.035511862486600876,
0.2659822404384613,
-0.07220644503831863,
-0.08327852189540863,
-0.13165751099586487,
0.19845330715179443,
0.01617288775742054,
-0.018808752298355103,
0.0752326026558876,
-0.10300939530134201,
-0.022025374695658684,
0.16834795475006104,
0.1252242624759674,
-0.1020842120051384,
-0.029591167345643044,
-0.01588175818324089,
-0.011957569047808647,
-0.03764400631189346,
0.1178724616765976,
0.09289658069610596,
0.00762938940897584,
-0.07663208991289139,
-0.027271952480077744,
-0.01686769351363182,
-0.04443778470158577,
-0.06432624906301498,
0.03655987232923508,
0.015092923305928707,
0.0018676294712349772,
-0.038086120039224625,
0.05312030762434006,
-0.010689923539757729,
-0.24298760294914246,
0.03188848868012428,
-0.15569210052490234,
-0.18190670013427734,
-0.031532786786556244,
0.06449554860591888,
-0.005829358007758856,
0.03937122970819473,
-0.018777167424559593,
0.003520403290167451,
0.15010762214660645,
-0.03524013236165047,
-0.04770885780453682,
-0.12302052229642868,
0.10315456241369247,
-0.10200274735689163,
0.20528100430965424,
0.007650632876902819,
0.08379316329956055,
0.09744634479284286,
0.022242719307541847,
-0.1350107342004776,
0.033932123333215714,
0.07321755588054657,
-0.10971549153327942,
0.014991046860814095,
0.15146900713443756,
-0.0553487166762352,
0.07880373299121857,
0.025656983256340027,
-0.10277531296014786,
-0.018663931638002396,
-0.025754086673259735,
-0.03140614181756973,
-0.07968080788850784,
-0.014978645369410515,
-0.0641019269824028,
0.16523194313049316,
0.22081373631954193,
-0.025210561230778694,
0.016596393659710884,
-0.08592212200164795,
0.016949733719229698,
0.04543443024158478,
0.05154825747013092,
-0.0435904935002327,
-0.20648851990699768,
0.02991086058318615,
0.020267443731427193,
0.023792775347828865,
-0.19593128561973572,
-0.08310695737600327,
0.04507557302713394,
-0.028430601581931114,
-0.05255468189716339,
0.10002361983060837,
0.02449454925954342,
0.04368523880839348,
-0.03407728672027588,
-0.09397220611572266,
-0.044574838131666183,
0.1440066695213318,
-0.16211247444152832,
-0.05421438068151474
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed. The repository name again suggests a k = 512 few-shot subsample of SQuAD, this time drawn with seed 6.
## Intended uses & limitations
More information needed
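Again undocumented; a hedged loading sketch using the lower-level auto classes (repo id from this card):

```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)
```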
## Training and evaluation data
More information needed; the metadata again lists only SQuAD.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a manual optimizer/scheduler equivalent is sketched after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
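Since warmup is reported as a ratio rather than a step count, a manual equivalent looks roughly like the sketch below. The step count assumes k = 512 examples at batch size 24 (ceil(512/24) = 22 steps per epoch), which is an inference from the repo name, not a documented fact:

```python
import torch
from transformers import AutoModelForQuestionAnswering, get_scheduler

model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

num_training_steps = 22 * 10  # assumed steps/epoch * num_epochs
optimizer = torch.optim.AdamW(
    model.parameters(), lr=3e-5, betas=(0.9, 0.999), eps=1e-8
)
lr_scheduler = get_scheduler(
    "linear",
    optimizer=optimizer,
    num_warmup_steps=int(0.1 * num_training_steps),  # warmup_ratio = 0.1
    num_training_steps=num_training_steps,
)
```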
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09938522428274155,
0.15168806910514832,
-0.0021347582805901766,
0.09179674834012985,
0.13840579986572266,
0.040038228034973145,
0.08954925835132599,
0.13527023792266846,
-0.07936917245388031,
0.0738215371966362,
0.07802405953407288,
0.05503440275788307,
0.05260271951556206,
0.1256939321756363,
-0.04679461568593979,
-0.20917989313602448,
0.012993685901165009,
-0.018971389159560204,
-0.055875420570373535,
0.09824998676776886,
0.08207226544618607,
-0.10523631423711777,
0.07976184785366058,
-0.020128881558775902,
-0.15398386120796204,
0.009787905029952526,
-0.03896987438201904,
-0.022142531350255013,
0.09446476399898529,
-0.008054855279624462,
0.09709391742944717,
0.010364314541220665,
0.14021418988704681,
-0.21125145256519318,
-0.00021655253658536822,
0.07806023210287094,
0.036192163825035095,
0.08941151201725006,
0.04805903509259224,
0.02597297541797161,
0.050449006259441376,
-0.1528567671775818,
0.09008294343948364,
0.026522168889641762,
-0.07724771648645401,
-0.08896004408597946,
-0.09250917285680771,
0.01925162971019745,
0.08041635900735855,
0.08332756161689758,
0.008315544575452805,
0.13237948715686798,
-0.09179893136024475,
0.08307988941669464,
0.1990719735622406,
-0.27358704805374146,
-0.06715110689401627,
0.04878498986363411,
0.05321338027715683,
0.06675276160240173,
-0.11906043440103531,
-0.028223278000950813,
0.029799124225974083,
0.03207790106534958,
0.09284670650959015,
-0.0191204771399498,
-0.13934628665447235,
0.0010447441600263119,
-0.13329888880252838,
-0.014296181499958038,
0.10941167175769806,
0.049291450530290604,
-0.044575467705726624,
-0.06397457420825958,
-0.07631652802228928,
-0.10771448165178299,
-0.02385650761425495,
-0.0216361116617918,
0.0530819408595562,
-0.055009689182043076,
-0.05528499558568001,
-0.03951699286699295,
-0.05933690071105957,
-0.08833868056535721,
-0.0026505375280976295,
0.11709621548652649,
0.04341864585876465,
0.02451176941394806,
-0.03649595007300377,
0.10290251672267914,
0.0012046372285112739,
-0.13278751075267792,
-0.017023157328367233,
0.00961272232234478,
-0.12296188622713089,
-0.05848250910639763,
-0.026526689529418945,
0.01190057210624218,
0.0157996267080307,
0.1469329595565796,
-0.039370737969875336,
0.08258505165576935,
0.0158075001090765,
-0.020357251167297363,
-0.011604313738644123,
0.14314980804920197,
-0.03878336772322655,
-0.053389083594083786,
0.0041526914574205875,
0.09714730083942413,
-0.0009800087427720428,
-0.004467541817575693,
-0.0738212987780571,
-0.019302653148770332,
0.0930342823266983,
0.05616084113717079,
-0.05388835817575455,
0.03284863010048866,
-0.032711442559957504,
-0.02508571185171604,
0.02327878214418888,
-0.1241484135389328,
0.036857523024082184,
0.00908144935965538,
-0.08351469039916992,
-0.038867197930812836,
0.011576812714338303,
-0.017015239223837852,
-0.020535258576273918,
0.09014511108398438,
-0.08743433654308319,
-0.005613011308014393,
-0.07832089811563492,
-0.07299406081438065,
0.015912773087620735,
-0.1433657556772232,
-0.0143772903829813,
-0.047221072018146515,
-0.2041957527399063,
-0.03900926932692528,
0.04337874427437782,
-0.07655609399080276,
-0.04157692939043045,
-0.05892046168446541,
-0.08453276753425598,
0.015225724317133427,
-0.003596886061131954,
0.16964033246040344,
-0.05956718325614929,
0.07710956037044525,
-0.014133434742689133,
0.041933268308639526,
0.010740099474787712,
0.04897545650601387,
-0.0865275114774704,
0.02302921749651432,
-0.14192073047161102,
0.07801628857851028,
-0.09689681977033615,
0.01165004726499319,
-0.12496659904718399,
-0.08828428387641907,
0.03958776965737343,
-0.028402645140886307,
0.06982622295618057,
0.14441558718681335,
-0.19367174804210663,
0.002823098096996546,
0.11539719253778458,
-0.04913922771811485,
-0.047272689640522,
0.07695362716913223,
-0.05628751218318939,
0.03114151768386364,
0.05086155980825424,
0.16753976047039032,
0.07461196929216385,
-0.14991065859794617,
-0.01855393312871456,
0.02439543977379799,
0.04946271330118179,
0.004589853808283806,
0.04470448940992355,
0.005147486925125122,
0.026986800134181976,
0.0034241920802742243,
-0.07743671536445618,
-0.02628890424966812,
-0.0902489498257637,
-0.07111549377441406,
-0.05230821669101715,
-0.0887347087264061,
0.021737579256296158,
0.012519326992332935,
0.02089383639395237,
-0.05772750824689865,
-0.10028503090143204,
0.10235355794429779,
0.1286972165107727,
-0.051333487033843994,
0.006817708257585764,
-0.06928465515375137,
0.01946377567946911,
-0.03161539509892464,
-0.03586287423968315,
-0.19561251997947693,
-0.13615739345550537,
0.0531795360147953,
-0.05267345905303955,
0.0342748761177063,
0.03463178128004074,
0.06442049890756607,
0.06118517741560936,
-0.03333136439323425,
-0.019828690215945244,
-0.07020695507526398,
-0.00368922995403409,
-0.11442001909017563,
-0.19954560697078705,
-0.06466621160507202,
-0.033697906881570816,
0.14003896713256836,
-0.20281179249286652,
-0.00022225134307518601,
-0.025178616866469383,
0.12104606628417969,
0.01645190641283989,
-0.05846288427710533,
0.009106027893722057,
0.02959083393216133,
0.006543932016938925,
-0.0997503250837326,
0.04099868983030319,
0.007272091694176197,
-0.05382265895605087,
-0.06119980290532112,
-0.10912647843360901,
-0.004822456743568182,
0.057532403618097305,
0.08100126683712006,
-0.10547660291194916,
-0.008254081010818481,
-0.05019223690032959,
-0.046147845685482025,
-0.08723043650388718,
0.010094711557030678,
0.19646117091178894,
0.032316904515028,
0.11966366320848465,
-0.06083675101399422,
-0.07113160938024521,
0.001801409525796771,
0.024568820372223854,
0.022745586931705475,
0.09070082008838654,
0.10722294449806213,
-0.10010474175214767,
0.08878139406442642,
0.07088890671730042,
-0.046588484197854996,
0.12016676366329193,
-0.04513711854815483,
-0.08237317204475403,
-0.019031140953302383,
0.006781751289963722,
-0.03383442014455795,
0.14097461104393005,
-0.08395031094551086,
0.01151549257338047,
0.03333301097154617,
0.024572422727942467,
0.017662718892097473,
-0.16929206252098083,
-0.0002604943874757737,
0.010950617492198944,
-0.06384873390197754,
-0.039633575826883316,
-0.02471771091222763,
0.03600483387708664,
0.09400234371423721,
0.026122797280550003,
-0.04403270035982132,
0.016605861485004425,
-0.012056460604071617,
-0.06961514055728912,
0.19269467890262604,
-0.10802953690290451,
-0.0897054597735405,
-0.10843979567289352,
0.02724086493253708,
-0.04409925639629364,
-0.037130460143089294,
0.004492780193686485,
-0.08460129052400589,
-0.055311329662799835,
-0.08820530027151108,
-0.02455066330730915,
-0.016736451536417007,
-0.003286543535068631,
0.02475030906498432,
-0.015203517861664295,
0.07180484384298325,
-0.13221053779125214,
0.004830381833016872,
-0.03031158447265625,
-0.10424540936946869,
0.008153185248374939,
0.06350855529308319,
0.08522236347198486,
0.09915366768836975,
-0.013326100073754787,
0.013118944130837917,
-0.02734486386179924,
0.2391972541809082,
-0.055902622640132904,
0.016087766736745834,
0.08294591307640076,
-0.006146969273686409,
0.06063070520758629,
0.1439129263162613,
0.031100312247872353,
-0.10475089401006699,
0.026086457073688507,
0.08286333084106445,
-0.010861990042030811,
-0.25006091594696045,
-0.028228970244526863,
-0.02156367339193821,
-0.07146900147199631,
0.08845669776201248,
0.03966891020536423,
-0.046198442578315735,
0.038613177835941315,
0.010124455206096172,
0.012797706760466099,
-0.052251651883125305,
0.07355183362960815,
0.08413912355899811,
0.04410748928785324,
0.09454724937677383,
-0.02225085347890854,
-0.011430047452449799,
0.06435053795576096,
0.017169402912259102,
0.2717325985431671,
-0.02815925143659115,
0.12521572411060333,
0.02618541568517685,
0.14752280712127686,
-0.030552774667739868,
0.04871305823326111,
0.016296297311782837,
0.004788742866367102,
-0.006069813389331102,
-0.052277740091085434,
-0.02855929546058178,
0.010027180425822735,
-0.034059636294841766,
0.03234920650720596,
-0.07284247875213623,
0.04580772668123245,
0.012853845953941345,
0.30068424344062805,
0.045853205025196075,
-0.28219088912010193,
-0.06024933606386185,
-0.0025336588732898235,
-0.04631829634308815,
-0.07485853135585785,
0.002852414269000292,
0.14433272182941437,
-0.1260746419429779,
0.038977980613708496,
-0.05454239994287491,
0.08887079358100891,
-0.04591735452413559,
0.0018730999436229467,
0.06399189680814743,
0.14508147537708282,
-0.010199591517448425,
0.06910015642642975,
-0.1905580461025238,
0.22683854401111603,
0.028751302510499954,
0.11098826676607132,
-0.06114780902862549,
0.013968859799206257,
0.009819988161325455,
0.026689475402235985,
0.10931307077407837,
0.000279416999546811,
-0.02502025105059147,
-0.1692919135093689,
-0.11512584239244461,
0.05893277749419212,
0.11739374697208405,
-0.01620447263121605,
0.09595130383968353,
-0.04046744480729103,
-0.0026295578572899103,
0.03666966035962105,
-0.07520927488803864,
-0.1294252872467041,
-0.08600933849811554,
0.0021255428437143564,
0.017171606421470642,
-0.03844057396054268,
-0.04833027720451355,
-0.0918874442577362,
-0.02268231473863125,
0.1353587508201599,
-0.0024773061741143465,
-0.04305308684706688,
-0.1348579227924347,
0.05442480742931366,
0.1430695652961731,
-0.05772765353322029,
0.02023024670779705,
0.004915163852274418,
0.10131708532571793,
0.05132555961608887,
-0.08225134760141373,
0.052581675350666046,
-0.06752932816743851,
-0.16074632108211517,
-0.061404433101415634,
0.11665334552526474,
0.07972503453493118,
0.053514670580625534,
-0.0014943518908694386,
0.03213369846343994,
0.001289775362238288,
-0.08649939298629761,
0.00903804786503315,
0.054943010210990906,
0.09074712544679642,
0.05042213574051857,
-0.09421181678771973,
0.04609581083059311,
-0.03501271829009056,
-0.00304826139472425,
0.12620621919631958,
0.2152780443429947,
-0.08515891432762146,
0.08792301267385483,
0.06954451650381088,
-0.08049467951059341,
-0.17627304792404175,
0.0716429352760315,
0.13302084803581238,
0.015359985642135143,
0.036931052803993225,
-0.20480671525001526,
0.13692434132099152,
0.1124112457036972,
-0.01547236554324627,
0.05504253879189491,
-0.2961054742336273,
-0.12314621359109879,
0.06958504766225815,
0.10476340353488922,
0.04194345697760582,
-0.1283199042081833,
-0.027568578720092773,
-0.009508021175861359,
-0.1476355642080307,
0.14472666382789612,
-0.07085616886615753,
0.1216941773891449,
-0.008547171019017696,
0.12389136105775833,
0.02255210280418396,
-0.041622281074523926,
0.1305641531944275,
0.07618188858032227,
0.08740385621786118,
-0.039173781871795654,
-0.0029837151523679495,
0.04907465726137161,
-0.0666172057390213,
0.04832148551940918,
-0.04088686406612396,
0.06594536453485489,
-0.1651737540960312,
0.0007922607474029064,
-0.08416906744241714,
0.043731510639190674,
-0.049244146794080734,
-0.04674053564667702,
-0.027702102437615395,
0.04746639356017113,
0.0664224699139595,
-0.03447785601019859,
0.02934839390218258,
0.021747056394815445,
0.057167913764715195,
0.09188942611217499,
0.09076666831970215,
-0.017481649294495583,
-0.10904301702976227,
0.010563774965703487,
-0.006271330174058676,
0.05289343371987343,
-0.1012546494603157,
0.014985881745815277,
0.1352483332157135,
0.05824211612343788,
0.12690655887126923,
0.026593606919050217,
-0.033523425459861755,
-0.015150223858654499,
0.01628297194838524,
-0.12443606555461884,
-0.11930177360773087,
0.03828930854797363,
-0.04322982206940651,
-0.15702252089977264,
0.0069002388045191765,
0.10056212544441223,
-0.03776795044541359,
-0.01313331164419651,
-0.010646238923072815,
0.02513076923787594,
-0.012361006811261177,
0.20260387659072876,
0.04092046245932579,
0.0632404237985611,
-0.1024128869175911,
0.12512198090553284,
0.057171329855918884,
-0.04693707451224327,
0.056276481598615646,
0.06795741617679596,
-0.0931553840637207,
-0.005801820196211338,
0.1099809855222702,
0.171928271651268,
-0.04423101246356964,
-0.01858178898692131,
-0.07184908539056778,
-0.07499945163726807,
0.05754516273736954,
0.1616055965423584,
0.049400679767131805,
-0.007459884975105524,
-0.04294200241565704,
0.026073431596159935,
-0.12602102756500244,
0.07340093702077866,
0.047346193343400955,
0.0684485211968422,
-0.10654241591691971,
0.10924242436885834,
-0.009316805750131607,
0.03539629280567169,
-0.015298251062631607,
0.029250791296362877,
-0.09745907038450241,
-0.02607540972530842,
-0.12024848908185959,
0.01929716020822525,
-0.010651089251041412,
0.007195597980171442,
-0.011037543416023254,
-0.0543624684214592,
-0.03718359395861626,
0.026234088465571404,
-0.0814032256603241,
-0.05538737028837204,
0.016962025314569473,
0.04532446712255478,
-0.15466691553592682,
-0.012347258627414703,
0.023776236921548843,
-0.0929659977555275,
0.076107919216156,
0.06474912911653519,
0.017839794978499413,
0.028248490765690804,
-0.10804776847362518,
-0.045268166810274124,
0.0138663649559021,
0.027616946026682854,
0.08381272852420807,
-0.08714504539966583,
-0.015456786379218102,
-0.03400107100605965,
0.04992689564824104,
0.014701693318784237,
0.08885392546653748,
-0.11548138409852982,
-0.003274933435022831,
-0.05802597478032112,
-0.0343547984957695,
-0.05964023619890213,
0.034833937883377075,
0.11580678075551987,
0.03572980314493179,
0.16856835782527924,
-0.0671905055642128,
0.039246175438165665,
-0.19432976841926575,
-0.03379394859075546,
0.0015888343332335353,
-0.04040972888469696,
-0.08457359671592712,
-0.042152080684900284,
0.09240078926086426,
-0.0520687997341156,
0.09091967344284058,
-0.007872465997934341,
0.08901319652795792,
0.0314733162522316,
-0.005271920934319496,
-0.05739486217498779,
0.0022485433146357536,
0.1459052562713623,
0.0585494190454483,
-0.018986329436302185,
0.10318879038095474,
-0.006756998598575592,
0.054467376321554184,
0.04328714683651924,
0.22681157290935516,
0.14808714389801025,
-0.02255120314657688,
0.06406299024820328,
0.07076523452997208,
-0.12202142924070358,
-0.12486843019723892,
0.14115899801254272,
-0.04642342031002045,
0.12185068428516388,
-0.055195361375808716,
0.22073230147361755,
0.023527728393673897,
-0.17627713084220886,
0.05214143171906471,
-0.056326884776353836,
-0.12037050724029541,
-0.11919742077589035,
-0.014134400524199009,
-0.08058907091617584,
-0.09902966022491455,
0.026106800884008408,
-0.12193460017442703,
0.0653689056634903,
0.11941827088594437,
0.015966804698109627,
0.02114577777683735,
0.15480218827724457,
-0.04064125940203667,
0.018504973500967026,
0.060510702431201935,
0.023531032726168633,
-0.005761665292084217,
-0.061060670763254166,
-0.0636550709605217,
0.05087986961007118,
0.034540705382823944,
0.08512594550848007,
-0.046462204307317734,
0.016499685123562813,
0.03185965493321419,
-0.013893328607082367,
-0.07373128086328506,
0.012467521242797375,
0.026194192469120026,
0.038623932749032974,
0.056102313101291656,
0.053484901785850525,
0.017363475635647774,
-0.035657040774822235,
0.26789718866348267,
-0.07263273000717163,
-0.08322165161371231,
-0.1303585022687912,
0.1991032212972641,
0.01718057505786419,
-0.018801914528012276,
0.0751943364739418,
-0.10392244160175323,
-0.022327326238155365,
0.16636428236961365,
0.12328138202428818,
-0.10126281529664993,
-0.029328733682632446,
-0.016277939081192017,
-0.011747827753424644,
-0.03767300397157669,
0.1173161044716835,
0.09344495087862015,
0.008376006036996841,
-0.07716566324234009,
-0.026714326813817024,
-0.016912654042243958,
-0.04520859569311142,
-0.06330473721027374,
0.0364115871489048,
0.015638668090105057,
0.001495819422416389,
-0.03835752233862877,
0.05334778130054474,
-0.011273865588009357,
-0.24205534160137177,
0.030600793659687042,
-0.15696588158607483,
-0.1814410537481308,
-0.03167017176747322,
0.06432055681943893,
-0.005671018268913031,
0.03935149684548378,
-0.018766529858112335,
0.00400667916983366,
0.14736652374267578,
-0.03501074016094208,
-0.04794243350625038,
-0.12371645867824554,
0.10232842713594437,
-0.10275804996490479,
0.2060110867023468,
0.007671484723687172,
0.08277203142642975,
0.09744898974895477,
0.022226519882678986,
-0.13576801121234894,
0.03342512995004654,
0.0736277848482132,
-0.11005357652902603,
0.014885645359754562,
0.15268252789974213,
-0.055218618363142014,
0.07900401204824448,
0.025472311303019524,
-0.10438857972621918,
-0.01833130046725273,
-0.026669325307011604,
-0.03079959936439991,
-0.08010972291231155,
-0.013058532029390335,
-0.06385594606399536,
0.16550536453723907,
0.22092154622077942,
-0.025023620575666428,
0.016268664970993996,
-0.0865335762500763,
0.016708020120859146,
0.04431403800845146,
0.05276259407401085,
-0.04330530762672424,
-0.20627984404563904,
0.0302711334079504,
0.020473040640354156,
0.02373982034623623,
-0.19635483622550964,
-0.08277904987335205,
0.04557685926556587,
-0.02935543656349182,
-0.052675873041152954,
0.09940487891435623,
0.02463161200284958,
0.043011657893657684,
-0.034219712018966675,
-0.09606406837701797,
-0.04446857050061226,
0.14458473026752472,
-0.16224467754364014,
-0.05371061712503433
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
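For a lower-level alternative to the `pipeline` helper, the checkpoint can be queried directly with `AutoTokenizer`/`AutoModelForQuestionAnswering`. A minimal sketch follows; the question and context strings are placeholders, not from the original card.

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What dataset was used for fine-tuning?"  # placeholder example
context = "The checkpoint was fine-tuned on the SQuAD dataset."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Decode the span between the most likely start and end token positions.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```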
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
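Outside the `Trainer` API, the same optimizer/scheduler pair can be built by hand. In this sketch the total step count is a placeholder, since it depends on the (unstated) size of the few-shot training subset.

```python
from torch.optim import AdamW
from transformers import AutoModelForQuestionAnswering, get_linear_schedule_with_warmup

model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

num_training_steps = 220  # placeholder: depends on training-set size and epochs
optimizer = AdamW(model.parameters(), lr=3e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * num_training_steps),  # warmup_ratio: 0.1
    num_training_steps=num_training_steps,
)
```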
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10028117150068283,
0.15329673886299133,
-0.0021560241002589464,
0.09219719469547272,
0.13824830949306488,
0.04064027592539787,
0.08915290236473083,
0.13539980351924896,
-0.07873748242855072,
0.07381702959537506,
0.07827338576316833,
0.05388198047876358,
0.052346039563417435,
0.12504296004772186,
-0.04698167368769646,
-0.20868314802646637,
0.012573715299367905,
-0.018368292599916458,
-0.05348511040210724,
0.09764616191387177,
0.08202508836984634,
-0.10530125349760056,
0.07958079874515533,
-0.020244332030415535,
-0.15280106663703918,
0.009678567759692669,
-0.03852682560682297,
-0.02263079397380352,
0.0947282463312149,
-0.007189096882939339,
0.09693874418735504,
0.009595907293260098,
0.14019936323165894,
-0.21179533004760742,
-0.0005181224551051855,
0.07833365350961685,
0.03594488650560379,
0.0894283577799797,
0.04700343310832977,
0.02720264531672001,
0.049296505749225616,
-0.15323470532894135,
0.09021001309156418,
0.026139413937926292,
-0.0770123302936554,
-0.08940635621547699,
-0.09195361286401749,
0.02064315229654312,
0.08004279434680939,
0.08272289484739304,
0.009268229827284813,
0.13383741676807404,
-0.09031765908002853,
0.08368551731109619,
0.1994560807943344,
-0.2733742594718933,
-0.066207155585289,
0.04725056514143944,
0.0534004345536232,
0.06820697337388992,
-0.11807696521282196,
-0.028296874836087227,
0.029862796887755394,
0.03179096058011055,
0.09290634095668793,
-0.019905077293515205,
-0.1407725214958191,
0.0009251297451555729,
-0.13299815356731415,
-0.014809378422796726,
0.10975358635187149,
0.049443092197179794,
-0.044938553124666214,
-0.06386221200227737,
-0.07743526995182037,
-0.10966514050960541,
-0.02415921352803707,
-0.022012969478964806,
0.0526164285838604,
-0.05461534857749939,
-0.05351817235350609,
-0.03945101052522659,
-0.058802470564842224,
-0.08722677081823349,
-0.0021468570921570063,
0.11730475723743439,
0.043704185634851456,
0.024433432146906853,
-0.03561810031533241,
0.10249163210391998,
-0.0009780015097931027,
-0.13260358572006226,
-0.016658853739500046,
0.010296420194208622,
-0.12303617596626282,
-0.058615997433662415,
-0.0267128087580204,
0.010968400165438652,
0.015332563780248165,
0.14924868941307068,
-0.039059799164533615,
0.08236172050237656,
0.016219612210989,
-0.01967667229473591,
-0.011378525756299496,
0.14425645768642426,
-0.03952524811029434,
-0.0546964630484581,
0.005188326351344585,
0.09711414575576782,
-0.000017554817532072775,
-0.0053025949746370316,
-0.07466762512922287,
-0.02024814486503601,
0.09385033696889877,
0.055545300245285034,
-0.05386853590607643,
0.032165758311748505,
-0.03263106197118759,
-0.025216907262802124,
0.024444954469799995,
-0.12409313023090363,
0.037046194076538086,
0.008844424970448017,
-0.0831158235669136,
-0.03775439038872719,
0.01270358171314001,
-0.01640496961772442,
-0.020843492820858955,
0.08893941342830658,
-0.08660344779491425,
-0.005202861037105322,
-0.07775114476680756,
-0.07312465459108353,
0.016305092722177505,
-0.14331340789794922,
-0.013790237717330456,
-0.048460666090250015,
-0.20425398647785187,
-0.038819484412670135,
0.043183062225580215,
-0.07577488571405411,
-0.04131486639380455,
-0.057702142745256424,
-0.08305754512548447,
0.015006831847131252,
-0.004182631149888039,
0.1669756919145584,
-0.05975883826613426,
0.07670599967241287,
-0.013911894522607327,
0.04160653427243233,
0.009268560446798801,
0.049187660217285156,
-0.08591338247060776,
0.022993532940745354,
-0.14161500334739685,
0.0775144025683403,
-0.09616193920373917,
0.011913576163351536,
-0.12364639341831207,
-0.0879812091588974,
0.0411226712167263,
-0.027716554701328278,
0.0705888494849205,
0.143466055393219,
-0.19343627989292145,
0.003137460444122553,
0.11435277760028839,
-0.04871201515197754,
-0.04728696867823601,
0.07896067947149277,
-0.056147877126932144,
0.030430912971496582,
0.051343344151973724,
0.16687534749507904,
0.07617365568876266,
-0.14969952404499054,
-0.017677273601293564,
0.02569238469004631,
0.05020974949002266,
0.003252306254580617,
0.044709283858537674,
0.005383246578276157,
0.02747434750199318,
0.003621090203523636,
-0.07624232023954391,
-0.02569280005991459,
-0.09019536525011063,
-0.07141811400651932,
-0.052503831684589386,
-0.08839589357376099,
0.021208852529525757,
0.012933510355651379,
0.020904334262013435,
-0.057657208293676376,
-0.10097494721412659,
0.10343632847070694,
0.1286713033914566,
-0.05168536305427551,
0.007140558212995529,
-0.06894344836473465,
0.020018458366394043,
-0.030845047906041145,
-0.03612290695309639,
-0.19491958618164062,
-0.13421951234340668,
0.05337816849350929,
-0.052449509501457214,
0.033387187868356705,
0.03529121354222298,
0.06359340250492096,
0.061999812722206116,
-0.03264870494604111,
-0.019158681854605675,
-0.06939636170864105,
-0.003539129626005888,
-0.1157764121890068,
-0.19837547838687897,
-0.06478932499885559,
-0.03312431648373604,
0.1396826207637787,
-0.20406462252140045,
0.0002522250288166106,
-0.025902530178427696,
0.12047003209590912,
0.016073614358901978,
-0.05839235708117485,
0.008950606919825077,
0.030304094776511192,
0.006079723127186298,
-0.09994106739759445,
0.0410182848572731,
0.00778299430385232,
-0.054536204785108566,
-0.062458332628011703,
-0.10961038619279861,
-0.0036072740331292152,
0.05698411539196968,
0.07998508960008621,
-0.10527272522449493,
-0.008611523546278477,
-0.0496932715177536,
-0.0462496355175972,
-0.08647964894771576,
0.009022006765007973,
0.19701358675956726,
0.03245050087571144,
0.12099271267652512,
-0.060584817081689835,
-0.07062938809394836,
0.001752579351887107,
0.024458713829517365,
0.02337873913347721,
0.08989259600639343,
0.10679330676794052,
-0.09742508828639984,
0.08787107467651367,
0.06991346925497055,
-0.047734592109918594,
0.11945881694555283,
-0.044661734253168106,
-0.08209212869405746,
-0.019268617033958435,
0.006754773668944836,
-0.0336361862719059,
0.14071151614189148,
-0.08615433424711227,
0.010738802142441273,
0.03363283351063728,
0.024031788110733032,
0.01806481182575226,
-0.1694173812866211,
-0.00033041168353520334,
0.011845680885016918,
-0.0632660984992981,
-0.03832162171602249,
-0.025552939623594284,
0.035619039088487625,
0.09337012469768524,
0.02544376067817211,
-0.04456929489970207,
0.017211901023983955,
-0.011639120057225227,
-0.07012461125850677,
0.19183076918125153,
-0.10789230465888977,
-0.0909900888800621,
-0.11019173264503479,
0.027538161724805832,
-0.04402429983019829,
-0.03713959455490112,
0.005082308314740658,
-0.08265335112810135,
-0.05524492636322975,
-0.0887114629149437,
-0.026813926175236702,
-0.016628526151180267,
-0.0030498120468109846,
0.025279901921749115,
-0.015821188688278198,
0.07260633260011673,
-0.13209235668182373,
0.004615850280970335,
-0.0294859167188406,
-0.10320545732975006,
0.008536090143024921,
0.06360217183828354,
0.085927315056324,
0.09957833588123322,
-0.014110758900642395,
0.012408630922436714,
-0.026963602751493454,
0.2386421263217926,
-0.05537887290120125,
0.016301125288009644,
0.08328123390674591,
-0.007512744516134262,
0.06086106225848198,
0.14322243630886078,
0.031168723478913307,
-0.10482848435640335,
0.02550971508026123,
0.08172166347503662,
-0.011129224672913551,
-0.24938127398490906,
-0.02797641046345234,
-0.021549025550484657,
-0.07064000517129898,
0.08876564353704453,
0.039677415043115616,
-0.045410048216581345,
0.038820020854473114,
0.010008003562688828,
0.014970386400818825,
-0.053514085710048676,
0.07320529967546463,
0.0845886841416359,
0.04363512620329857,
0.09436329454183578,
-0.021685291081666946,
-0.011254597455263138,
0.06474141776561737,
0.016286907717585564,
0.2695923447608948,
-0.028186511248350143,
0.1263265162706375,
0.02495095692574978,
0.1483798623085022,
-0.030392853543162346,
0.048852428793907166,
0.01616777293384075,
0.00450565991923213,
-0.0060801198706030846,
-0.05232205241918564,
-0.030004821717739105,
0.010831319727003574,
-0.03439631313085556,
0.03270137310028076,
-0.07314163446426392,
0.04664445295929909,
0.0124209588393569,
0.30032235383987427,
0.045664139091968536,
-0.2824093997478485,
-0.06054146587848663,
-0.0030402233824133873,
-0.04651014506816864,
-0.07548478245735168,
0.0030634840950369835,
0.14565089344978333,
-0.12558172643184662,
0.038244862109422684,
-0.053933050483465195,
0.08895367383956909,
-0.04728405550122261,
0.0021980523597449064,
0.06359908729791641,
0.14480425417423248,
-0.00991735514253378,
0.06928113102912903,
-0.19109317660331726,
0.22526566684246063,
0.029000815004110336,
0.11115089803934097,
-0.06130757927894592,
0.014566400088369846,
0.009526216425001621,
0.028328165411949158,
0.10848471522331238,
0.0007627462618984282,
-0.02433038502931595,
-0.17111749947071075,
-0.11577665060758591,
0.05883128196001053,
0.11602713912725449,
-0.014534326270222664,
0.09581562876701355,
-0.04054475203156471,
-0.002504236064851284,
0.03684775158762932,
-0.0744948610663414,
-0.12925393879413605,
-0.08622860163450241,
0.0020792463328689337,
0.018665283918380737,
-0.03822871670126915,
-0.04828837141394615,
-0.0915767177939415,
-0.024661419913172722,
0.1352119892835617,
-0.002293797442689538,
-0.04310174286365509,
-0.1348491907119751,
0.055153291672468185,
0.14237435162067413,
-0.05798926576972008,
0.020229792222380638,
0.005147518590092659,
0.10111905634403229,
0.05105477198958397,
-0.08123182505369186,
0.05257119610905647,
-0.06739558279514313,
-0.16028733551502228,
-0.061947256326675415,
0.11573214083909988,
0.0795358344912529,
0.05354577675461769,
-0.0014597887638956308,
0.03178585693240166,
0.0012527162907645106,
-0.08626247197389603,
0.00796129833906889,
0.05626186355948448,
0.08971979469060898,
0.050210967659950256,
-0.0940849557518959,
0.04702077433466911,
-0.034545328468084335,
-0.0031532039865851402,
0.1272302269935608,
0.21526572108268738,
-0.08537346124649048,
0.08692321926355362,
0.07027267664670944,
-0.08051127195358276,
-0.17559297382831573,
0.07091120630502701,
0.13247965276241302,
0.015584648586809635,
0.0371498242020607,
-0.20391122996807098,
0.136969655752182,
0.1139543280005455,
-0.014957955107092857,
0.05545540899038315,
-0.29604053497314453,
-0.12338246405124664,
0.07052187621593475,
0.10437630861997604,
0.04426678642630577,
-0.12917520105838776,
-0.02762432023882866,
-0.010013102553784847,
-0.14917288720607758,
0.1429995894432068,
-0.0698789581656456,
0.12159843742847443,
-0.00868162326514721,
0.12191124260425568,
0.022696780040860176,
-0.04175250977277756,
0.13116209208965302,
0.07688677310943604,
0.08727474510669708,
-0.03922099992632866,
-0.0029463907703757286,
0.04844208061695099,
-0.06677287071943283,
0.04894509166479111,
-0.04052712395787239,
0.06572218239307404,
-0.1676861196756363,
0.0005905032157897949,
-0.08271430432796478,
0.04391036182641983,
-0.048992592841386795,
-0.04679277911782265,
-0.027606016024947166,
0.04621401056647301,
0.06639743596315384,
-0.034357231110334396,
0.030337199568748474,
0.02220434695482254,
0.056269872933626175,
0.09375585615634918,
0.08893147855997086,
-0.020740913227200508,
-0.10936880856752396,
0.010274206288158894,
-0.006401540711522102,
0.05329270660877228,
-0.10068738460540771,
0.015501060523092747,
0.135713130235672,
0.058527931571006775,
0.12726901471614838,
0.02535530924797058,
-0.032941997051239014,
-0.014826207421720028,
0.015712641179561615,
-0.12420212477445602,
-0.11820757389068604,
0.03759133815765381,
-0.043865397572517395,
-0.15656884014606476,
0.0059899259358644485,
0.10145171731710434,
-0.0387658067047596,
-0.012662054039537907,
-0.011167104355990887,
0.023974565789103508,
-0.01221571583300829,
0.20243385434150696,
0.04141736030578613,
0.06312688440084457,
-0.1021478921175003,
0.12479475140571594,
0.05740296468138695,
-0.04609167203307152,
0.05659009516239166,
0.06766729056835175,
-0.09320720285177231,
-0.005876688752323389,
0.10974931716918945,
0.17176960408687592,
-0.045178450644016266,
-0.01967000961303711,
-0.07289957255125046,
-0.07342709600925446,
0.05738673731684685,
0.1606569141149521,
0.04982207715511322,
-0.007684455253183842,
-0.043259814381599426,
0.025422172620892525,
-0.12594760954380035,
0.07292136549949646,
0.0475916713476181,
0.06870722770690918,
-0.10693207383155823,
0.11137912422418594,
-0.00882582925260067,
0.03531147539615631,
-0.015198537148535252,
0.029056143015623093,
-0.09721287339925766,
-0.025892850011587143,
-0.12237676978111267,
0.019620757550001144,
-0.010572507977485657,
0.0075583625584840775,
-0.011006748303771019,
-0.053981196135282516,
-0.037263769656419754,
0.026141898706555367,
-0.08056361973285675,
-0.055143456906080246,
0.017253180965781212,
0.04490785300731659,
-0.15389502048492432,
-0.012578071095049381,
0.023251237347722054,
-0.09298168867826462,
0.07633990049362183,
0.06410709768533707,
0.017430292442440987,
0.027486903592944145,
-0.10790402442216873,
-0.045209575444459915,
0.013941359706223011,
0.0283722635358572,
0.08372607082128525,
-0.08536416292190552,
-0.014729185961186886,
-0.03359116613864899,
0.04979226365685463,
0.014435814693570137,
0.08876383304595947,
-0.1157342866063118,
-0.00378181179985404,
-0.05764156952500343,
-0.034191083163022995,
-0.060228362679481506,
0.034324001520872116,
0.115529865026474,
0.035142119973897934,
0.16834671795368195,
-0.06728014349937439,
0.03919981047511101,
-0.19455863535404205,
-0.03401396796107292,
0.0018193732248619199,
-0.040015459060668945,
-0.0841887816786766,
-0.04268407076597214,
0.0921683982014656,
-0.051363326609134674,
0.09280665963888168,
-0.008341951295733452,
0.08965487778186798,
0.0310151819139719,
-0.006469620391726494,
-0.057378895580768585,
0.0014906905125826597,
0.1468493789434433,
0.05933795124292374,
-0.018938224762678146,
0.10204184800386429,
-0.006624717731028795,
0.05601010099053383,
0.04260789602994919,
0.22423230111598969,
0.14809708297252655,
-0.02303929254412651,
0.06442709267139435,
0.07003801316022873,
-0.12176145613193512,
-0.12526872754096985,
0.1406463086605072,
-0.04659853130578995,
0.1225704625248909,
-0.05495253950357437,
0.2206837236881256,
0.023636847734451294,
-0.17590121924877167,
0.05167112499475479,
-0.05560486018657684,
-0.12050708383321762,
-0.11903563886880875,
-0.013420860283076763,
-0.08064254373311996,
-0.0992296040058136,
0.025654815137386322,
-0.1214645728468895,
0.06528204679489136,
0.11984362453222275,
0.015700500458478928,
0.02079075388610363,
0.15368399024009705,
-0.04073386266827583,
0.018794303759932518,
0.06043432652950287,
0.02359757199883461,
-0.0052262237295508385,
-0.06210031732916832,
-0.06482000648975372,
0.05111996456980705,
0.03452300280332565,
0.08570674061775208,
-0.046713244169950485,
0.0184150543063879,
0.032500531524419785,
-0.013470916077494621,
-0.07435096055269241,
0.012428872287273407,
0.025378772988915443,
0.03852894902229309,
0.05478377267718315,
0.0538606233894825,
0.01768406294286251,
-0.03578101471066475,
0.2659859359264374,
-0.07204615324735641,
-0.08345096558332443,
-0.13044589757919312,
0.19750085473060608,
0.017734555527567863,
-0.01853993721306324,
0.07519116252660751,
-0.1041049137711525,
-0.02147100307047367,
0.16665545105934143,
0.12178701162338257,
-0.10164302587509155,
-0.029400737956166267,
-0.015788499265909195,
-0.0119381258264184,
-0.03863513842225075,
0.11751476675271988,
0.09361658245325089,
0.006925847847014666,
-0.07613220810890198,
-0.027465179562568665,
-0.01703525334596634,
-0.04529120773077011,
-0.06438031792640686,
0.03534063696861267,
0.015822911635041237,
0.0024155795108526945,
-0.038066036999225616,
0.052607372403144836,
-0.009956558234989643,
-0.2427821159362793,
0.031114351004362106,
-0.15693651139736176,
-0.18134602904319763,
-0.031196830794215202,
0.06499140709638596,
-0.005172485485672951,
0.03865857049822807,
-0.018707960844039917,
0.004496993962675333,
0.14872963726520538,
-0.03547303378582001,
-0.048483967781066895,
-0.12265291810035706,
0.10124168545007706,
-0.10145994275808334,
0.20582392811775208,
0.007786461617797613,
0.08335229754447937,
0.09724893420934677,
0.0227223951369524,
-0.13498175144195557,
0.033740364015102386,
0.07329615205526352,
-0.1092020571231842,
0.014541925862431526,
0.15142755210399628,
-0.054976049810647964,
0.07903369516134262,
0.02582641690969467,
-0.10418771952390671,
-0.01918276585638523,
-0.027530299499630928,
-0.03101971745491028,
-0.07985620200634003,
-0.014598403126001358,
-0.06329457461833954,
0.1656971275806427,
0.2200639396905899,
-0.024936994537711143,
0.015695421025156975,
-0.08639485388994217,
0.0164747703820467,
0.04487403482198715,
0.05281610041856766,
-0.04321829229593277,
-0.20636217296123505,
0.03100254200398922,
0.019740412011742592,
0.02405431494116783,
-0.19539874792099,
-0.08332385122776031,
0.04547341540455818,
-0.029871299862861633,
-0.05261373147368431,
0.09958065301179886,
0.024577848613262177,
0.04321998730301857,
-0.03421556577086449,
-0.09655275195837021,
-0.044849976897239685,
0.14469116926193237,
-0.16229011118412018,
-0.05403633415699005
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
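The "k-64" in the model name suggests the fine-tuning set was a 64-example subset of SQuAD. The exact subset-selection procedure is not documented here, so the following is only a hypothetical sketch of drawing such a subset; the use of `seed=0` mirrors the "seed-0" suffix but is an assumption.

```python
from datasets import load_dataset

# Hypothetical: draw a 64-example few-shot subset from the SQuAD train split.
squad = load_dataset("squad", split="train")
few_shot = squad.shuffle(seed=0).select(range(64))
print(few_shot)
```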
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
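Unlike the k-512 cards above, this run specifies a fixed 200-step budget rather than an epoch count. A hedged sketch of the corresponding `TrainingArguments` (the `output_dir` is illustrative; the optimizer settings listed above match the Trainer's AdamW defaults):

```python
from transformers import TrainingArguments

# Sketch only: a fixed step budget (max_steps) replaces num_train_epochs.
args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0",  # illustrative
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```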
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10189153999090195,
0.13504299521446228,
-0.0019977872725576162,
0.092535100877285,
0.13582339882850647,
0.03338521718978882,
0.08243459463119507,
0.14052695035934448,
-0.07978494465351105,
0.054188720881938934,
0.07432279735803604,
0.06005602702498436,
0.04767511039972305,
0.12810060381889343,
-0.04354850575327873,
-0.2144199162721634,
0.0035577542148530483,
-0.013320364989340305,
-0.0646466463804245,
0.10329626500606537,
0.08773693442344666,
-0.11131848394870758,
0.0696721225976944,
-0.021798603236675262,
-0.15305489301681519,
0.009339464828372002,
-0.032448191195726395,
-0.025147559121251106,
0.10533327609300613,
-0.007192350458353758,
0.09305493533611298,
0.014558536931872368,
0.14170588552951813,
-0.2142014503479004,
0.000604775152169168,
0.07743365317583084,
0.04576592147350311,
0.0903187170624733,
0.035598836839199066,
0.018280139192938805,
0.029011191800236702,
-0.15158243477344513,
0.09291800111532211,
0.026705080643296242,
-0.0796455666422844,
-0.1334059238433838,
-0.0902169868350029,
0.025591127574443817,
0.08194795995950699,
0.08134644478559494,
0.008052698336541653,
0.13144001364707947,
-0.09840339422225952,
0.08229287713766098,
0.19939954578876495,
-0.27965694665908813,
-0.06509329378604889,
0.04530337080359459,
0.05044010654091835,
0.07190346717834473,
-0.11905530840158463,
-0.01827538013458252,
0.02089444361627102,
0.031213004142045975,
0.11134171485900879,
-0.019905945286154747,
-0.1307142972946167,
0.008764047175645828,
-0.12677592039108276,
-0.015802737325429916,
0.10770737379789352,
0.04088180512189865,
-0.04683304205536842,
-0.07874980568885803,
-0.06635770201683044,
-0.09707817435264587,
-0.029156947508454323,
-0.010685513727366924,
0.05243480205535889,
-0.05559193715453148,
-0.06772966682910919,
-0.04775116965174675,
-0.056918878108263016,
-0.0904383435845375,
0.0018848663894459605,
0.12590967118740082,
0.03984014317393303,
0.019476905465126038,
-0.03243274241685867,
0.11689813435077667,
0.0009065406629815698,
-0.12986725568771362,
-0.012054502964019775,
0.004305437207221985,
-0.11961483210325241,
-0.05124317854642868,
-0.031675148755311966,
0.01648375205695629,
0.013656001538038254,
0.15007716417312622,
-0.03769778832793236,
0.07556252926588058,
0.013337509706616402,
-0.019356241449713707,
-0.020352501422166824,
0.16041700541973114,
-0.033384378999471664,
-0.04668606072664261,
0.001934809610247612,
0.09760890901088715,
-0.0010013154242187738,
-0.00678316131234169,
-0.08092903345823288,
-0.01728692092001438,
0.07771067321300507,
0.06558473408222198,
-0.054884329438209534,
0.038451943546533585,
-0.041725389659404755,
-0.025549771264195442,
0.022002901881933212,
-0.12835177779197693,
0.03697049617767334,
0.006764869671314955,
-0.0815819725394249,
-0.05709060654044151,
0.009813621640205383,
-0.016444575041532516,
-0.027351465076208115,
0.07704159617424011,
-0.07190260291099548,
-0.0063376761972904205,
-0.08776075392961502,
-0.07988560199737549,
0.0020619379356503487,
-0.13753435015678406,
-0.013082184828817844,
-0.04742320254445076,
-0.19811074435710907,
-0.033625528216362,
0.0507870577275753,
-0.07669835537672043,
-0.04299544543027878,
-0.05029236897826195,
-0.07798776775598526,
0.010684902779757977,
-0.007429386489093304,
0.18781538307666779,
-0.058527275919914246,
0.08191829174757004,
-0.016386577859520912,
0.04551661014556885,
0.021931925788521767,
0.05172650143504143,
-0.08196733146905899,
0.02685321867465973,
-0.14720888435840607,
0.08746979385614395,
-0.10409978032112122,
0.012538476847112179,
-0.12705522775650024,
-0.0859798938035965,
0.05697975680232048,
-0.02171359211206436,
0.07095526158809662,
0.1399044543504715,
-0.1926833838224411,
0.00064880057470873,
0.11915267258882523,
-0.03757929429411888,
-0.043699078261852264,
0.08005890250205994,
-0.05882939323782921,
0.0347222164273262,
0.05496646836400032,
0.18009217083454132,
0.0814899131655693,
-0.14418134093284607,
0.0077721248380839825,
0.04293891414999962,
0.061245184391736984,
0.007003753911703825,
0.041096072643995285,
0.0025991094298660755,
0.025255272164940834,
0.009256691671907902,
-0.09441330283880234,
-0.023267364129424095,
-0.08859723061323166,
-0.0719902515411377,
-0.04995815455913544,
-0.09284791350364685,
0.03568781539797783,
0.01277422159910202,
0.01890634559094906,
-0.06409595906734467,
-0.10910328477621078,
0.09836336225271225,
0.12496712058782578,
-0.05824614316225052,
0.011594801209867,
-0.07848919928073883,
0.019534897059202194,
-0.01837967149913311,
-0.027198778465390205,
-0.19586573541164398,
-0.12712867558002472,
0.05046534165740013,
-0.04545991122722626,
0.025347916409373283,
0.02975301630795002,
0.06564393639564514,
0.05525112524628639,
-0.044807735830545425,
-0.022206509485840797,
-0.06878141313791275,
-0.0021992740221321583,
-0.11295753717422485,
-0.2004157453775406,
-0.0719904750585556,
-0.03728652000427246,
0.14282061159610748,
-0.19997839629650116,
0.00525608891621232,
-0.02092910185456276,
0.11040671169757843,
0.018509650602936745,
-0.05052570626139641,
0.011582513339817524,
0.03285755217075348,
0.014015666209161282,
-0.09908656775951385,
0.04976009950041771,
0.010581601411104202,
-0.06997356563806534,
-0.04418393597006798,
-0.1023494303226471,
-0.0004332370008341968,
0.05885693058371544,
0.08407427370548248,
-0.10068138688802719,
-0.017570529133081436,
-0.05238662287592888,
-0.037354834377765656,
-0.08509141951799393,
0.01499699242413044,
0.20576484501361847,
0.03527054190635681,
0.12432265281677246,
-0.05825471132993698,
-0.08014845103025436,
-0.004177701659500599,
0.021617744117975235,
0.033580977469682693,
0.10162261128425598,
0.07550713419914246,
-0.07932408154010773,
0.08265920728445053,
0.07997613400220871,
-0.042216550558805466,
0.11546921730041504,
-0.05455221235752106,
-0.07574597746133804,
-0.015231222845613956,
0.0011078522074967623,
-0.028463954105973244,
0.14919134974479675,
-0.08193840831518173,
0.0013549833092838526,
0.038845647126436234,
0.01951003447175026,
0.00796054769307375,
-0.17660565674304962,
-0.006534046493470669,
0.016585299745202065,
-0.06796346604824066,
-0.050115328282117844,
-0.031708866357803345,
0.03457184135913849,
0.09260953217744827,
0.028771743178367615,
-0.0308468546718359,
0.022014208137989044,
-0.011835972778499126,
-0.06069125607609749,
0.1939125508069992,
-0.11300777643918991,
-0.0938439592719078,
-0.098240926861763,
0.01953873410820961,
-0.03847160562872887,
-0.03544365614652634,
0.0036590255331248045,
-0.09410510212182999,
-0.05430149659514427,
-0.08733582496643066,
-0.019374964758753777,
-0.010932432487607002,
-0.005805294495075941,
0.022354163229465485,
-0.015176981687545776,
0.07772603631019592,
-0.1336967796087265,
0.003920016810297966,
-0.028844758868217468,
-0.10011396557092667,
0.01812758855521679,
0.06188053637742996,
0.08170923590660095,
0.10249863564968109,
-0.012491249479353428,
0.016002366319298744,
-0.03089500218629837,
0.22840547561645508,
-0.06590713560581207,
0.01698721945285797,
0.08872856199741364,
-0.0018086530035361648,
0.05537980794906616,
0.13951633870601654,
0.033638324588537216,
-0.10760927945375443,
0.02878516912460327,
0.07913684099912643,
-0.019298341125249863,
-0.2509423494338989,
-0.0265609472990036,
-0.020579082891345024,
-0.0716298520565033,
0.08605030924081802,
0.038764119148254395,
-0.05232936143875122,
0.03415272384881973,
0.013269802555441856,
0.00425030617043376,
-0.034759633243083954,
0.06832467764616013,
0.0918995663523674,
0.0379919558763504,
0.10180645436048508,
-0.02099420502781868,
-0.008284289389848709,
0.06320895254611969,
0.025018135085701942,
0.2698497474193573,
-0.04413178935647011,
0.1359870433807373,
0.029190247878432274,
0.14870059490203857,
-0.02313360758125782,
0.04962814971804619,
0.009661233052611351,
-0.000008603637070336845,
-0.005003974307328463,
-0.04894425347447395,
-0.021833574399352074,
0.0012851277133449912,
-0.032412633299827576,
0.023880353197455406,
-0.0721132755279541,
0.03058258257806301,
0.017612947151064873,
0.3134126663208008,
0.04237670823931694,
-0.27302679419517517,
-0.07357741892337799,
0.003627902129665017,
-0.0464627631008625,
-0.07784523069858551,
0.005099647678434849,
0.1455545574426651,
-0.12772786617279053,
0.029720624908804893,
-0.05652519688010216,
0.09125439822673798,
-0.0479554608464241,
0.009342042729258537,
0.06821001321077347,
0.14404486119747162,
-0.010022995062172413,
0.07278218120336533,
-0.206375390291214,
0.23437027633190155,
0.02684031054377556,
0.10633549094200134,
-0.06267699599266052,
0.01289608795195818,
0.007799474056810141,
0.042471639811992645,
0.11788664758205414,
0.002688617678359151,
-0.00104280817322433,
-0.17844580113887787,
-0.10217759013175964,
0.057256847620010376,
0.11087621748447418,
-0.028835354372859,
0.0922941043972969,
-0.04400116205215454,
-0.0055876425467431545,
0.03543175756931305,
-0.07977239787578583,
-0.11665073037147522,
-0.09411992132663727,
-0.007494617719203234,
0.0034880905877798796,
-0.02487237937748432,
-0.05873286724090576,
-0.09193208068609238,
-0.01522332988679409,
0.13928894698619843,
0.0051148924976587296,
-0.053064700216054916,
-0.14044101536273956,
0.0541982501745224,
0.13453342020511627,
-0.05124804005026817,
0.013852785341441631,
0.007351904641836882,
0.1109313815832138,
0.04823976755142212,
-0.07763254642486572,
0.05982396379113197,
-0.07233689725399017,
-0.16482065618038177,
-0.05840849503874779,
0.11986187100410461,
0.08392145484685898,
0.05812297761440277,
0.0016880716430023313,
0.0287303626537323,
-0.002685049781575799,
-0.08189491927623749,
0.015302201732993126,
0.05681653693318367,
0.08969412744045258,
0.03647753968834877,
-0.09751290827989578,
0.06389731168746948,
-0.037389758974313736,
-0.008730332367122173,
0.12656466662883759,
0.20492179691791534,
-0.09563405066728592,
0.10262149572372437,
0.07345885783433914,
-0.0797208696603775,
-0.1878703385591507,
0.07004610449075699,
0.12546305358409882,
0.023067694157361984,
0.03693896904587746,
-0.21153821051120758,
0.13263386487960815,
0.10447949916124344,
-0.015867428854107857,
0.04097673296928406,
-0.29599082469940186,
-0.12503348290920258,
0.0761110857129097,
0.10341069847345352,
0.034059327095746994,
-0.12295597791671753,
-0.024454746395349503,
-0.013193564489483833,
-0.13023652136325836,
0.13280178606510162,
-0.07908192276954651,
0.11892955750226974,
-0.005745396949350834,
0.1196599081158638,
0.02449142187833786,
-0.037588298320770264,
0.131451815366745,
0.07148689031600952,
0.09048127382993698,
-0.03775682672858238,
0.009497213177382946,
0.05667872354388237,
-0.06713996082544327,
0.03909334912896156,
-0.04196958988904953,
0.06959149241447449,
-0.17034342885017395,
-0.004752810578793287,
-0.0893944501876831,
0.0398394912481308,
-0.04871852323412895,
-0.05719366297125816,
-0.020368553698062897,
0.054243654012680054,
0.0688127651810646,
-0.03981155529618263,
0.02644030749797821,
0.006493748165667057,
0.0637018084526062,
0.09347625076770782,
0.10151359438896179,
-0.023205073550343513,
-0.11097967624664307,
0.014821489341557026,
-0.007698062341660261,
0.05533419921994209,
-0.11232516914606094,
0.02291727252304554,
0.13056138157844543,
0.06135445088148117,
0.12028754502534866,
0.02579275332391262,
-0.030870411545038223,
-0.014893323183059692,
0.009193584322929382,
-0.11667275428771973,
-0.12432262301445007,
0.03984585031867027,
-0.03877151384949684,
-0.14788775146007538,
0.00639639375731349,
0.09932107478380203,
-0.03290373459458351,
-0.01919088326394558,
-0.008631899952888489,
0.020713916048407555,
-0.016500025987625122,
0.1993287056684494,
0.041056305170059204,
0.07159531116485596,
-0.10636816918849945,
0.11908228695392609,
0.05396321415901184,
-0.058824874460697174,
0.05188445746898651,
0.0633634477853775,
-0.10266818851232529,
-0.012194598093628883,
0.11523187905550003,
0.16076624393463135,
-0.03479231148958206,
-0.014780029654502869,
-0.07756488770246506,
-0.09246345609426498,
0.06109074503183365,
0.14867742359638214,
0.051659826189279556,
-0.011878970079123974,
-0.04986210912466049,
0.026112813502550125,
-0.12531940639019012,
0.08130452781915665,
0.045238811522722244,
0.061641525477170944,
-0.09570786356925964,
0.10448118299245834,
-0.006321245338767767,
0.04080956429243088,
-0.017298217862844467,
0.024274829775094986,
-0.09918004274368286,
-0.010886741802096367,
-0.14876429736614227,
0.007827789522707462,
0.001041207928210497,
0.013299680314958096,
-0.02058437094092369,
-0.05047424137592316,
-0.02395635098218918,
0.02875036559998989,
-0.08865354210138321,
-0.05404038354754448,
0.019748693332076073,
0.04150811582803726,
-0.14922046661376953,
-0.015297147445380688,
0.022845089435577393,
-0.09585745632648468,
0.0811380073428154,
0.0594099760055542,
0.012568339705467224,
0.02760310284793377,
-0.09841588139533997,
-0.04821503162384033,
0.0022796352859586477,
0.02741972915828228,
0.08730435371398926,
-0.09241373091936111,
-0.016567692160606384,
-0.037019532173871994,
0.041251927614212036,
0.017976859584450722,
0.09087690711021423,
-0.11464528739452362,
0.005484458524733782,
-0.04790490120649338,
-0.03873598575592041,
-0.06655506789684296,
0.04656193032860756,
0.11034418642520905,
0.046926356852054596,
0.16022183001041412,
-0.07157706469297409,
0.03521822765469551,
-0.18999843299388885,
-0.04023144766688347,
0.002560719382017851,
-0.04572850093245506,
-0.0884929671883583,
-0.05018136277794838,
0.09680752456188202,
-0.04932909831404686,
0.0984564945101738,
-0.0014294213615357876,
0.09851760417222977,
0.032850250601768494,
-0.021307578310370445,
-0.05132655054330826,
0.008173191919922829,
0.15464995801448822,
0.050092604011297226,
-0.01447750348597765,
0.1125892698764801,
-0.006040186621248722,
0.04627048596739769,
0.07098766416311264,
0.21041622757911682,
0.15361978113651276,
0.005802356172353029,
0.04878013953566551,
0.06272279471158981,
-0.11903371661901474,
-0.14228494465351105,
0.1386934071779251,
-0.04774193838238716,
0.1272081434726715,
-0.06937524676322937,
0.22355467081069946,
0.022438498213887215,
-0.18570899963378906,
0.0666162520647049,
-0.061676185578107834,
-0.12452349811792374,
-0.115000419318676,
-0.03204638883471489,
-0.07409962266683578,
-0.10975632071495056,
0.021397219970822334,
-0.12081613391637802,
0.06094811484217644,
0.12780632078647614,
0.018042245879769325,
0.026070430874824524,
0.1581127941608429,
-0.03461474925279617,
0.018601069226861,
0.06991230696439743,
0.019629312679171562,
-0.005985704716295004,
-0.06432300806045532,
-0.06121611222624779,
0.04990281164646149,
0.02580653317272663,
0.07621053606271744,
-0.04241323471069336,
0.01000306848436594,
0.02217220887541771,
-0.015560679137706757,
-0.07277786731719971,
0.014088933356106281,
0.02615375630557537,
0.043854787945747375,
0.0626758560538292,
0.05172480270266533,
0.007406160235404968,
-0.036909542977809906,
0.2894403040409088,
-0.07909709960222244,
-0.09324754774570465,
-0.1345047950744629,
0.22618454694747925,
0.010452084243297577,
-0.021812979131937027,
0.07818164676427841,
-0.09549076855182648,
-0.025344129651784897,
0.1750507652759552,
0.1393970400094986,
-0.10048740357160568,
-0.024990009143948555,
-0.015404571779072285,
-0.012771219946444035,
-0.04560910537838936,
0.1290304958820343,
0.10860563814640045,
-0.008772485889494419,
-0.08057308942079544,
-0.022942563518881798,
-0.010351010598242283,
-0.051902372390031815,
-0.06741343438625336,
0.06265116482973099,
0.018697476014494896,
0.00031064680661074817,
-0.04370863363146782,
0.05602089315652847,
-0.004086431115865707,
-0.25121021270751953,
0.036577630788087845,
-0.1583879292011261,
-0.1779213696718216,
-0.03855764865875244,
0.053155653178691864,
-0.005327626131474972,
0.03961446136236191,
-0.012209341861307621,
0.012417653575539589,
0.15466199815273285,
-0.03389331325888634,
-0.03300093859434128,
-0.12026763707399368,
0.1170508936047554,
-0.10434556007385254,
0.2002336084842682,
0.0030716476030647755,
0.07055146247148514,
0.0976194441318512,
0.014264321886003017,
-0.13404254615306854,
0.04016564413905144,
0.0760032907128334,
-0.10670652985572815,
0.01141999289393425,
0.15231704711914062,
-0.053412262350320816,
0.08383263647556305,
0.021576980128884315,
-0.10930315405130386,
-0.011299739591777325,
-0.045292045921087265,
-0.03706514835357666,
-0.07887106388807297,
-0.017088986933231354,
-0.06428008526563644,
0.1591486930847168,
0.22837474942207336,
-0.023736853152513504,
0.015556824393570423,
-0.09305823594331741,
0.012681500054895878,
0.04396886005997658,
0.04787392541766167,
-0.0464661531150341,
-0.1932387501001358,
0.026192214339971542,
0.034822072833776474,
0.019368767738342285,
-0.21155783534049988,
-0.0849994570016861,
0.04581523314118385,
-0.028705550357699394,
-0.04717458412051201,
0.10463796555995941,
0.030252056196331978,
0.045900411903858185,
-0.03501629829406738,
-0.09728115797042847,
-0.0394207127392292,
0.14245612919330597,
-0.1603739857673645,
-0.04235120490193367
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the reproduction sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
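As a rough sketch, these settings correspond to the `transformers` Trainer configuration below. This is not the original training script: the k=64 few-shot subset, the use of the model name's `seed-10` as the sampling seed, and the simplified SQuAD preprocessing are all assumptions.
```python
# Minimal reproduction sketch (assumptions noted above), not the authors' script.
from datasets import load_dataset
from transformers import (AutoModelForQuestionAnswering, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

squad = load_dataset("squad")
# Assumed few-shot sampling: k=64 examples drawn with the seed from the model name.
few_shot_train = squad["train"].shuffle(seed=10).select(range(64))

def preprocess(examples):
    # Tokenize question/context pairs and map each character-level answer span
    # to token positions; answers lost to truncation fall back to index 0 (CLS).
    tokenized = tokenizer(
        examples["question"],
        examples["context"],
        truncation="only_second",
        max_length=384,
        padding="max_length",
        return_offsets_mapping=True,
    )
    starts, ends = [], []
    for i, offsets in enumerate(tokenized["offset_mapping"]):
        answer = examples["answers"][i]
        start_char = answer["answer_start"][0]
        end_char = start_char + len(answer["text"][0])
        seq_ids = tokenized.sequence_ids(i)
        start_tok = end_tok = 0
        for idx, (s, e) in enumerate(offsets):
            if seq_ids[idx] != 1:  # only consider context tokens
                continue
            if s <= start_char < e:
                start_tok = idx
            if s < end_char <= e:
                end_tok = idx
        starts.append(start_tok)
        ends.append(end_tok)
    tokenized["start_positions"] = starts
    tokenized["end_positions"] = ends
    tokenized.pop("offset_mapping")
    return tokenized

args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    max_steps=200,               # training_steps: 200
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,              # Adam betas and epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=few_shot_train.map(
        preprocess, batched=True, remove_columns=few_shot_train.column_names
    ),
    tokenizer=tokenizer,
)
trainer.train()
```
Note that with a linear schedule, `warmup_ratio=0.1` over `max_steps=200` amounts to roughly 20 warmup steps.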
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.1026349738240242,
0.13430117070674896,
-0.0019417083822190762,
0.09200722724199295,
0.13570722937583923,
0.03341246023774147,
0.0838303193449974,
0.1393270492553711,
-0.0786173865199089,
0.0534355565905571,
0.0735878124833107,
0.06161586195230484,
0.04813551530241966,
0.1295224279165268,
-0.04264214262366295,
-0.21572233736515045,
0.003625100012868643,
-0.01252224575728178,
-0.06345254927873611,
0.10345279425382614,
0.08794134110212326,
-0.11155374348163605,
0.06859634071588516,
-0.022453784942626953,
-0.15235275030136108,
0.009199501946568489,
-0.03298604488372803,
-0.024990985170006752,
0.10524094104766846,
-0.00899344403296709,
0.09255240857601166,
0.014534945599734783,
0.14398041367530823,
-0.2137140929698944,
0.0004829785320907831,
0.07637742906808853,
0.04538925364613533,
0.09001924842596054,
0.03555804863572121,
0.019878750666975975,
0.02863091416656971,
-0.15288051962852478,
0.09411812573671341,
0.02596287615597248,
-0.07958585768938065,
-0.13143488764762878,
-0.0909038558602333,
0.02659689448773861,
0.08533576875925064,
0.080447718501091,
0.007053818553686142,
0.13079816102981567,
-0.09916365891695023,
0.08293142914772034,
0.20015087723731995,
-0.2809286117553711,
-0.06522319465875626,
0.04767625406384468,
0.0535137914121151,
0.07250136882066727,
-0.11920876801013947,
-0.019203949719667435,
0.021551473066210747,
0.03044798970222473,
0.11174122244119644,
-0.02076723985373974,
-0.12945257127285004,
0.008153467439115047,
-0.1281953901052475,
-0.015399886295199394,
0.10733310133218765,
0.04252241179347038,
-0.04700601473450661,
-0.0801205188035965,
-0.06644715368747711,
-0.0985654890537262,
-0.03051508590579033,
-0.012540945783257484,
0.052198152989149094,
-0.056722406297922134,
-0.06714417785406113,
-0.0480252206325531,
-0.05597834289073944,
-0.09091842174530029,
0.004773192573338747,
0.12343773245811462,
0.04042164608836174,
0.018087048083543777,
-0.03151967376470566,
0.11622915416955948,
-0.0009753046906553209,
-0.1292589008808136,
-0.010792787186801434,
0.004507112316787243,
-0.12165414541959763,
-0.052582163363695145,
-0.03136264905333519,
0.017685234546661377,
0.012298541143536568,
0.1515093296766281,
-0.03450945019721985,
0.07590843737125397,
0.015158383175730705,
-0.020006509497761726,
-0.020667748525738716,
0.16009259223937988,
-0.034564994275569916,
-0.04936501011252403,
0.0010749725624918938,
0.09891223907470703,
-0.0015790046891197562,
-0.0055374642834067345,
-0.0815005823969841,
-0.017708251252770424,
0.07867176085710526,
0.06585405021905899,
-0.05634962022304535,
0.03845278173685074,
-0.04062393307685852,
-0.025133688002824783,
0.02266567572951317,
-0.12879042327404022,
0.03732871264219284,
0.0058557987213134766,
-0.08253418654203415,
-0.05797136202454567,
0.008196481503546238,
-0.015217555686831474,
-0.027588697150349617,
0.07593235373497009,
-0.07163383811712265,
-0.006551649887114763,
-0.0878865197300911,
-0.0812210664153099,
0.002863967092707753,
-0.14237567782402039,
-0.012178230099380016,
-0.04568558558821678,
-0.2017093449831009,
-0.034904852509498596,
0.04994215443730354,
-0.0756654366850853,
-0.0435517244040966,
-0.05234908685088158,
-0.07847309112548828,
0.008820995688438416,
-0.006510346196591854,
0.18917471170425415,
-0.057113390415906906,
0.08099353313446045,
-0.016656484454870224,
0.046996768563985825,
0.02249402552843094,
0.05187258869409561,
-0.08093773573637009,
0.02720355987548828,
-0.14610205590724945,
0.08734355866909027,
-0.10425150394439697,
0.015323392115533352,
-0.1271747499704361,
-0.08652269095182419,
0.056010980159044266,
-0.021539650857448578,
0.07112511247396469,
0.1399887055158615,
-0.19442729651927948,
0.0020065829157829285,
0.12008138000965118,
-0.03663517162203789,
-0.042900294065475464,
0.0790402889251709,
-0.059253156185150146,
0.03645513579249382,
0.056569185107946396,
0.17981864511966705,
0.08299129456281662,
-0.142649844288826,
0.006964285857975483,
0.042215511202812195,
0.06118468940258026,
0.00549284927546978,
0.04143697768449783,
0.003851557383313775,
0.027824565768241882,
0.008986059576272964,
-0.09317130595445633,
-0.023746225982904434,
-0.08833616226911545,
-0.07249072939157486,
-0.04925665259361267,
-0.09333150833845139,
0.03570742905139923,
0.013308881781995296,
0.01911739632487297,
-0.0642399936914444,
-0.10913185775279999,
0.09865513443946838,
0.12515610456466675,
-0.058667056262493134,
0.010643118992447853,
-0.07829378545284271,
0.018923763185739517,
-0.01919534243643284,
-0.026896296069025993,
-0.19584621489048004,
-0.12479721754789352,
0.050353702157735825,
-0.04594551771879196,
0.02483985386788845,
0.03224717825651169,
0.0663183405995369,
0.054954394698143005,
-0.04431556165218353,
-0.023073410615324974,
-0.06893878430128098,
-0.0023637174163013697,
-0.11468488723039627,
-0.19934114813804626,
-0.07384508848190308,
-0.03803757205605507,
0.14309237897396088,
-0.20180679857730865,
0.006648280657827854,
-0.023004144430160522,
0.10954627394676208,
0.017425255849957466,
-0.05089184269309044,
0.01238451711833477,
0.03227175027132034,
0.014814499765634537,
-0.09984540194272995,
0.04978584125638008,
0.009135989472270012,
-0.06951884925365448,
-0.04649147391319275,
-0.10397741943597794,
-0.003125972580164671,
0.058105986565351486,
0.08606082946062088,
-0.10036662966012955,
-0.017790863290429115,
-0.052472073584795,
-0.03675401583313942,
-0.08327249437570572,
0.014629905112087727,
0.20322762429714203,
0.0344773530960083,
0.1235140711069107,
-0.05847136303782463,
-0.08127181977033615,
-0.0036647445522248745,
0.022998850792646408,
0.03439768776297569,
0.10191406309604645,
0.0752573162317276,
-0.07943329215049744,
0.08223400264978409,
0.08115372806787491,
-0.041229650378227234,
0.11405853182077408,
-0.05460270866751671,
-0.07655452936887741,
-0.015140180476009846,
0.0014180182479321957,
-0.02926056832075119,
0.14885234832763672,
-0.08304587751626968,
-0.0004080412327311933,
0.03838934004306793,
0.018165428191423416,
0.007605720777064562,
-0.17717395722866058,
-0.006072429474443197,
0.01615784503519535,
-0.06710918247699738,
-0.051705896854400635,
-0.032390251755714417,
0.03300580382347107,
0.09189549088478088,
0.028112076222896576,
-0.03192600607872009,
0.021457627415657043,
-0.011801992543041706,
-0.06056956201791763,
0.1943192183971405,
-0.11125649511814117,
-0.09270589053630829,
-0.09668857604265213,
0.019130930304527283,
-0.036160681396722794,
-0.03579355403780937,
0.0030624058563262224,
-0.09472418576478958,
-0.053643446415662766,
-0.0866561159491539,
-0.020680541172623634,
-0.010817628353834152,
-0.006197172217071056,
0.024380136281251907,
-0.014548498205840588,
0.0758802592754364,
-0.13352949917316437,
0.0046084364876151085,
-0.02969013713300228,
-0.10002576559782028,
0.017340684309601784,
0.06087430939078331,
0.08168955892324448,
0.10299207270145416,
-0.012857367284595966,
0.016352221369743347,
-0.03141340985894203,
0.22805190086364746,
-0.06646527349948883,
0.01841198466718197,
0.08801661431789398,
-0.001642080838792026,
0.05525641143321991,
0.14068131148815155,
0.03301572799682617,
-0.1076429933309555,
0.02818637154996395,
0.07868103682994843,
-0.018703486770391464,
-0.25219035148620605,
-0.025367125868797302,
-0.01977124623954296,
-0.07198359817266464,
0.08697284758090973,
0.03806453198194504,
-0.04964868724346161,
0.035841260105371475,
0.012795950286090374,
0.0062477136962115765,
-0.033461958169937134,
0.06830339878797531,
0.08910513669252396,
0.03700824826955795,
0.10184139758348465,
-0.021322332322597504,
-0.009220198728144169,
0.06091702729463577,
0.025556711480021477,
0.2714802026748657,
-0.042171698063611984,
0.1355372667312622,
0.028419604524970055,
0.14740023016929626,
-0.024299820885062218,
0.05267118662595749,
0.010905014351010323,
-0.0006051342934370041,
-0.004909111652523279,
-0.0482804998755455,
-0.01981271244585514,
0.0019533815793693066,
-0.03195749968290329,
0.023221058771014214,
-0.07158979028463364,
0.03022564947605133,
0.017218736931681633,
0.31223064661026,
0.044342998415231705,
-0.27408096194267273,
-0.07254738360643387,
0.004046306014060974,
-0.04847101494669914,
-0.0777229517698288,
0.004548665136098862,
0.14428164064884186,
-0.12786820530891418,
0.030702201649546623,
-0.05724962800741196,
0.09161366522312164,
-0.046906813979148865,
0.009636118076741695,
0.06691419333219528,
0.1431138962507248,
-0.009779891930520535,
0.07384898513555527,
-0.207915261387825,
0.23386867344379425,
0.02647736296057701,
0.10759855061769485,
-0.06368149071931839,
0.013054640032351017,
0.007256896700710058,
0.04092636704444885,
0.11989734321832657,
0.0023413989692926407,
-0.0013187187723815441,
-0.17911967635154724,
-0.10219714790582657,
0.05711273476481438,
0.11149770766496658,
-0.027989383786916733,
0.09362220764160156,
-0.04265616461634636,
-0.00672103650867939,
0.03556453436613083,
-0.08076124638319016,
-0.11806116253137589,
-0.09345906227827072,
-0.007757855113595724,
0.0018040260765701532,
-0.02495996095240116,
-0.05826380476355553,
-0.09195844829082489,
-0.01817806251347065,
0.13754703104496002,
0.005606554448604584,
-0.052765436470508575,
-0.13994386792182922,
0.056709785014390945,
0.13534609973430634,
-0.05054476484656334,
0.015314078889787197,
0.008828401565551758,
0.11060173809528351,
0.04741404578089714,
-0.07606837898492813,
0.05941290408372879,
-0.07224071025848389,
-0.16473886370658875,
-0.057503841817379,
0.1217053234577179,
0.08415065705776215,
0.058301590383052826,
0.003194612916558981,
0.028022387996315956,
-0.0022784823086112738,
-0.08161262422800064,
0.014965180307626724,
0.05621784180402756,
0.08934647589921951,
0.036062419414520264,
-0.09786369651556015,
0.0626290813088417,
-0.038391731679439545,
-0.006617635954171419,
0.1272580921649933,
0.20322303473949432,
-0.09524118155241013,
0.10144618898630142,
0.07315714657306671,
-0.07988305389881134,
-0.1868814080953598,
0.07080382853746414,
0.12600411474704742,
0.023719673976302147,
0.03693694621324539,
-0.21081741154193878,
0.13219361007213593,
0.10417120158672333,
-0.015004735440015793,
0.03959840163588524,
-0.29646167159080505,
-0.12471353262662888,
0.0768817663192749,
0.1037161648273468,
0.03160194307565689,
-0.12241490930318832,
-0.02449459582567215,
-0.014119520783424377,
-0.1300113946199417,
0.132660374045372,
-0.07944094389677048,
0.11836947500705719,
-0.004988161381334066,
0.11780502647161484,
0.024554187431931496,
-0.0381251722574234,
0.12993663549423218,
0.07373273372650146,
0.09066940099000931,
-0.03754138574004173,
0.007464224006980658,
0.059685591608285904,
-0.06664356589317322,
0.04059740528464317,
-0.04092564806342125,
0.0685061365365982,
-0.16995273530483246,
-0.00531378760933876,
-0.08945631980895996,
0.03891109302639961,
-0.04913870617747307,
-0.05696145445108414,
-0.019614355638623238,
0.054113876074552536,
0.06863868981599808,
-0.03937303647398949,
0.024446433410048485,
0.007775365374982357,
0.06302063167095184,
0.08876920491456985,
0.10275106132030487,
-0.0214555524289608,
-0.1099759116768837,
0.015053888782858849,
-0.0074209715239703655,
0.054436516016721725,
-0.11407382041215897,
0.02158327028155327,
0.1308491826057434,
0.06171141192317009,
0.12086789309978485,
0.02545093186199665,
-0.03128160163760185,
-0.01531989686191082,
0.008883285336196423,
-0.11400800198316574,
-0.12591224908828735,
0.04035389423370361,
-0.04024876281619072,
-0.1490430384874344,
0.008506912738084793,
0.097144216299057,
-0.03375590965151787,
-0.020305247977375984,
-0.009753873571753502,
0.020588267594575882,
-0.016574833542108536,
0.20106561481952667,
0.04187491163611412,
0.07265132665634155,
-0.10675752907991409,
0.11836520582437515,
0.053989019244909286,
-0.05973352491855621,
0.05136530473828316,
0.06417684257030487,
-0.1033744141459465,
-0.012545958161354065,
0.1157311350107193,
0.16004139184951782,
-0.031235432252287865,
-0.014808399602770805,
-0.07783347368240356,
-0.09192785620689392,
0.061710309237241745,
0.14919520914554596,
0.051707424223423004,
-0.013134864158928394,
-0.04928331822156906,
0.026782676577568054,
-0.1259518265724182,
0.08121275901794434,
0.045769739896059036,
0.06170498579740524,
-0.09585519134998322,
0.10570928454399109,
-0.006296229083091021,
0.041539277881383896,
-0.01675727404654026,
0.025420069694519043,
-0.0984850525856018,
-0.0111453328281641,
-0.14656327664852142,
0.0057468172162771225,
-0.0012325223069638014,
0.012982567772269249,
-0.021358896046876907,
-0.04963160306215286,
-0.023409554734826088,
0.027924492955207825,
-0.08849024772644043,
-0.05465948209166527,
0.01931188814342022,
0.04022044688463211,
-0.14790618419647217,
-0.015161572955548763,
0.021691346541047096,
-0.09556754678487778,
0.08062762022018433,
0.05827433615922928,
0.013037008233368397,
0.02827894501388073,
-0.09783093631267548,
-0.047456782311201096,
0.0036666998639702797,
0.027067244052886963,
0.08692479133605957,
-0.09198439866304398,
-0.016203174367547035,
-0.037115272134542465,
0.04290967434644699,
0.01734713464975357,
0.087529756128788,
-0.11353188008069992,
0.004709188360720873,
-0.04941290616989136,
-0.0384320504963398,
-0.0672788918018341,
0.04729180783033371,
0.10920103639364243,
0.04534902051091194,
0.16082246601581573,
-0.06941493600606918,
0.035725317895412445,
-0.19078397750854492,
-0.04090135172009468,
0.0017702446784824133,
-0.04611172527074814,
-0.08769426494836807,
-0.051017530262470245,
0.09706553816795349,
-0.04914449155330658,
0.0988597646355629,
-0.002171345055103302,
0.10007933527231216,
0.03171238303184509,
-0.01790553331375122,
-0.05151825398206711,
0.009230194613337517,
0.15523479878902435,
0.050022780895233154,
-0.015146410092711449,
0.11119302362203598,
-0.005086872726678848,
0.04631622135639191,
0.06958294659852982,
0.20769745111465454,
0.15396015346050262,
0.004931401461362839,
0.04918014258146286,
0.06329478323459625,
-0.11949656158685684,
-0.1406882256269455,
0.14097504317760468,
-0.04812297597527504,
0.12583212554454803,
-0.0695817768573761,
0.22593344748020172,
0.02230994403362274,
-0.18575814366340637,
0.06696952879428864,
-0.06049029901623726,
-0.12540008127689362,
-0.11346897482872009,
-0.03361692279577255,
-0.07445209473371506,
-0.10828249156475067,
0.022105498239398003,
-0.12083451449871063,
0.059931956231594086,
0.12835019826889038,
0.01798638515174389,
0.025510456413030624,
0.15924063324928284,
-0.03455629572272301,
0.018932290375232697,
0.07047019898891449,
0.019397754222154617,
-0.0051390016451478004,
-0.06447400152683258,
-0.060132965445518494,
0.050714749842882156,
0.024695901200175285,
0.07702688127756119,
-0.04470467194914818,
0.009083768352866173,
0.022506775334477425,
-0.014811379835009575,
-0.07251235097646713,
0.01447737030684948,
0.025821542367339134,
0.044565148651599884,
0.06144694983959198,
0.05187665671110153,
0.006683964282274246,
-0.037388477474451065,
0.28730425238609314,
-0.07941341400146484,
-0.09472192078828812,
-0.1347973793745041,
0.22294262051582336,
0.012082585133612156,
-0.02197165973484516,
0.07878545671701431,
-0.09590493142604828,
-0.022347651422023773,
0.17583349347114563,
0.13927331566810608,
-0.09730663895606995,
-0.0257362462580204,
-0.014515393413603306,
-0.01325082778930664,
-0.04670979455113411,
0.12818042933940887,
0.10905063897371292,
-0.011574478819966316,
-0.0804043784737587,
-0.022549282759428024,
-0.008739560842514038,
-0.05379095673561096,
-0.0672663152217865,
0.062040846794843674,
0.02004094608128071,
-0.000005589404736383585,
-0.04162892326712608,
0.05857761949300766,
-0.0016252914210781455,
-0.2507243752479553,
0.03563578799366951,
-0.15714360773563385,
-0.17801301181316376,
-0.03923093155026436,
0.05282696709036827,
-0.0031362101435661316,
0.039545945823192596,
-0.01177615113556385,
0.012364926747977734,
0.15359660983085632,
-0.0330301932990551,
-0.0335705503821373,
-0.12046024203300476,
0.11713612079620361,
-0.10594408214092255,
0.19875141978263855,
0.0027433675713837147,
0.07156975567340851,
0.09789308160543442,
0.012319658882915974,
-0.1335667073726654,
0.04034435749053955,
0.07635847479104996,
-0.10425558686256409,
0.01331592071801424,
0.153011292219162,
-0.053042102605104446,
0.08087820559740067,
0.020439013838768005,
-0.11032464355230331,
-0.010559098795056343,
-0.04400858283042908,
-0.036691322922706604,
-0.08008112013339996,
-0.014264167286455631,
-0.0632854625582695,
0.1597372144460678,
0.22817127406597137,
-0.024423176422715187,
0.01630266010761261,
-0.09355558454990387,
0.011612419039011002,
0.0440794974565506,
0.04837862029671669,
-0.0463968962430954,
-0.19392211735248566,
0.025358691811561584,
0.030828025192022324,
0.020118916407227516,
-0.20951399207115173,
-0.0838092789053917,
0.044919703155756,
-0.030269986018538475,
-0.04821840301156044,
0.10358571261167526,
0.031987037509679794,
0.04516834393143654,
-0.03459293767809868,
-0.09774154424667358,
-0.03970703110098839,
0.14373384416103363,
-0.16141054034233093,
-0.041653458029031754
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
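For reference, a hedged sketch of how this list translates into `TrainingArguments`; the `output_dir` name is illustrative and this is not the original script.
```python
# Sketch only: mirrors the hyperparameters listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    max_steps=200,               # training_steps: 200
    lr_scheduler_type="linear",
    warmup_ratio=0.1,            # ~20 warmup steps out of 200
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```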
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10129622370004654,
0.13538409769535065,
-0.00202020350843668,
0.09247631579637527,
0.13511700928211212,
0.03357446566224098,
0.08252904564142227,
0.13997970521450043,
-0.07970508933067322,
0.05354825407266617,
0.07442149519920349,
0.05959178879857063,
0.04743950814008713,
0.12875688076019287,
-0.042665380984544754,
-0.21573208272457123,
0.003211371600627899,
-0.014327232725918293,
-0.06542910635471344,
0.10337856411933899,
0.08889717608690262,
-0.11037412285804749,
0.06901811063289642,
-0.02152267098426819,
-0.15406061708927155,
0.009146793745458126,
-0.031749170273542404,
-0.023460831493139267,
0.10525007545948029,
-0.0077282460406422615,
0.0931873768568039,
0.015685681253671646,
0.14359284937381744,
-0.21432147920131683,
0.00025444888160564005,
0.0758427232503891,
0.04621018096804619,
0.0899588093161583,
0.03597749024629593,
0.01989927887916565,
0.028230303898453712,
-0.15134279429912567,
0.09256638586521149,
0.026201393455266953,
-0.07954226434230804,
-0.13222068548202515,
-0.09078467637300491,
0.026530403643846512,
0.08497194200754166,
0.0791136771440506,
0.008253540843725204,
0.12903665006160736,
-0.09883061796426773,
0.08234786987304688,
0.19864627718925476,
-0.2812919318675995,
-0.06563160568475723,
0.04610816016793251,
0.051490455865859985,
0.0734744518995285,
-0.11755667626857758,
-0.018454905599355698,
0.02171122469007969,
0.030710505321621895,
0.11107312887907028,
-0.020410709083080292,
-0.1298963725566864,
0.008244561031460762,
-0.12820997834205627,
-0.017440736293792725,
0.10818619281053543,
0.04151832312345505,
-0.04804055020213127,
-0.0772315040230751,
-0.06689651310443878,
-0.09594052284955978,
-0.028840040788054466,
-0.012812839820981026,
0.05252664536237717,
-0.0546223409473896,
-0.0666917935013771,
-0.05062604695558548,
-0.058010268956422806,
-0.09138178825378418,
0.003000285942107439,
0.12586823105812073,
0.04024883732199669,
0.019834954291582108,
-0.03166642412543297,
0.11804341524839401,
0.0012775888899341226,
-0.12873151898384094,
-0.011512172408401966,
0.005093670915812254,
-0.12059813737869263,
-0.051912881433963776,
-0.031873442232608795,
0.018536167219281197,
0.013244085013866425,
0.14978662133216858,
-0.03695166856050491,
0.07476087659597397,
0.013709938153624535,
-0.018264058977365494,
-0.02038266323506832,
0.16033315658569336,
-0.032145872712135315,
-0.04595862329006195,
0.0018807792803272605,
0.09804701805114746,
-0.0010310939978808165,
-0.005605529062449932,
-0.08175085484981537,
-0.018461942672729492,
0.07860478013753891,
0.06687119603157043,
-0.055873531848192215,
0.03628380224108696,
-0.04303726181387901,
-0.02516750432550907,
0.020426560193300247,
-0.12876032292842865,
0.036794621497392654,
0.0061866119503974915,
-0.08084122836589813,
-0.05883300304412842,
0.00932830199599266,
-0.015042580664157867,
-0.02578555978834629,
0.07475774735212326,
-0.0703972727060318,
-0.005375867709517479,
-0.08693857491016388,
-0.07852966338396072,
0.002259603701531887,
-0.13829416036605835,
-0.013055803254246712,
-0.046952035278081894,
-0.1991156041622162,
-0.0330212339758873,
0.050884801894426346,
-0.07688375562429428,
-0.04644443467259407,
-0.05123191699385643,
-0.07693392783403397,
0.009826860390603542,
-0.007196637336164713,
0.188847616314888,
-0.0573652982711792,
0.08220268785953522,
-0.017030984163284302,
0.04675392061471939,
0.022432222962379456,
0.050947725772857666,
-0.08002177625894547,
0.028372593224048615,
-0.14711245894432068,
0.08848188072443008,
-0.10319380462169647,
0.011307118460536003,
-0.12840035557746887,
-0.08563998341560364,
0.05724439397454262,
-0.022184163331985474,
0.06976901739835739,
0.13971039652824402,
-0.19329823553562164,
0.002201061462983489,
0.11948240548372269,
-0.03848491609096527,
-0.04223209619522095,
0.08110404759645462,
-0.05929850786924362,
0.03864792361855507,
0.0543411485850811,
0.18014298379421234,
0.08252611011266708,
-0.14275085926055908,
0.01000241655856371,
0.04469232261180878,
0.060223765671253204,
0.007224125787615776,
0.043026942759752274,
0.002268638927489519,
0.023340769112110138,
0.008451901376247406,
-0.09649960696697235,
-0.02354050613939762,
-0.08923853933811188,
-0.07311902940273285,
-0.049509089440107346,
-0.0931420624256134,
0.036889076232910156,
0.010678567923605442,
0.019383441656827927,
-0.06328530609607697,
-0.10751386731863022,
0.09809450060129166,
0.12607349455356598,
-0.05789107456803322,
0.012442788109183311,
-0.07875377684831619,
0.018745386973023415,
-0.01975216157734394,
-0.027911946177482605,
-0.19523192942142487,
-0.1255805790424347,
0.05126893147826195,
-0.04561462625861168,
0.02498190850019455,
0.032269187271595,
0.06495700031518936,
0.05486258119344711,
-0.04429100081324577,
-0.023051900789141655,
-0.06910989433526993,
-0.0029982654377818108,
-0.11405026167631149,
-0.2003367394208908,
-0.07246990501880646,
-0.037710145115852356,
0.14430510997772217,
-0.20069167017936707,
0.005185991991311312,
-0.022316191345453262,
0.10923656076192856,
0.017880504950881004,
-0.05019905045628548,
0.010736276395618916,
0.031478703022003174,
0.014081370085477829,
-0.09894129633903503,
0.04982214793562889,
0.01098677609115839,
-0.07045672088861465,
-0.04336925968527794,
-0.10103318095207214,
-0.001319609466008842,
0.05631878599524498,
0.08505965769290924,
-0.10097173601388931,
-0.018344512209296227,
-0.05264413356781006,
-0.03789297118782997,
-0.08497513830661774,
0.014860481023788452,
0.20435591042041779,
0.03398260846734047,
0.12415126711130142,
-0.05885865166783333,
-0.08017084747552872,
-0.004665524698793888,
0.020212244242429733,
0.03357265144586563,
0.10126328468322754,
0.07532180845737457,
-0.08155672252178192,
0.08093678951263428,
0.08116435259580612,
-0.04218316450715065,
0.11391495913267136,
-0.054405730217695236,
-0.07595805078744888,
-0.017015421763062477,
0.003972564358264208,
-0.028802042827010155,
0.1479814201593399,
-0.08146654069423676,
0.002040722407400608,
0.03816695511341095,
0.0192909836769104,
0.007724328897893429,
-0.17791666090488434,
-0.006552519742399454,
0.016895370557904243,
-0.06924544274806976,
-0.04860888794064522,
-0.03264154866337776,
0.033994317054748535,
0.09250535070896149,
0.028593139722943306,
-0.03169388696551323,
0.02309587597846985,
-0.011556166224181652,
-0.06157436594367027,
0.19369392096996307,
-0.11164899170398712,
-0.09390488266944885,
-0.09931457787752151,
0.019964108243584633,
-0.036624450236558914,
-0.03527137264609337,
0.0034755500964820385,
-0.09298048913478851,
-0.05304092913866043,
-0.08735982328653336,
-0.01897214911878109,
-0.012411040253937244,
-0.005382317118346691,
0.024212628602981567,
-0.014667465351521969,
0.07859335094690323,
-0.13289251923561096,
0.004231974482536316,
-0.02896380051970482,
-0.10174863040447235,
0.018457219004631042,
0.061583928763866425,
0.081084243953228,
0.10230448842048645,
-0.012125218287110329,
0.015967434272170067,
-0.03042527288198471,
0.22876036167144775,
-0.06493484228849411,
0.017429765313863754,
0.08869262784719467,
-0.003714771242812276,
0.056146040558815,
0.14006231725215912,
0.03233190253376961,
-0.10768498480319977,
0.028938716277480125,
0.07781043648719788,
-0.01919020712375641,
-0.2515565752983093,
-0.026422742754220963,
-0.01911751553416252,
-0.0718887522816658,
0.08619992434978485,
0.03828410431742668,
-0.05469539016485214,
0.03451671823859215,
0.013741746544837952,
0.004791773855686188,
-0.034708600491285324,
0.06805920600891113,
0.09096906334161758,
0.03678582236170769,
0.10166402161121368,
-0.020724637433886528,
-0.00913334358483553,
0.0630650594830513,
0.02602563612163067,
0.2708764672279358,
-0.04276350140571594,
0.13479089736938477,
0.02962687611579895,
0.14960236847400665,
-0.023453084751963615,
0.04959743097424507,
0.011168565601110458,
0.00020745354413520545,
-0.0058347606100142,
-0.04792442172765732,
-0.0214151032269001,
0.002547403797507286,
-0.030686136335134506,
0.023432496935129166,
-0.07252667099237442,
0.03319694474339485,
0.0165077056735754,
0.3142293095588684,
0.04378320649266243,
-0.2716812789440155,
-0.07217850536108017,
0.004599666688591242,
-0.04770994931459427,
-0.07736770808696747,
0.004901184234768152,
0.1470649540424347,
-0.12914316356182098,
0.027911555022001266,
-0.056215621531009674,
0.09056325256824493,
-0.04804783686995506,
0.008744494989514351,
0.06564266234636307,
0.1424950808286667,
-0.008949673734605312,
0.07406902313232422,
-0.20564325153827667,
0.23447257280349731,
0.02625945955514908,
0.1057162657380104,
-0.06116572394967079,
0.013192996382713318,
0.007191635202616453,
0.04201418533921242,
0.11987831443548203,
0.003684049705043435,
-0.002547483891248703,
-0.17934969067573547,
-0.10376111418008804,
0.05701525881886482,
0.11186414211988449,
-0.03020213358104229,
0.09250934422016144,
-0.04379047080874443,
-0.005303500685840845,
0.035051390528678894,
-0.07923123985528946,
-0.11705217510461807,
-0.09287779033184052,
-0.008270679973065853,
0.0009470771183259785,
-0.02526339516043663,
-0.05947897583246231,
-0.09188858419656754,
-0.013420656323432922,
0.14008119702339172,
0.0051790340803563595,
-0.05313033610582352,
-0.13920028507709503,
0.05545951798558235,
0.1342466026544571,
-0.05192245543003082,
0.013685199432075024,
0.0083387466147542,
0.11138490587472916,
0.04785631224513054,
-0.07621072232723236,
0.05904722958803177,
-0.07234976440668106,
-0.16587789356708527,
-0.057854149490594864,
0.12170001864433289,
0.0843518078327179,
0.058559808880090714,
0.003264078637585044,
0.027524374425411224,
-0.0013299495913088322,
-0.08169000595808029,
0.015987081453204155,
0.057445015758275986,
0.0891178548336029,
0.0365133099257946,
-0.09780927747488022,
0.06611310690641403,
-0.03790285810828209,
-0.009011789225041866,
0.1290818303823471,
0.20713159441947937,
-0.09613882005214691,
0.10337309539318085,
0.07237772643566132,
-0.08114797621965408,
-0.1873728483915329,
0.06863455474376678,
0.1278141736984253,
0.022975534200668335,
0.03922151401638985,
-0.21152648329734802,
0.131911501288414,
0.10422991961240768,
-0.01662343740463257,
0.03836711868643761,
-0.2976759374141693,
-0.12483040988445282,
0.07557056844234467,
0.10349193960428238,
0.035200100392103195,
-0.12133651971817017,
-0.02492753230035305,
-0.012197241187095642,
-0.1294575333595276,
0.13155502080917358,
-0.07688860595226288,
0.11856686323881149,
-0.0051529970951378345,
0.11900363117456436,
0.02467990852892399,
-0.036876529455184937,
0.13273951411247253,
0.07188441604375839,
0.08902537822723389,
-0.037103742361068726,
0.008778952993452549,
0.05657283961772919,
-0.06750492006540298,
0.0390443354845047,
-0.04051314666867256,
0.0696890577673912,
-0.17125029861927032,
-0.005448474083095789,
-0.08882514387369156,
0.03897783160209656,
-0.04974443465471268,
-0.05678514763712883,
-0.020089389756321907,
0.053791970014572144,
0.06889905780553818,
-0.03888951241970062,
0.022921279072761536,
0.008144600316882133,
0.060958895832300186,
0.09415125846862793,
0.10172698646783829,
-0.020007506012916565,
-0.11067263782024384,
0.01344224251806736,
-0.0077095963060855865,
0.05469091609120369,
-0.11254853755235672,
0.023321954533457756,
0.12970322370529175,
0.06049497425556183,
0.12094072252511978,
0.025417502969503403,
-0.03156597539782524,
-0.013787501491606236,
0.008822399191558361,
-0.11562037467956543,
-0.1275942027568817,
0.03937740623950958,
-0.039190296083688736,
-0.14913876354694366,
0.005211973562836647,
0.09992530941963196,
-0.03299615532159805,
-0.0198511965572834,
-0.009492823854088783,
0.021252689883112907,
-0.017202917486429214,
0.1988764852285385,
0.04048500582575798,
0.07248351722955704,
-0.10536716133356094,
0.11840087920427322,
0.053958456963300705,
-0.05782262235879898,
0.05127796158194542,
0.06233527511358261,
-0.10216914117336273,
-0.011720545589923859,
0.11591894924640656,
0.160017728805542,
-0.03435812518000603,
-0.014517746865749359,
-0.07735933363437653,
-0.09221723675727844,
0.060733862221241,
0.14712224900722504,
0.05275571346282959,
-0.011747227050364017,
-0.04942992329597473,
0.02632772922515869,
-0.12448102980852127,
0.081224225461483,
0.04704391956329346,
0.061660539358854294,
-0.09652354568243027,
0.10599275678396225,
-0.006875331979244947,
0.04324931651353836,
-0.017120813950896263,
0.023939885199069977,
-0.09823241829872131,
-0.011364326812326908,
-0.14851553738117218,
0.008739748038351536,
0.0015542444307357073,
0.012980268336832523,
-0.01993856206536293,
-0.05054960772395134,
-0.0234686192125082,
0.02891519106924534,
-0.08873648196458817,
-0.05445234477519989,
0.018825389444828033,
0.04147764667868614,
-0.14762094616889954,
-0.01609252765774727,
0.023480942472815514,
-0.09615431725978851,
0.08210301399230957,
0.05915377661585808,
0.013585226610302925,
0.027434546500444412,
-0.09595751762390137,
-0.047782476991415024,
0.003515668213367462,
0.027242686599493027,
0.08608254045248032,
-0.09307456016540527,
-0.01746082678437233,
-0.036813605576753616,
0.041895315051078796,
0.017429227009415627,
0.0912812277674675,
-0.11423833668231964,
0.005818390753120184,
-0.04952964931726456,
-0.03945169597864151,
-0.06688088923692703,
0.0456397719681263,
0.10822625458240509,
0.048293307423591614,
0.1607242375612259,
-0.071201853454113,
0.03544691577553749,
-0.1904553323984146,
-0.04031474143266678,
0.001949421362951398,
-0.04366566985845566,
-0.08895520120859146,
-0.05133894830942154,
0.09558061510324478,
-0.04905350133776665,
0.09752116352319717,
-0.002083094324916601,
0.09745986759662628,
0.031918611377477646,
-0.02094605378806591,
-0.04902828484773636,
0.00828892458230257,
0.15343523025512695,
0.04984424635767937,
-0.0143346656113863,
0.11201269924640656,
-0.007114053703844547,
0.04725605621933937,
0.06901369988918304,
0.2101432979106903,
0.15355363488197327,
0.00522890780121088,
0.04904162511229515,
0.06263521313667297,
-0.11823885142803192,
-0.14328669011592865,
0.1393013298511505,
-0.04777657985687256,
0.12548282742500305,
-0.06773623824119568,
0.2240695208311081,
0.02308260090649128,
-0.18660558760166168,
0.06542254984378815,
-0.060345374047756195,
-0.12524467706680298,
-0.11548034101724625,
-0.03246208280324936,
-0.07558275759220123,
-0.10897122323513031,
0.021508563309907913,
-0.12078487873077393,
0.0608629509806633,
0.12796998023986816,
0.017152564600110054,
0.02617482654750347,
0.1560553014278412,
-0.0340123251080513,
0.018723098561167717,
0.06948623061180115,
0.018845614045858383,
-0.004197744652628899,
-0.06292488425970078,
-0.06179990991950035,
0.05056971311569214,
0.026820892468094826,
0.07632603496313095,
-0.043144483119249344,
0.01054467260837555,
0.02114051580429077,
-0.016160914674401283,
-0.07266473770141602,
0.013747104443609715,
0.026193981990218163,
0.044134676456451416,
0.06038234010338783,
0.05265727639198303,
0.006934334523975849,
-0.036548394709825516,
0.28901439905166626,
-0.07861951738595963,
-0.09387557208538055,
-0.13438299298286438,
0.22604195773601532,
0.009588980115950108,
-0.020993582904338837,
0.07959414273500443,
-0.09484849870204926,
-0.024892952293157578,
0.1738569438457489,
0.13804395496845245,
-0.1011291965842247,
-0.02523821033537388,
-0.015423225238919258,
-0.013131224550306797,
-0.04565267264842987,
0.13033972680568695,
0.10830334573984146,
-0.009800284169614315,
-0.08150629699230194,
-0.023560887202620506,
-0.010824703611433506,
-0.05171875283122063,
-0.06828996539115906,
0.06228797510266304,
0.018353354185819626,
0.0015062710735946894,
-0.04303066059947014,
0.05674959346652031,
-0.0027093240059912205,
-0.24997705221176147,
0.03520967438817024,
-0.15637077391147614,
-0.17833520472049713,
-0.037981823086738586,
0.05369963124394417,
-0.005626271944493055,
0.03942443057894707,
-0.013798028230667114,
0.012386507354676723,
0.15461961925029755,
-0.034039899706840515,
-0.033563435077667236,
-0.11845710873603821,
0.11814011633396149,
-0.10565824806690216,
0.2004222571849823,
0.004053378943353891,
0.07213708013296127,
0.096785768866539,
0.01346250344067812,
-0.13453544676303864,
0.03944828733801842,
0.07574602961540222,
-0.10452525317668915,
0.012477684766054153,
0.15348321199417114,
-0.053149912506341934,
0.08263593167066574,
0.02255251631140709,
-0.10925553739070892,
-0.011988888494670391,
-0.04405955970287323,
-0.036787569522857666,
-0.07929641753435135,
-0.01737726852297783,
-0.06563732773065567,
0.15849646925926208,
0.22745387256145477,
-0.024676088243722916,
0.0166685301810503,
-0.09259171038866043,
0.012402277439832687,
0.04432128742337227,
0.050702519714832306,
-0.045427288860082626,
-0.1935582160949707,
0.024943513795733452,
0.033056583255529404,
0.019817223772406578,
-0.21062423288822174,
-0.08608153462409973,
0.044976454228162766,
-0.028787484392523766,
-0.04732616990804672,
0.10535355657339096,
0.03032110072672367,
0.04488835856318474,
-0.03389202803373337,
-0.09954994171857834,
-0.0402875654399395,
0.14268909394741058,
-0.16031505167484283,
-0.04204968735575676
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
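These settings can be expressed as the `TrainingArguments` below; this is a sketch under the stated values, not the original script, and the `output_dir` name is illustrative.
```python
# Sketch only: mirrors the hyperparameters listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    max_steps=200,               # training_steps: 200
    lr_scheduler_type="linear",
    warmup_ratio=0.1,            # ~20 warmup steps out of 200
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```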
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10189717262983322,
0.1348591148853302,
-0.0019889138638973236,
0.09340251982212067,
0.13636623322963715,
0.03413691744208336,
0.08314309269189835,
0.13912290334701538,
-0.08013676106929779,
0.053176190704107285,
0.07446851581335068,
0.059344902634620667,
0.04683567211031914,
0.12834954261779785,
-0.04233553633093834,
-0.21552757918834686,
0.003042177064344287,
-0.013918033801019192,
-0.06482440233230591,
0.10326992720365524,
0.08760318905115128,
-0.1111224964261055,
0.06951463222503662,
-0.02230260707437992,
-0.15450899302959442,
0.009908201172947884,
-0.03248876705765724,
-0.023870699107646942,
0.10518023371696472,
-0.007887695915997028,
0.0929245725274086,
0.015292265452444553,
0.14342477917671204,
-0.21266475319862366,
0.0004317720595281571,
0.07581207156181335,
0.045332714915275574,
0.08929572254419327,
0.03580458089709282,
0.01973591186106205,
0.027281705290079117,
-0.15156099200248718,
0.09285158663988113,
0.026035141199827194,
-0.07961273938417435,
-0.1331366002559662,
-0.09036742895841599,
0.02559015154838562,
0.0845004990696907,
0.08046423643827438,
0.007606842089444399,
0.12784568965435028,
-0.09944945573806763,
0.08266016095876694,
0.19609570503234863,
-0.2822944223880768,
-0.06593730300664902,
0.046182241290807724,
0.05158363655209541,
0.07387666404247284,
-0.11852158606052399,
-0.01788092590868473,
0.021907201036810875,
0.03185144439339638,
0.11129708588123322,
-0.020898178219795227,
-0.1296512484550476,
0.00859907828271389,
-0.12785780429840088,
-0.015968669205904007,
0.10948089510202408,
0.041835956275463104,
-0.047909583896398544,
-0.07824552804231644,
-0.06543368846178055,
-0.09724219888448715,
-0.02926536276936531,
-0.011750166304409504,
0.052729059010744095,
-0.05607578903436661,
-0.06729020178318024,
-0.04830310866236687,
-0.057721883058547974,
-0.08996938914060593,
0.003775634802877903,
0.1241091638803482,
0.04026690870523453,
0.01949560083448887,
-0.030886845663189888,
0.11705686151981354,
0.001597676775418222,
-0.12830883264541626,
-0.010392092168331146,
0.004430053289979696,
-0.12017279863357544,
-0.05182228982448578,
-0.032273922115564346,
0.02087574079632759,
0.013744988478720188,
0.15028701722621918,
-0.03694593533873558,
0.07579381763935089,
0.014679611660540104,
-0.01958082802593708,
-0.0203907061368227,
0.15822583436965942,
-0.03361016884446144,
-0.04773584008216858,
0.0014611538499593735,
0.09855245798826218,
-0.0019099544733762741,
-0.005817183293402195,
-0.08152133971452713,
-0.017099162563681602,
0.0775117501616478,
0.0665363073348999,
-0.0571952722966671,
0.038250505924224854,
-0.04168540611863136,
-0.024810051545500755,
0.020971765741705894,
-0.12901079654693604,
0.03629038855433464,
0.0062661501578986645,
-0.08092139661312103,
-0.05795903876423836,
0.008391650393605232,
-0.016203787177801132,
-0.026616258546710014,
0.07468035072088242,
-0.07137113809585571,
-0.006457267329096794,
-0.08703655004501343,
-0.07942360639572144,
0.0018624932272359729,
-0.13817524909973145,
-0.011967489495873451,
-0.04645659774541855,
-0.1990516185760498,
-0.03457046300172806,
0.050583478063344955,
-0.07692574709653854,
-0.04541851580142975,
-0.05100508779287338,
-0.07794032245874405,
0.009419466368854046,
-0.007230713963508606,
0.18963240087032318,
-0.057878945022821426,
0.08100836724042892,
-0.015277385711669922,
0.046741876751184464,
0.022427722811698914,
0.051045868545770645,
-0.07935535907745361,
0.027585530653595924,
-0.14713692665100098,
0.08732251822948456,
-0.10421005636453629,
0.01336990762501955,
-0.12715573608875275,
-0.08717404305934906,
0.05849588289856911,
-0.021212933585047722,
0.06898441910743713,
0.13963839411735535,
-0.19219906628131866,
0.001862971344962716,
0.11847485601902008,
-0.0374203659594059,
-0.04241029545664787,
0.08060694485902786,
-0.059237297624349594,
0.03679656609892845,
0.05490544065833092,
0.17999869585037231,
0.08264843374490738,
-0.142264723777771,
0.008428839966654778,
0.043278321623802185,
0.06087183579802513,
0.006686486303806305,
0.041564829647541046,
0.003340387949720025,
0.023842619732022285,
0.009451462887227535,
-0.09508514404296875,
-0.023839250206947327,
-0.08845636248588562,
-0.07263567298650742,
-0.04853526130318642,
-0.09292008727788925,
0.036677662283182144,
0.011232107877731323,
0.01936160773038864,
-0.06456165760755539,
-0.10798116028308868,
0.09708493947982788,
0.12571217119693756,
-0.05855879932641983,
0.010715633630752563,
-0.07912269234657288,
0.019450554624199867,
-0.019969165325164795,
-0.027279457077383995,
-0.19458289444446564,
-0.12711192667484283,
0.05018535256385803,
-0.04314018040895462,
0.024651145562529564,
0.03275485709309578,
0.06572804600000381,
0.05538706108927727,
-0.04476487636566162,
-0.023651642724871635,
-0.06807505339384079,
-0.002485736971721053,
-0.11350615322589874,
-0.20074492692947388,
-0.07310397177934647,
-0.037451665848493576,
0.14484189450740814,
-0.20208251476287842,
0.005652885418385267,
-0.021406780928373337,
0.10863982141017914,
0.01720740832388401,
-0.04983089864253998,
0.01181622501462698,
0.0326593853533268,
0.015244588255882263,
-0.09876804798841476,
0.05052913725376129,
0.010608790442347527,
-0.06882277131080627,
-0.044854555279016495,
-0.10162338614463806,
0.0006445262697525322,
0.0576382614672184,
0.08424769341945648,
-0.10129719972610474,
-0.018028998747467995,
-0.05240219458937645,
-0.03792759031057358,
-0.0832725539803505,
0.014473747462034225,
0.20537233352661133,
0.03323974460363388,
0.12384086847305298,
-0.05782784894108772,
-0.07983329147100449,
-0.004007588140666485,
0.02197444625198841,
0.034528885036706924,
0.10111233592033386,
0.07434099167585373,
-0.07888199388980865,
0.08139407634735107,
0.08015374839305878,
-0.04195484519004822,
0.11459580063819885,
-0.05487833544611931,
-0.0755424052476883,
-0.016217097640037537,
0.002912241732701659,
-0.029352335259318352,
0.1489432454109192,
-0.08252053707838058,
0.0005152208614163101,
0.037897899746894836,
0.01866433583199978,
0.008023780770599842,
-0.17682790756225586,
-0.0064614093862473965,
0.01569473370909691,
-0.06828813254833221,
-0.04960562661290169,
-0.03225104138255119,
0.03332943469285965,
0.09208609908819199,
0.02849947288632393,
-0.032946255058050156,
0.023249130696058273,
-0.011524281464517117,
-0.061172835528850555,
0.19403232634067535,
-0.11187224835157394,
-0.09316664189100266,
-0.09901698678731918,
0.017981810495257378,
-0.03786337003111839,
-0.035880714654922485,
0.0032660949509590864,
-0.09486818313598633,
-0.05362791568040848,
-0.08706402778625488,
-0.0203021802008152,
-0.012345656752586365,
-0.005606230814009905,
0.024006081745028496,
-0.014866065233945847,
0.0782359167933464,
-0.13254937529563904,
0.004369062837213278,
-0.02918204478919506,
-0.10198667645454407,
0.019095301628112793,
0.06221610680222511,
0.0813763439655304,
0.10154193639755249,
-0.011450025252997875,
0.01602986454963684,
-0.03054692968726158,
0.22957289218902588,
-0.06515457481145859,
0.01774510182440281,
0.08857549726963043,
-0.0025986123364418745,
0.054806895554065704,
0.14013929665088654,
0.03339089825749397,
-0.1082959771156311,
0.028585392981767654,
0.0785263329744339,
-0.01862035132944584,
-0.2518285810947418,
-0.025676675140857697,
-0.019610092043876648,
-0.07177161425352097,
0.08627986162900925,
0.03837946802377701,
-0.05379123240709305,
0.03505250811576843,
0.014691616408526897,
0.005921951029449701,
-0.03470338508486748,
0.06759988516569138,
0.09309318661689758,
0.036109842360019684,
0.10250592231750488,
-0.020986484363675117,
-0.009206496179103851,
0.06225130707025528,
0.026391038671135902,
0.272098571062088,
-0.04278147965669632,
0.134741872549057,
0.030175706371665,
0.14916662871837616,
-0.02328239195048809,
0.05069812759757042,
0.01077831070870161,
-0.00047059881035238504,
-0.005321388598531485,
-0.04792865738272667,
-0.01982734724879265,
0.001707307412289083,
-0.03193684294819832,
0.02254505828022957,
-0.07298007607460022,
0.03123532608151436,
0.016693277284502983,
0.3136419951915741,
0.04363863170146942,
-0.27412527799606323,
-0.07305831462144852,
0.0038599157705903053,
-0.04762125015258789,
-0.07661093771457672,
0.005171905737370253,
0.14650164544582367,
-0.12867885828018188,
0.029136423021554947,
-0.056329336017370224,
0.08999147266149521,
-0.048162270337343216,
0.009762857109308243,
0.06800645589828491,
0.14307913184165955,
-0.009654604829847813,
0.07364621013402939,
-0.2057456225156784,
0.2330743819475174,
0.026334071531891823,
0.10644539445638657,
-0.06152850762009621,
0.013388472609221935,
0.00782795250415802,
0.04302012175321579,
0.11968994140625,
0.0030347981955856085,
-0.002419454976916313,
-0.17937426269054413,
-0.10272246599197388,
0.05826956778764725,
0.11118005216121674,
-0.029003532603383064,
0.0934879332780838,
-0.04276259243488312,
-0.005913987755775452,
0.03456292301416397,
-0.0805870071053505,
-0.11757874488830566,
-0.09388626366853714,
-0.008572544902563095,
0.0018614375730976462,
-0.024806279689073563,
-0.05892939493060112,
-0.09260256588459015,
-0.015420420095324516,
0.1393755078315735,
0.006244957447052002,
-0.05341712385416031,
-0.1398877650499344,
0.05454535037279129,
0.1345098465681076,
-0.05058056488633156,
0.014043871313333511,
0.009160853922367096,
0.11034069955348969,
0.04905390366911888,
-0.07584813237190247,
0.05917029455304146,
-0.07306017726659775,
-0.16500325500965118,
-0.05779392644762993,
0.12132570892572403,
0.08366952836513519,
0.05767875164747238,
0.0028826503548771143,
0.02777007222175598,
-0.0017808584962040186,
-0.0823979452252388,
0.016920877620577812,
0.05528251826763153,
0.09073658287525177,
0.03569934517145157,
-0.09881091117858887,
0.0656629353761673,
-0.0373082309961319,
-0.008168189786374569,
0.12701484560966492,
0.20436912775039673,
-0.09516485035419464,
0.10153063386678696,
0.07280624657869339,
-0.0806247740983963,
-0.18669171631336212,
0.06965764611959457,
0.12635685503482819,
0.02335948869585991,
0.037450604140758514,
-0.21253594756126404,
0.1334388554096222,
0.1039106696844101,
-0.015389205887913704,
0.0406763032078743,
-0.2945367395877838,
-0.12413528561592102,
0.07556511461734772,
0.1043747141957283,
0.03536340221762657,
-0.12187375873327255,
-0.02426338382065296,
-0.012255197390913963,
-0.12928901612758636,
0.13039569556713104,
-0.07972944527864456,
0.11870507895946503,
-0.0055364943109452724,
0.1192093938589096,
0.023719126358628273,
-0.03715796023607254,
0.13175269961357117,
0.07301775366067886,
0.08975640684366226,
-0.03738085925579071,
0.00935648288577795,
0.05661487579345703,
-0.06663321703672409,
0.03912116959691048,
-0.04147880896925926,
0.06884779036045074,
-0.17200720310211182,
-0.0053587728179991245,
-0.08975586295127869,
0.038515109568834305,
-0.04977262020111084,
-0.056269194930791855,
-0.018734445795416832,
0.05403291806578636,
0.06779715418815613,
-0.03908567130565643,
0.02146090939640999,
0.007381516508758068,
0.06167144328355789,
0.093123659491539,
0.10246346145868301,
-0.021308209747076035,
-0.11198709905147552,
0.014371106401085854,
-0.00820915587246418,
0.0542718842625618,
-0.11210111528635025,
0.02191278524696827,
0.13054928183555603,
0.05972540006041527,
0.12082395702600479,
0.02618921920657158,
-0.030407559126615524,
-0.014333357103168964,
0.010004930198192596,
-0.11594611406326294,
-0.12618833780288696,
0.03968171775341034,
-0.043393511325120926,
-0.14906568825244904,
0.006683706305921078,
0.0997561439871788,
-0.03281727433204651,
-0.019738104194402695,
-0.009682847186923027,
0.021007388830184937,
-0.017287667840719223,
0.20018446445465088,
0.04061673954129219,
0.0723077729344368,
-0.10659251362085342,
0.11807096749544144,
0.05394411459565163,
-0.0591011717915535,
0.051175542175769806,
0.06389108300209045,
-0.10336893051862717,
-0.01299501396715641,
0.11459451168775558,
0.16131453216075897,
-0.03251499682664871,
-0.015292310155928135,
-0.07876919955015182,
-0.09278585016727448,
0.06088140606880188,
0.14557667076587677,
0.05211155116558075,
-0.01282027829438448,
-0.04950988292694092,
0.025889284908771515,
-0.1251130849123001,
0.08102437108755112,
0.04573668912053108,
0.06189568713307381,
-0.0965539738535881,
0.10514817386865616,
-0.006478685420006514,
0.04241861775517464,
-0.017085451632738113,
0.024514788761734962,
-0.09894760698080063,
-0.011640951968729496,
-0.148642435669899,
0.007644870784133673,
0.0008806156110949814,
0.013573208823800087,
-0.02035074308514595,
-0.049149852246046066,
-0.02415434829890728,
0.02924269624054432,
-0.08886593580245972,
-0.054018329828977585,
0.020727762952446938,
0.042240407317876816,
-0.14688432216644287,
-0.015450939536094666,
0.022132139652967453,
-0.09641366451978683,
0.08217281848192215,
0.059096310287714005,
0.01328540500253439,
0.028027597814798355,
-0.09358079731464386,
-0.04816627502441406,
0.003035284811630845,
0.026359137147665024,
0.08674163371324539,
-0.09274792671203613,
-0.016858121380209923,
-0.036902036517858505,
0.04226144775748253,
0.0177648663520813,
0.0900769904255867,
-0.11362186074256897,
0.00604541739448905,
-0.048722825944423676,
-0.03848050907254219,
-0.06753373891115189,
0.0462116077542305,
0.10920916497707367,
0.04700819030404091,
0.1610490381717682,
-0.0713484063744545,
0.034815553575754166,
-0.19061435759067535,
-0.040859244763851166,
0.0016282135620713234,
-0.04526696354150772,
-0.08810114860534668,
-0.05093849077820778,
0.09644156694412231,
-0.04977519065141678,
0.09924157708883286,
-0.0026324272621423006,
0.0979585275053978,
0.0315803661942482,
-0.02088000439107418,
-0.05059734359383583,
0.009208998642861843,
0.15291430056095123,
0.049313172698020935,
-0.015142890624701977,
0.11129607260227203,
-0.006118435878306627,
0.04622425138950348,
0.07035048305988312,
0.20815439522266388,
0.15277479588985443,
0.006093010306358337,
0.04896920174360275,
0.06341923773288727,
-0.11928141862154007,
-0.14211325347423553,
0.14056554436683655,
-0.047453880310058594,
0.12666042149066925,
-0.06888401508331299,
0.22216807305812836,
0.02233186922967434,
-0.18577487766742706,
0.06567889451980591,
-0.06158898025751114,
-0.12544482946395874,
-0.11387744545936584,
-0.031236493960022926,
-0.07470438629388809,
-0.1097453162074089,
0.02184447832405567,
-0.12148310244083405,
0.05958772823214531,
0.12925364077091217,
0.018030941486358643,
0.02558882348239422,
0.1577063947916031,
-0.03259210288524628,
0.019461970776319504,
0.0697781965136528,
0.01852749101817608,
-0.004633003380149603,
-0.06238787621259689,
-0.06059873104095459,
0.04966538026928902,
0.025118838995695114,
0.07592210918664932,
-0.04393535107374191,
0.00977408979088068,
0.021899035200476646,
-0.0153884869068861,
-0.07221648097038269,
0.013801825232803822,
0.02630363218486309,
0.04387891665101051,
0.06170288100838661,
0.052062563598155975,
0.006082427687942982,
-0.03675119951367378,
0.28760501742362976,
-0.07860378921031952,
-0.0922626182436943,
-0.13531747460365295,
0.22518667578697205,
0.009585075080394745,
-0.021724918857216835,
0.07876855880022049,
-0.0941278487443924,
-0.023532606661319733,
0.17546004056930542,
0.14026330411434174,
-0.10021065920591354,
-0.025527365505695343,
-0.014767667278647423,
-0.013159061782062054,
-0.045541249215602875,
0.13057759404182434,
0.1083807647228241,
-0.010266450233757496,
-0.08138089627027512,
-0.02350044436752796,
-0.010066385380923748,
-0.05181138962507248,
-0.0680646002292633,
0.06197570636868477,
0.019496304914355278,
0.001183829503133893,
-0.0423131138086319,
0.056975968182086945,
-0.002598936902359128,
-0.25006306171417236,
0.0364559106528759,
-0.1557493656873703,
-0.1789184957742691,
-0.03910483047366142,
0.05298382043838501,
-0.004607571288943291,
0.040146443992853165,
-0.01328439824283123,
0.011887956410646439,
0.15574803948402405,
-0.03405703976750374,
-0.032886140048503876,
-0.1195300966501236,
0.11834365874528885,
-0.1054454818367958,
0.1989412158727646,
0.0033228066749870777,
0.07214460521936417,
0.09743267297744751,
0.013024468906223774,
-0.13448871672153473,
0.04004694148898125,
0.07521989196538925,
-0.10502822697162628,
0.013125458732247353,
0.15281865000724792,
-0.05302167311310768,
0.08185447007417679,
0.021149883046746254,
-0.10819029808044434,
-0.011297888122498989,
-0.04344853386282921,
-0.03639456629753113,
-0.0791134461760521,
-0.01679549179971218,
-0.06527851521968842,
0.15889810025691986,
0.22908084094524384,
-0.024694137275218964,
0.016355635598301888,
-0.09251534193754196,
0.01259488333016634,
0.04483799636363983,
0.04896415024995804,
-0.046108026057481766,
-0.19395974278450012,
0.025196565315127373,
0.033197786659002304,
0.019456278532743454,
-0.2118125706911087,
-0.08478032052516937,
0.04500648006796837,
-0.02850513346493244,
-0.04787092283368111,
0.10523581504821777,
0.031422343105077744,
0.04579976573586464,
-0.034606438130140305,
-0.09639108180999756,
-0.04031059145927429,
0.14243146777153015,
-0.16069073975086212,
-0.04259832948446274
] |
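Each row's final `embeddings` column, like the 768-dimensional vector that closes above, is a dense embedding of the model-card text. As a minimal sketch (assuming two such rows have been parsed into NumPy arrays; the vectors below are placeholders, not real rows), cosine similarity compares two cards:

```python
# Minimal sketch: cosine similarity between two 768-d card embeddings.
import numpy as np

def cosine_similarity(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    # Dot product of the vectors divided by the product of their norms.
    return float(np.dot(emb_a, emb_b) / (np.linalg.norm(emb_a) * np.linalg.norm(emb_b)))

emb_a = np.random.randn(768)  # placeholder for one row's embeddings column
emb_b = np.random.randn(768)  # placeholder for another row's embeddings column
print(cosine_similarity(emb_a, emb_b))
```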
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a minimal configuration sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
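As a minimal sketch (assuming the standard `Trainer` API; the actual training script is not included in this card, and the output directory name is a placeholder), the hyperparameters above map onto `transformers.TrainingArguments` as follows:

```python
# Hypothetical reconstruction of the reported hyperparameters; not the
# authors' actual training script.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    max_steps=200,                  # training_steps: 200
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```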
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10178697854280472,
0.13452737033367157,
-0.00195504748262465,
0.09295835345983505,
0.13667839765548706,
0.03357855603098869,
0.08275282382965088,
0.13979429006576538,
-0.07907605171203613,
0.05361323058605194,
0.07364438474178314,
0.0606226846575737,
0.04735708236694336,
0.12789805233478546,
-0.04239572584629059,
-0.2152659147977829,
0.003005857579410076,
-0.013436747714877129,
-0.064724862575531,
0.1034029871225357,
0.08779542148113251,
-0.11136387288570404,
0.06855007261037827,
-0.02272144705057144,
-0.15485481917858124,
0.01000349409878254,
-0.03295646607875824,
-0.023383021354675293,
0.10482914745807648,
-0.008440784178674221,
0.09242131561040878,
0.015271816402673721,
0.14320911467075348,
-0.21354161202907562,
0.0004353405674919486,
0.07624849677085876,
0.04558388516306877,
0.08955148607492447,
0.03675542026758194,
0.01989748328924179,
0.029022786766290665,
-0.15148715674877167,
0.09264672547578812,
0.026607267558574677,
-0.0794602781534195,
-0.1309981495141983,
-0.09082713723182678,
0.024989333003759384,
0.08495844155550003,
0.08120984584093094,
0.007062356919050217,
0.12944325804710388,
-0.09982412308454514,
0.08293132483959198,
0.19825603067874908,
-0.28045281767845154,
-0.06600344926118851,
0.047353096306324005,
0.05164571478962898,
0.07257521152496338,
-0.11956136673688889,
-0.018872609362006187,
0.021781273186206818,
0.03147568181157112,
0.11001837998628616,
-0.020161503925919533,
-0.13147756457328796,
0.008186145685613155,
-0.12830011546611786,
-0.01609671115875244,
0.10773400217294693,
0.04213830456137657,
-0.04712290316820145,
-0.07819361984729767,
-0.06598557531833649,
-0.09629219025373459,
-0.029191384091973305,
-0.011747551150619984,
0.05271834135055542,
-0.056159473955631256,
-0.06666933000087738,
-0.048274606466293335,
-0.05735911801457405,
-0.09118185192346573,
0.004083259496837854,
0.12369272857904434,
0.0402291938662529,
0.018988575786352158,
-0.0318366140127182,
0.11690263450145721,
0.0015342672122642398,
-0.12890027463436127,
-0.011436504311859608,
0.00527582224458456,
-0.12047620117664337,
-0.05203229561448097,
-0.03160165250301361,
0.01843463070690632,
0.013149769976735115,
0.14778777956962585,
-0.037064168602228165,
0.07628846913576126,
0.013711975887417793,
-0.019549164921045303,
-0.020961564034223557,
0.1588452011346817,
-0.03272860124707222,
-0.04571383818984032,
0.000626922701485455,
0.09864349663257599,
-0.0025248599704355,
-0.005350809078663588,
-0.08090353012084961,
-0.017344405874609947,
0.0784335508942604,
0.06632400304079056,
-0.05636508762836456,
0.03768004849553108,
-0.04215360805392265,
-0.024982310831546783,
0.021599628031253815,
-0.12877437472343445,
0.03668750077486038,
0.0057224128395318985,
-0.08143457770347595,
-0.05924420803785324,
0.008609590120613575,
-0.01640939526259899,
-0.02703195996582508,
0.07554268836975098,
-0.07167105376720428,
-0.006127539556473494,
-0.0880240872502327,
-0.07907713204622269,
0.002476328518241644,
-0.13982199132442474,
-0.012250993400812149,
-0.045452702790498734,
-0.19954387843608856,
-0.03474361076951027,
0.04993682727217674,
-0.07711692899465561,
-0.04485483840107918,
-0.05185289308428764,
-0.07870005816221237,
0.00983777828514576,
-0.006672328803688288,
0.19080403447151184,
-0.05734734609723091,
0.08189236372709274,
-0.0155643867328763,
0.046336930245161057,
0.023090677335858345,
0.051814768463373184,
-0.08074094355106354,
0.027209052816033363,
-0.14653484523296356,
0.0877414271235466,
-0.10518775135278702,
0.013204636983573437,
-0.12814448773860931,
-0.0862790122628212,
0.05712173879146576,
-0.021913688629865646,
0.07004928588867188,
0.14056266844272614,
-0.19259323179721832,
0.002309059491381049,
0.11906429380178452,
-0.03793745115399361,
-0.04286475107073784,
0.07942723482847214,
-0.05892074480652809,
0.03663381561636925,
0.054291896522045135,
0.18052764236927032,
0.08217678219079971,
-0.14193221926689148,
0.00721267145127058,
0.04272999241948128,
0.06164654344320297,
0.0063917082734405994,
0.04119690880179405,
0.0031953686848282814,
0.025486290454864502,
0.009173755533993244,
-0.09496136009693146,
-0.023995880037546158,
-0.08880896121263504,
-0.07214583456516266,
-0.04940570145845413,
-0.09304323047399521,
0.03594567999243736,
0.013112685643136501,
0.01894543319940567,
-0.06406352669000626,
-0.10749322921037674,
0.09712271392345428,
0.1255478709936142,
-0.057764071971178055,
0.010955153964459896,
-0.07840663939714432,
0.01830684021115303,
-0.02046005241572857,
-0.02709796279668808,
-0.1964375376701355,
-0.12766766548156738,
0.05075477436184883,
-0.04390571638941765,
0.025171173736453056,
0.031755805015563965,
0.06563135981559753,
0.05421000346541405,
-0.04496752843260765,
-0.023723391816020012,
-0.06885866820812225,
-0.0030185983050614595,
-0.11395760625600815,
-0.20079435408115387,
-0.07319246232509613,
-0.03826318308711052,
0.14226429164409637,
-0.20136253535747528,
0.005348519887775183,
-0.021626293659210205,
0.1095796450972557,
0.01731887273490429,
-0.05040815472602844,
0.012137681245803833,
0.03242392838001251,
0.014891620725393295,
-0.09911393374204636,
0.050222430378198624,
0.009663485921919346,
-0.06867968291044235,
-0.044942457228899,
-0.10158337652683258,
-0.0018190396949648857,
0.057143472135066986,
0.08593656867742538,
-0.10151524096727371,
-0.01831558346748352,
-0.05232453718781471,
-0.03809838742017746,
-0.08430156856775284,
0.015634585171937943,
0.20486249029636383,
0.03373727574944496,
0.12305399030447006,
-0.058213215321302414,
-0.08024140447378159,
-0.003535877214744687,
0.022133035585284233,
0.03375992178916931,
0.10260391980409622,
0.07613413780927658,
-0.08145351707935333,
0.0824688971042633,
0.08080336451530457,
-0.041135065257549286,
0.11496362835168839,
-0.05527280271053314,
-0.07599583268165588,
-0.014571587555110455,
0.003449450945481658,
-0.02905985713005066,
0.14857344329357147,
-0.08209054917097092,
0.0010526240803301334,
0.037789084017276764,
0.018897665664553642,
0.007821235805749893,
-0.1769680678844452,
-0.006484889425337315,
0.0155605748295784,
-0.06801864504814148,
-0.050082094967365265,
-0.03202994912862778,
0.033418264240026474,
0.09240349382162094,
0.028149500489234924,
-0.03223591297864914,
0.022505300119519234,
-0.011733854189515114,
-0.060969676822423935,
0.19494366645812988,
-0.11151214689016342,
-0.09200876206159592,
-0.09752289205789566,
0.01912912353873253,
-0.03741903975605965,
-0.03616069629788399,
0.0033443672582507133,
-0.09609381854534149,
-0.05344749987125397,
-0.08668555319309235,
-0.019996097311377525,
-0.011743303388357162,
-0.00668105436488986,
0.023036271333694458,
-0.014637873508036137,
0.07758644968271255,
-0.1333221197128296,
0.004723209887742996,
-0.029651420190930367,
-0.10206028074026108,
0.018548183143138885,
0.06173131987452507,
0.08155632019042969,
0.10211614519357681,
-0.012087319977581501,
0.016034306958317757,
-0.03049781545996666,
0.2285536676645279,
-0.0658491924405098,
0.017126062884926796,
0.08834946155548096,
-0.0013880431652069092,
0.05545787885785103,
0.14017106592655182,
0.03269558399915695,
-0.10821690410375595,
0.028976941481232643,
0.0791962593793869,
-0.018896041437983513,
-0.25292274355888367,
-0.026111910119652748,
-0.01982394978404045,
-0.07228690385818481,
0.08627991378307343,
0.03875226899981499,
-0.05398934334516525,
0.03504057228565216,
0.013707061298191547,
0.004098162055015564,
-0.0347866490483284,
0.0680684745311737,
0.09068682044744492,
0.037209752947092056,
0.10207048058509827,
-0.021037397906184196,
-0.008656988851726055,
0.0613248310983181,
0.02652699686586857,
0.2727198600769043,
-0.043335217982530594,
0.1344069540500641,
0.02969421073794365,
0.14882639050483704,
-0.023947186768054962,
0.0508890338242054,
0.010274031199514866,
-0.0005880365497432649,
-0.0051350826397538185,
-0.04778645187616348,
-0.02003377303481102,
0.001519824261777103,
-0.032782938331365585,
0.02312207594513893,
-0.07271959632635117,
0.031665511429309845,
0.016271600499749184,
0.3140765428543091,
0.043670203536748886,
-0.2728669345378876,
-0.07220618426799774,
0.0037634659092873335,
-0.047798968851566315,
-0.07743709534406662,
0.00474048824980855,
0.145600363612175,
-0.12820474803447723,
0.02897445298731327,
-0.056734487414360046,
0.09113533794879913,
-0.04633912816643715,
0.00906029250472784,
0.06727561354637146,
0.1433527022600174,
-0.009502683766186237,
0.07432280480861664,
-0.206979900598526,
0.23511239886283875,
0.026468360796570778,
0.1063610389828682,
-0.06213434785604477,
0.01311464048922062,
0.007047990802675486,
0.04121769964694977,
0.1202172264456749,
0.002884287154302001,
-0.002565948059782386,
-0.17857204377651215,
-0.10181332379579544,
0.057889193296432495,
0.11209364235401154,
-0.029207831248641014,
0.0926484689116478,
-0.04269149526953697,
-0.00582997826859355,
0.03551757335662842,
-0.0808233693242073,
-0.11761835962533951,
-0.09391201287508011,
-0.00832361914217472,
0.0011879726080223918,
-0.025573717430233955,
-0.05849851667881012,
-0.09225837141275406,
-0.014057682827115059,
0.13899314403533936,
0.0065339552238583565,
-0.05281426012516022,
-0.1397276669740677,
0.05566942319273949,
0.13470712304115295,
-0.051139041781425476,
0.013987942598760128,
0.008547977544367313,
0.11005108058452606,
0.04814189672470093,
-0.0763128474354744,
0.060113415122032166,
-0.07292891293764114,
-0.16545213758945465,
-0.05774274840950966,
0.12104335427284241,
0.08431129157543182,
0.05821007862687111,
0.0026773419231176376,
0.02798723615705967,
-0.0016492749564349651,
-0.08198574930429459,
0.01705155149102211,
0.055318597704172134,
0.09041019529104233,
0.03653275966644287,
-0.09842777252197266,
0.06437358260154724,
-0.03824375569820404,
-0.008778220973908901,
0.12673094868659973,
0.2051183581352234,
-0.09505273401737213,
0.10173746198415756,
0.07353052496910095,
-0.08090225607156754,
-0.18734554946422577,
0.07064718008041382,
0.12687934935092926,
0.022719599306583405,
0.03833609074354172,
-0.2122189700603485,
0.1332305371761322,
0.10334338247776031,
-0.015815546736121178,
0.039717040956020355,
-0.2956703007221222,
-0.12407661229372025,
0.07568935304880142,
0.10424834489822388,
0.032709185034036636,
-0.1221228837966919,
-0.024383671581745148,
-0.011809062212705612,
-0.1286955624818802,
0.13141976296901703,
-0.07786092162132263,
0.11883597075939178,
-0.0059045529924333096,
0.11965875327587128,
0.024133699014782906,
-0.03770161047577858,
0.13019315898418427,
0.07315314561128616,
0.09009998291730881,
-0.03726602718234062,
0.009249433875083923,
0.05680684745311737,
-0.06665589660406113,
0.03930943086743355,
-0.041525598615407944,
0.06954091042280197,
-0.17050223052501678,
-0.00494551844894886,
-0.09060323983430862,
0.03920791670680046,
-0.04980095103383064,
-0.05637866631150246,
-0.018952926620841026,
0.05447951331734657,
0.06813199073076248,
-0.039473116397857666,
0.02354426681995392,
0.00735910190269351,
0.06378144025802612,
0.09296930581331253,
0.10310129076242447,
-0.02089935541152954,
-0.11089524626731873,
0.013740857131779194,
-0.007413150742650032,
0.05448048189282417,
-0.11318664997816086,
0.021552998572587967,
0.13050560653209686,
0.061276376247406006,
0.12060953676700592,
0.026485919952392578,
-0.03138442710042,
-0.014433668926358223,
0.008990297093987465,
-0.11524983495473862,
-0.12710973620414734,
0.039846207946538925,
-0.038725461810827255,
-0.1495904177427292,
0.007234025280922651,
0.09780296683311462,
-0.03367462754249573,
-0.01983124390244484,
-0.009392916224896908,
0.02146757021546364,
-0.01636442169547081,
0.20081132650375366,
0.040976617485284805,
0.07298857718706131,
-0.1063975989818573,
0.11846239119768143,
0.05363753065466881,
-0.06024082750082016,
0.051443491131067276,
0.06422792375087738,
-0.10254913568496704,
-0.01300093811005354,
0.11633390933275223,
0.15984266996383667,
-0.032749876379966736,
-0.013777361251413822,
-0.07724275439977646,
-0.09238672256469727,
0.06110994517803192,
0.1473919302225113,
0.05204317346215248,
-0.013114222325384617,
-0.04906123876571655,
0.026262624189257622,
-0.12600670754909515,
0.08152158558368683,
0.04536667466163635,
0.06224516034126282,
-0.09620247036218643,
0.10307444632053375,
-0.0065263984724879265,
0.042702384293079376,
-0.01690271869301796,
0.02506001479923725,
-0.09861453622579575,
-0.011444670148193836,
-0.14593134820461273,
0.007256637327373028,
0.0009775384096428752,
0.013066861778497696,
-0.02100808545947075,
-0.04999744892120361,
-0.022955097258090973,
0.029293524101376534,
-0.0890115275979042,
-0.05469809100031853,
0.020070530474185944,
0.041771478950977325,
-0.14754420518875122,
-0.01557291392236948,
0.022505396977066994,
-0.09561251103878021,
0.08088414371013641,
0.05825292691588402,
0.013284893706440926,
0.028467843309044838,
-0.09688286483287811,
-0.04787667468190193,
0.0033899194095283747,
0.026605017483234406,
0.08671016246080399,
-0.0918751135468483,
-0.017143378034234047,
-0.03718949109315872,
0.042477358132600784,
0.017496872693300247,
0.08919170498847961,
-0.11426883935928345,
0.005390560254454613,
-0.05017681419849396,
-0.03925253823399544,
-0.06683403253555298,
0.04699161648750305,
0.10958242416381836,
0.04712049663066864,
0.16047176718711853,
-0.07094845920801163,
0.03574954345822334,
-0.19057907164096832,
-0.040675658732652664,
0.0018411423079669476,
-0.045199450105428696,
-0.0886692926287651,
-0.050051018595695496,
0.09676174074411392,
-0.04991219937801361,
0.09634801745414734,
-0.00215091067366302,
0.09896518290042877,
0.032079339027404785,
-0.02023780532181263,
-0.050905030220746994,
0.009045287035405636,
0.15258993208408356,
0.049119286239147186,
-0.014569220133125782,
0.11336664110422134,
-0.005630593281239271,
0.04567639157176018,
0.07155520468950272,
0.21001505851745605,
0.15422910451889038,
0.004652600269764662,
0.048872798681259155,
0.063435859978199,
-0.11985404044389725,
-0.14193937182426453,
0.14029496908187866,
-0.046791546046733856,
0.12645947933197021,
-0.06946513801813126,
0.22312790155410767,
0.022193901240825653,
-0.18533940613269806,
0.0664239376783371,
-0.06218860670924187,
-0.12530647218227386,
-0.1139458417892456,
-0.03125303238630295,
-0.07467484474182129,
-0.1092492863535881,
0.022247955203056335,
-0.12154056876897812,
0.060544710606336594,
0.1293284296989441,
0.017719997093081474,
0.025668198242783546,
0.15885432064533234,
-0.03212610259652138,
0.01953026093542576,
0.07037326693534851,
0.01875430718064308,
-0.0050863600336015224,
-0.06307094544172287,
-0.0600522942841053,
0.05090200528502464,
0.025401027873158455,
0.07586200535297394,
-0.04361617937684059,
0.008596324361860752,
0.021514838561415672,
-0.015444464981555939,
-0.07253070920705795,
0.014001576229929924,
0.02671385183930397,
0.04388735070824623,
0.062130920588970184,
0.05186883732676506,
0.006781678181141615,
-0.03700000047683716,
0.2897214889526367,
-0.0790477842092514,
-0.09227882325649261,
-0.1339585781097412,
0.22593605518341064,
0.010781001299619675,
-0.021732516586780548,
0.07874462753534317,
-0.09521448612213135,
-0.023782672360539436,
0.17349641025066376,
0.13812343776226044,
-0.09929634630680084,
-0.025229759514331818,
-0.015199699439108372,
-0.012939919717609882,
-0.045650728046894073,
0.13006095588207245,
0.10899478942155838,
-0.009643089026212692,
-0.08175186067819595,
-0.022795887663960457,
-0.010183647274971008,
-0.052728697657585144,
-0.06709479540586472,
0.06184972822666168,
0.02001877874135971,
0.0007621568511240184,
-0.04245079681277275,
0.057352758944034576,
-0.003056018380448222,
-0.24951741099357605,
0.03521096706390381,
-0.15718713402748108,
-0.17840291559696198,
-0.03924965113401413,
0.052761420607566833,
-0.004399739671498537,
0.03992800787091255,
-0.013017240911722183,
0.012588794343173504,
0.15300697088241577,
-0.033853679895401,
-0.03302589803934097,
-0.12059998512268066,
0.11753617227077484,
-0.10622111707925797,
0.19977295398712158,
0.0032808620017021894,
0.07102248072624207,
0.0973600223660469,
0.013045922853052616,
-0.13518713414669037,
0.0395343117415905,
0.07569398730993271,
-0.105489082634449,
0.013127720914781094,
0.15399721264839172,
-0.052866701036691666,
0.08208754658699036,
0.02097046747803688,
-0.11001210659742355,
-0.010989772155880928,
-0.04483554884791374,
-0.03575028106570244,
-0.07971695065498352,
-0.014721278101205826,
-0.06489847600460052,
0.15918205678462982,
0.22927746176719666,
-0.0244105514138937,
0.015998495742678642,
-0.09310869872570038,
0.012250572443008423,
0.0436604879796505,
0.05012783780694008,
-0.045818161219358444,
-0.19383151829242706,
0.02563047781586647,
0.03313108906149864,
0.019382961094379425,
-0.21185345947742462,
-0.08442746102809906,
0.04555843025445938,
-0.02958833985030651,
-0.04783434420824051,
0.1045030876994133,
0.03158152848482132,
0.045010052621364594,
-0.03468087315559387,
-0.0987190529704094,
-0.04017476737499237,
0.14313067495822906,
-0.16084438562393188,
-0.041968975216150284
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
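As a usage sketch (assuming the checkpoint is available under the repository id listed in this entry; the question and context are illustrative only), the model can be queried with the standard question-answering pipeline:

```python
# Minimal usage sketch for this checkpoint; inputs are illustrative.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8",
)
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of bert-base-uncased on the squad dataset.",
)
print(result["answer"], result["score"])
```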
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8
This model is a fine-tuned version of bert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
50,
54,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-uncased-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of bert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.10270828753709793,
0.13611020147800446,
-0.001973906299099326,
0.09336329251527786,
0.1365167796611786,
0.034212514758110046,
0.08240136504173279,
0.13982652127742767,
-0.07842632383108139,
0.05354969948530197,
0.07392475008964539,
0.059349045157432556,
0.047128040343523026,
0.12699416279792786,
-0.0426001213490963,
-0.21470092236995697,
0.0026061085518449545,
-0.012765764258801937,
-0.062396131455898285,
0.10278794169425964,
0.08769488334655762,
-0.1114441454410553,
0.06844816356897354,
-0.022726459428668022,
-0.15371505916118622,
0.009873338975012302,
-0.032445333898067474,
-0.02390219457447529,
0.10519450902938843,
-0.007546858862042427,
0.09248554706573486,
0.01447245106101036,
0.14323879778385162,
-0.21390721201896667,
0.0001739553117658943,
0.07652726024389267,
0.04541192576289177,
0.08967572450637817,
0.0355348140001297,
0.021184418350458145,
0.02775556594133377,
-0.1517845243215561,
0.09272423386573792,
0.02615288272500038,
-0.07916498929262161,
-0.13126295804977417,
-0.09018395841121674,
0.026208577677607536,
0.08417142927646637,
0.08063339442014694,
0.008153118193149567,
0.131033793091774,
-0.09831149131059647,
0.08339758962392807,
0.19849008321762085,
-0.2801605761051178,
-0.06506883352994919,
0.04599471762776375,
0.05161600187420845,
0.07395664602518082,
-0.11857220530509949,
-0.0190008282661438,
0.021745437756180763,
0.03117845021188259,
0.11003870517015457,
-0.020916707813739777,
-0.133056178689003,
0.00812370702624321,
-0.12785886228084564,
-0.01652837172150612,
0.10813209414482117,
0.042205341160297394,
-0.04747769236564636,
-0.07789870351552963,
-0.06719835102558136,
-0.09798159450292587,
-0.02949550747871399,
-0.012029132805764675,
0.05221206694841385,
-0.05561676248908043,
-0.0651007741689682,
-0.048140935599803925,
-0.05677315965294838,
-0.09011071175336838,
0.004290255717933178,
0.1239088624715805,
0.04041660204529762,
0.018920093774795532,
-0.03099486604332924,
0.11642531305551529,
-0.0007766128401271999,
-0.12877413630485535,
-0.01097178366035223,
0.005952898878604174,
-0.12036672979593277,
-0.05210493877530098,
-0.031944576650857925,
0.017296697944402695,
0.012703529559075832,
0.1500042974948883,
-0.036733582615852356,
0.07603628188371658,
0.014099447056651115,
-0.01886659860610962,
-0.02072056569159031,
0.16007626056671143,
-0.03363209590315819,
-0.04683845862746239,
0.0017139653209596872,
0.09854275733232498,
-0.0015384487342089415,
-0.006259310990571976,
-0.08186856657266617,
-0.01837562397122383,
0.07901355624198914,
0.06546062976121902,
-0.05636778101325035,
0.03709163889288902,
-0.042164068669080734,
-0.02511655166745186,
0.02291024662554264,
-0.1286419928073883,
0.0367797426879406,
0.0054996078833937645,
-0.08104913681745529,
-0.0580904521048069,
0.009793262928724289,
-0.015782393515110016,
-0.027252471074461937,
0.07455786317586899,
-0.07095532864332199,
-0.005809003487229347,
-0.08756688982248306,
-0.07923359423875809,
0.0027767957653850317,
-0.13968080282211304,
-0.011593127623200417,
-0.04676637798547745,
-0.19948188960552216,
-0.03443574905395508,
0.04985511302947998,
-0.07628793269395828,
-0.0445132739841938,
-0.05064542964100838,
-0.07719188183546066,
0.009722509421408176,
-0.00729722622781992,
0.1881658434867859,
-0.05756598338484764,
0.08145546168088913,
-0.015408218838274479,
0.04599153622984886,
0.02126121148467064,
0.051993194967508316,
-0.08022384345531464,
0.027153633534908295,
-0.14617201685905457,
0.0871763750910759,
-0.10437505692243576,
0.013292727060616016,
-0.1266709268093109,
-0.08601014316082001,
0.0584941990673542,
-0.02124982699751854,
0.07102834433317184,
0.1395595371723175,
-0.1923774927854538,
0.0026547627057880163,
0.11794085055589676,
-0.03735816851258278,
-0.04285898804664612,
0.08142663538455963,
-0.05885031446814537,
0.03598739579319954,
0.05468706414103508,
0.180042564868927,
0.08377782255411148,
-0.14195731282234192,
0.008023684844374657,
0.04423832148313522,
0.062438782304525375,
0.005037724040448666,
0.041115447878837585,
0.003438257612287998,
0.025845540687441826,
0.009407369419932365,
-0.0937359407544136,
-0.023325728252530098,
-0.0887671560049057,
-0.07243084162473679,
-0.04967303201556206,
-0.09274308383464813,
0.03549737110733986,
0.013587652705609798,
0.019019322469830513,
-0.06386499851942062,
-0.10812707990407944,
0.09839994460344315,
0.12546904385089874,
-0.05807211622595787,
0.011161998845636845,
-0.07823028415441513,
0.01892247051000595,
-0.01967807300388813,
-0.0273422934114933,
-0.19573907554149628,
-0.12583059072494507,
0.050958484411239624,
-0.04363163188099861,
0.024359984323382378,
0.032264843583106995,
0.06484542787075043,
0.05496462807059288,
-0.04423963278532028,
-0.022883974015712738,
-0.06807821244001389,
-0.0028759250417351723,
-0.11526293307542801,
-0.19951114058494568,
-0.0732317864894867,
-0.03760756179690361,
0.1419556736946106,
-0.20234644412994385,
0.005772524978965521,
-0.02220851741731167,
0.10899873822927475,
0.016935499384999275,
-0.05033920332789421,
0.011878515593707561,
0.03324083983898163,
0.0144326938316226,
-0.09930627793073654,
0.0502568818628788,
0.010330529883503914,
-0.0695079043507576,
-0.04608691856265068,
-0.10214009881019592,
-0.0006605929229408503,
0.05677307769656181,
0.08486643433570862,
-0.10127586126327515,
-0.018539387732744217,
-0.051784664392471313,
-0.03810560330748558,
-0.08348749577999115,
0.01465549599379301,
0.20564748346805573,
0.033907193690538406,
0.12440584599971771,
-0.05799012631177902,
-0.07958990335464478,
-0.0035771450493484735,
0.02200246974825859,
0.03450392186641693,
0.10170592367649078,
0.07582913339138031,
-0.0785539373755455,
0.08150627464056015,
0.07979242503643036,
-0.04239245131611824,
0.11431794613599777,
-0.05479218438267708,
-0.07565802335739136,
-0.01484937034547329,
0.0032682802993804216,
-0.028782710433006287,
0.14837439358234406,
-0.08422265201807022,
0.00035528314765542746,
0.038079045712947845,
0.018397146835923195,
0.008257562294602394,
-0.1771700233221054,
-0.006579573731869459,
0.016482222825288773,
-0.06739196926355362,
-0.04899383336305618,
-0.03281526640057564,
0.03304710239171982,
0.0917145162820816,
0.027601370587944984,
-0.032716281712055206,
0.0230349563062191,
-0.011352683417499065,
-0.06136507913470268,
0.19405685365200043,
-0.11143803596496582,
-0.09325803071260452,
-0.09938577562570572,
0.019395606592297554,
-0.03733513504266739,
-0.036094408482313156,
0.003756128251552582,
-0.09416046738624573,
-0.053387146443128586,
-0.0871017649769783,
-0.022241657599806786,
-0.011442501097917557,
-0.0064661530777812,
0.023675587028265,
-0.015353424474596977,
0.07841069251298904,
-0.133207306265831,
0.004447005223482847,
-0.028817035257816315,
-0.10098741203546524,
0.018976036459207535,
0.06172702834010124,
0.08239049464464188,
0.1025191992521286,
-0.012982220388948917,
0.01522161066532135,
-0.02998734824359417,
0.22811830043792725,
-0.06529422849416733,
0.01726018823683262,
0.08878377825021744,
-0.0028106910176575184,
0.055671948939561844,
0.13945691287517548,
0.03289283812046051,
-0.10835102945566177,
0.02843506820499897,
0.07805510610342026,
-0.019310804083943367,
-0.25202885270118713,
-0.025941984727978706,
-0.019994517788290977,
-0.07157306373119354,
0.08655580133199692,
0.0387117862701416,
-0.05347024276852608,
0.035301871597766876,
0.0136843491345644,
0.006414861883968115,
-0.036223191767930984,
0.0677027627825737,
0.09118133038282394,
0.03688085824251175,
0.10185890644788742,
-0.02052166871726513,
-0.008627034723758698,
0.0618138313293457,
0.025664804503321648,
0.2704572081565857,
-0.04344000294804573,
0.13563762605190277,
0.028321554884314537,
0.14987695217132568,
-0.02367890253663063,
0.051033101975917816,
0.009945039637386799,
-0.0008778044139035046,
-0.005208962131291628,
-0.047894835472106934,
-0.02169063501060009,
0.0021764093544334173,
-0.03328695893287659,
0.023462560027837753,
-0.07286331057548523,
0.03256568685173988,
0.015788670629262924,
0.31365495920181274,
0.043354813009500504,
-0.2732242941856384,
-0.07261013239622116,
0.003149599302560091,
-0.04774586856365204,
-0.07817709445953369,
0.004947884939610958,
0.14696428179740906,
-0.12763498723506927,
0.028208963572978973,
-0.05602533742785454,
0.09127239137887955,
-0.0477743074297905,
0.009429452940821648,
0.06682460755109787,
0.1431272327899933,
-0.009246340952813625,
0.07442205399274826,
-0.20740081369876862,
0.23340988159179688,
0.026783598586916924,
0.10656595975160599,
-0.062313154339790344,
0.013654480688273907,
0.0068036713637411594,
0.042978446930646896,
0.11921694874763489,
0.003478958969935775,
-0.0014914042549207807,
-0.18037989735603333,
-0.10235966742038727,
0.05784289911389351,
0.1107095405459404,
-0.027288751676678658,
0.09245067834854126,
-0.042828407138586044,
-0.005648661404848099,
0.03576427698135376,
-0.08013716340065002,
-0.11725606769323349,
-0.09429293125867844,
-0.00851198099553585,
0.0028789013158529997,
-0.02529672533273697,
-0.058412835001945496,
-0.09182710945606232,
-0.015892712399363518,
0.13891004025936127,
0.006802908610552549,
-0.052829936146736145,
-0.1396554857492447,
0.05627055838704109,
0.13395974040031433,
-0.05135694518685341,
0.013874391093850136,
0.0086385328322649,
0.10993974655866623,
0.047885894775390625,
-0.07545594871044159,
0.06012837961316109,
-0.07273704558610916,
-0.16478103399276733,
-0.05837515741586685,
0.11987177282571793,
0.08405039459466934,
0.058250561356544495,
0.0026670475490391254,
0.02771095745265484,
-0.0018838031683117151,
-0.08179368078708649,
0.015801889821887016,
0.05645659193396568,
0.08928623050451279,
0.036301370710134506,
-0.09818872064352036,
0.06530612707138062,
-0.03770420700311661,
-0.008754191920161247,
0.1276935636997223,
0.20491525530815125,
-0.09529059380292892,
0.10065493732690811,
0.07434649765491486,
-0.08083097636699677,
-0.18687807023525238,
0.07005899399518967,
0.12629744410514832,
0.023038780316710472,
0.038223084062337875,
-0.2114216685295105,
0.13335587084293365,
0.10479201376438141,
-0.015336810611188412,
0.04003491997718811,
-0.29552266001701355,
-0.12436489015817642,
0.07652557641267776,
0.10382479429244995,
0.03541629761457443,
-0.12279336899518967,
-0.02442857064306736,
-0.012487421743571758,
-0.1304469108581543,
0.1299174427986145,
-0.07693856209516525,
0.11869735270738602,
-0.005995971150696278,
0.11798109114170074,
0.024293823167681694,
-0.03788166493177414,
0.1307789534330368,
0.07376176118850708,
0.08999738097190857,
-0.03736606612801552,
0.009270576760172844,
0.056292153894901276,
-0.06669919937849045,
0.03992808237671852,
-0.041127290576696396,
0.06944765895605087,
-0.17291711270809174,
-0.005156246945261955,
-0.08904923498630524,
0.0393851064145565,
-0.04944983869791031,
-0.05656253919005394,
-0.018932756036520004,
0.05324241518974304,
0.0679907500743866,
-0.03933895006775856,
0.024500872939825058,
0.007853766903281212,
0.06271031498908997,
0.09487419575452805,
0.10123501718044281,
-0.02439822070300579,
-0.11090248823165894,
0.013658772222697735,
-0.007523144595324993,
0.05509769916534424,
-0.11260020732879639,
0.02203848585486412,
0.1310279667377472,
0.0615045502781868,
0.12081588059663773,
0.025277988985180855,
-0.03073328733444214,
-0.01412923727184534,
0.008418340235948563,
-0.11517223715782166,
-0.12600739300251007,
0.039224982261657715,
-0.03917752951383591,
-0.1490560621023178,
0.006154896691441536,
0.09869163483381271,
-0.03464686870574951,
-0.019237209111452103,
-0.00982216838747263,
0.020145205780863762,
-0.0162639319896698,
0.20045213401317596,
0.04151908680796623,
0.07272875308990479,
-0.1061582863330841,
0.11797121912240982,
0.053992293775081635,
-0.05935680866241455,
0.05176747962832451,
0.06369403004646301,
-0.10281895846128464,
-0.012928018346428871,
0.11605586111545563,
0.15976691246032715,
-0.03391829505562782,
-0.014876430854201317,
-0.07823286950588226,
-0.09086251258850098,
0.06092926487326622,
0.14638087153434753,
0.05246242508292198,
-0.013332056812942028,
-0.0494069829583168,
0.025510288774967194,
-0.1259692758321762,
0.08087518066167831,
0.045270390808582306,
0.06256329268217087,
-0.09642504155635834,
0.10510646551847458,
-0.006071718875318766,
0.04253091290593147,
-0.016849679872393608,
0.024860844016075134,
-0.09839344769716263,
-0.011285142041742802,
-0.14812491834163666,
0.007650039624422789,
0.0011514618527144194,
0.013462548144161701,
-0.020999761298298836,
-0.049657925963401794,
-0.02311186119914055,
0.02917802520096302,
-0.08815276622772217,
-0.054401591420173645,
0.020388323813676834,
0.041269704699516296,
-0.14683620631694794,
-0.015726355835795403,
0.02214932069182396,
-0.09555768966674805,
0.08112334460020065,
0.05770855396986008,
0.012848526239395142,
0.027832236140966415,
-0.09690331667661667,
-0.04785369709134102,
0.00336113921366632,
0.027583006769418716,
0.08670984208583832,
-0.08984341472387314,
-0.016347425058484077,
-0.03677380457520485,
0.042373597621917725,
0.017210043966770172,
0.08947816491127014,
-0.11458141356706619,
0.004962783306837082,
-0.049649231135845184,
-0.03895160183310509,
-0.06744729727506638,
0.04656163603067398,
0.10927758365869522,
0.046670567244291306,
0.16021192073822021,
-0.07103707641363144,
0.035561878234148026,
-0.19075757265090942,
-0.04089360684156418,
0.0021716586779803038,
-0.04482494667172432,
-0.0883524939417839,
-0.050723545253276825,
0.0965770035982132,
-0.04919734597206116,
0.09831606596708298,
-0.0025396833661943674,
0.09937147051095963,
0.03160246089100838,
-0.021325474604964256,
-0.05093904212117195,
0.008144098334014416,
0.15370216965675354,
0.050098396837711334,
-0.014516901224851608,
0.11218442022800446,
-0.005480342544615269,
0.04708466678857803,
0.07099061459302902,
0.20732219517230988,
0.15421122312545776,
0.004211402498185635,
0.049116719514131546,
0.06287860125303268,
-0.11938737332820892,
-0.14224331080913544,
0.1398431658744812,
-0.04723050072789192,
0.12711897492408752,
-0.0692330077290535,
0.22312557697296143,
0.022233743220567703,
-0.1848905384540558,
0.06599801033735275,
-0.06159263104200363,
-0.12544861435890198,
-0.11378229409456253,
-0.030459927394986153,
-0.07469718903303146,
-0.10937902331352234,
0.021797005087137222,
-0.12104131281375885,
0.06044745817780495,
0.12995277345180511,
0.01743420772254467,
0.025395279750227928,
0.15795990824699402,
-0.03241199254989624,
0.019810881465673447,
0.07016114890575409,
0.01879902556538582,
-0.004577438812702894,
-0.06423397362232208,
-0.061330534517765045,
0.051182713359594345,
0.025451311841607094,
0.07649748772382736,
-0.04381268844008446,
0.010671628639101982,
0.02207520231604576,
-0.014950241893529892,
-0.07301627099514008,
0.013940303586423397,
0.02591376006603241,
0.04369162768125534,
0.060696519911289215,
0.05218431353569031,
0.007271092385053635,
-0.03714478760957718,
0.2877693772315979,
-0.07845555245876312,
-0.0926148071885109,
-0.13402865827083588,
0.2242669314146042,
0.011150028556585312,
-0.02142367884516716,
0.0786995142698288,
-0.09530562907457352,
-0.023031653836369514,
0.17384731769561768,
0.13671216368675232,
-0.10003798454999924,
-0.025307634845376015,
-0.014703488908708096,
-0.013097390532493591,
-0.046688757836818695,
0.13037142157554626,
0.10923296958208084,
-0.011007231660187244,
-0.08071427792310715,
-0.02365886978805065,
-0.01045314408838749,
-0.05263800173997879,
-0.0680682510137558,
0.060767464339733124,
0.020162496715784073,
0.0016052783466875553,
-0.042269520461559296,
0.05639689415693283,
-0.0017972156638279557,
-0.2503480315208435,
0.03575075417757034,
-0.1573444902896881,
-0.17830336093902588,
-0.0387590155005455,
0.05324121564626694,
-0.00404586736112833,
0.039274416863918304,
-0.013043595477938652,
0.013123752549290657,
0.15419723093509674,
-0.03432421386241913,
-0.0335463285446167,
-0.11932642757892609,
0.11635847389698029,
-0.1048269271850586,
0.19956840574741364,
0.0034215699415653944,
0.07163957506418228,
0.09717336297035217,
0.013691220432519913,
-0.1343166083097458,
0.039831098169088364,
0.07537339627742767,
-0.10460468381643295,
0.012568607926368713,
0.1524650901556015,
-0.05267563834786415,
0.08219067007303238,
0.02133222296833992,
-0.10982260853052139,
-0.011825503781437874,
-0.04573218151926994,
-0.035950805991888046,
-0.07946667820215225,
-0.01644919440150261,
-0.06439884752035141,
0.15940400958061218,
0.22842390835285187,
-0.02427336387336254,
0.015363684855401516,
-0.09300865232944489,
0.011979331262409687,
0.044283948838710785,
0.04997647926211357,
-0.04586233198642731,
-0.1937662959098816,
0.026331376284360886,
0.03261302784085274,
0.01964849978685379,
-0.2111525684595108,
-0.08488958328962326,
0.04531213641166687,
-0.030029665678739548,
-0.04757874459028244,
0.1046685054898262,
0.03140171989798546,
0.04531796649098396,
-0.03471580147743225,
-0.09904243052005768,
-0.04054872319102287,
0.14312304556369781,
-0.1607595682144165,
-0.04240638390183449
] |
null | null |
transformers
|
Results:
{'exact_match': 76.82119205298014, 'f1': 84.69734248389383}
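Scores in this format are typically produced with the SQuAD metric from the `datasets` library. A minimal sketch (the example id and answers are placeholders, not rows from the actual evaluation):

```python
# Hypothetical evaluation sketch using the standard SQuAD metric;
# prediction/reference contents below are placeholders.
from datasets import load_metric

metric = load_metric("squad")
predictions = [{"id": "qid-0", "prediction_text": "Denver Broncos"}]
references = [{
    "id": "qid-0",
    "answers": {"text": ["Denver Broncos"], "answer_start": [177]},
}]
print(metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```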
|
{}
|
question-answering
|
anas-awadalla/bert-medium-finetuned-squad
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #endpoints_compatible #region-us
|
Results:
{'exact_match': 76.82119205298014, 'f1': 84.69734248389383}
|
[] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #endpoints_compatible #region-us \n"
] |
[
29
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #endpoints_compatible #region-us \n"
] |
[ …768-dimensional embedding vector omitted for readability… ] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_medium_pretrain_squad
This model is a fine-tuned version of [anas-awadalla/bert-medium-pretrained-on-squad](https://huggingface.co/anas-awadalla/bert-medium-pretrained-on-squad) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0973
- "exact_match": 77.95648060548723
- "f1": 85.85300366384631
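
For reference, a hedged sketch of how exact-match/F1 numbers like the ones above are typically computed with the `evaluate` library; the prediction and reference entries below are hypothetical placeholders, not the model's actual outputs.

```python
import evaluate

squad_metric = evaluate.load("squad")
# Predictions and references follow the SQuAD format; these entries are made up.
predictions = [{"id": "1", "prediction_text": "Denver Broncos"}]
references = [{"id": "1", "answers": {"text": ["Denver Broncos"], "answer_start": [177]}}]
print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```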
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
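
As a hedged illustration, not part of the original card: the hyperparameters listed above map roughly onto a `transformers.TrainingArguments` object as follows. The output directory name is hypothetical, and the Adam betas/epsilon shown above are the library defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert_medium_pretrain_squad",  # hypothetical path
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```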
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert_medium_pretrain_squad", "results": []}]}
|
question-answering
|
anas-awadalla/bert-medium-pretrained-finetuned-squad
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# bert_medium_pretrain_squad
This model is a fine-tuned version of anas-awadalla/bert-medium-pretrained-on-squad on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0973
- "exact_match": 77.95648060548723
- "f1": 85.85300366384631
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert_medium_pretrain_squad\n\nThis model is a fine-tuned version of anas-awadalla/bert-medium-pretrained-on-squad on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0973\n- \"exact_match\": 77.95648060548723\n- \"f1\": 85.85300366384631",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# bert_medium_pretrain_squad\n\nThis model is a fine-tuned version of anas-awadalla/bert-medium-pretrained-on-squad on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0973\n- \"exact_match\": 77.95648060548723\n- \"f1\": 85.85300366384631",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
47,
95,
6,
12,
8,
3,
90,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# bert_medium_pretrain_squad\n\nThis model is a fine-tuned version of anas-awadalla/bert-medium-pretrained-on-squad on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0973\n- \"exact_match\": 77.95648060548723\n- \"f1\": 85.85300366384631## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[ …768-dimensional embedding vector omitted for readability… ] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_medium_pretrain_squad
This model is a fine-tuned version of [prajjwal1/bert-medium](https://huggingface.co/prajjwal1/bert-medium) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0973
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
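
A minimal sketch, not part of the original card: since this checkpoint is tagged `fill-mask`, it can be exercised with the standard pipeline. The example sentence is hypothetical.

```python
from transformers import pipeline

fill = pipeline("fill-mask", model="anas-awadalla/bert-medium-pretrained-on-squad")
for pred in fill("The model was fine-tuned on the [MASK] dataset."):
    print(pred["token_str"], round(pred["score"], 3))  # top candidate tokens with scores
```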
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert_medium_pretrain_squad", "results": []}]}
|
fill-mask
|
anas-awadalla/bert-medium-pretrained-on-squad
|
[
"transformers",
"pytorch",
"bert",
"fill-mask",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #fill-mask #generated_from_trainer #dataset-squad #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# bert_medium_pretrain_squad
This model is a fine-tuned version of prajjwal1/bert-medium on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0973
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert_medium_pretrain_squad\n\nThis model is a fine-tuned version of prajjwal1/bert-medium on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0973",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #fill-mask #generated_from_trainer #dataset-squad #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert_medium_pretrain_squad\n\nThis model is a fine-tuned version of prajjwal1/bert-medium on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0973",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
54,
54,
6,
12,
8,
3,
90,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #fill-mask #generated_from_trainer #dataset-squad #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# bert_medium_pretrain_squad\n\nThis model is a fine-tuned version of prajjwal1/bert-medium on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.0973## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[ …768-dimensional embedding vector omitted for readability… ] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-small-finetuned-squad
This model is a fine-tuned version of [prajjwal1/bert-small](https://huggingface.co/prajjwal1/bert-small) on the squad dataset.
It achieves the following results on the evaluation set:
- eval_loss: 1.3138
- eval_runtime: 46.6577
- eval_samples_per_second: 231.13
- eval_steps_per_second: 14.446
- epoch: 4.0
- step: 22132
- exact_match: 71.05960264900662
- f1: 80.8260245470904
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
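A minimal sketch, not from the original card, of how the hyperparameters above map onto `transformers.TrainingArguments`; the `output_dir` value is a placeholder, and the Adam betas/epsilon are left implicit because they are the library defaults:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above. Adam with
# betas=(0.9, 0.999) and epsilon=1e-08 is the transformers default
# optimizer configuration, so no extra flags are needed for it.
training_args = TrainingArguments(
    output_dir="bert-small-finetuned-squad",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```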
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
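As a usage note rather than part of the original card, a minimal sketch of querying this checkpoint for extractive question answering; the model id is the repository name above, and the question/context strings are purely illustrative:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an extractive QA pipeline.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/bert-small-finetuned-squad",
)

# Illustrative inputs; any question/context pair works.
result = qa(
    question="What was the model fine-tuned on?",
    context="The checkpoint was fine-tuned on the SQuAD dataset.",
)
print(result["answer"])
```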
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-small-finetuned-squad", "results": []}]}
|
question-answering
|
anas-awadalla/bert-small-finetuned-squad
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# bert-small-finetuned-squad
This model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.
It achieves the following results on the evaluation set:
- eval_loss: 1.3138
- eval_runtime: 46.6577
- eval_samples_per_second: 231.13
- eval_steps_per_second: 14.446
- epoch: 4.0
- step: 22132
- exact_match: 71.05960264900662
- f1: 80.8260245470904
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
|
[
"# bert-small-finetuned-squad\n\nThis model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 1.3138\n- eval_runtime: 46.6577\n- eval_samples_per_second: 231.13\n- eval_steps_per_second: 14.446\n- epoch: 4.0\n- step: 22132\n\n{'exact_match': 71.05960264900662, 'f1': 80.8260245470904}",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 20",
"### Framework versions\n\n- Transformers 4.15.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# bert-small-finetuned-squad\n\nThis model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 1.3138\n- eval_runtime: 46.6577\n- eval_samples_per_second: 231.13\n- eval_steps_per_second: 14.446\n- epoch: 4.0\n- step: 22132\n\n{'exact_match': 71.05960264900662, 'f1': 80.8260245470904}",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 20",
"### Framework versions\n\n- Transformers 4.15.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.0\n- Tokenizers 0.10.3"
] |
[
46,
135,
6,
12,
8,
3,
90,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# bert-small-finetuned-squad\n\nThis model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 1.3138\n- eval_runtime: 46.6577\n- eval_samples_per_second: 231.13\n- eval_steps_per_second: 14.446\n- epoch: 4.0\n- step: 22132\n\n{'exact_match': 71.05960264900662, 'f1': 80.8260245470904}## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 20### Framework versions\n\n- Transformers 4.15.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.0\n- Tokenizers 0.10.3"
] |
[
-0.07704535871744156,
0.08598323911428452,
-0.004494743887335062,
0.0968918427824974,
0.14258649945259094,
0.021765757352113724,
0.1290193349123001,
0.11342119425535202,
-0.0738651230931282,
0.0829835906624794,
0.06322038173675537,
0.09285782277584076,
0.03779708966612816,
0.10807615518569946,
-0.05797439441084862,
-0.1897258311510086,
-0.004422049503773451,
-0.02783249504864216,
-0.05088936537504196,
0.10836528241634369,
0.09614472091197968,
-0.12053345143795013,
0.05194060876965523,
-0.012555248104035854,
-0.14799588918685913,
0.035865530371665955,
0.021124375984072685,
-0.016048284247517586,
0.118619903922081,
0.013476544991135597,
0.12599121034145355,
0.010338067077100277,
0.13424749672412872,
-0.21224407851696014,
-0.00891701877117157,
0.09418342262506485,
0.043641671538352966,
0.05997971072793007,
0.06952861696481705,
0.002028093673288822,
0.038600578904151917,
-0.1440925896167755,
0.08517420291900635,
0.0126308249309659,
-0.13741914927959442,
-0.17404505610466003,
-0.11894828826189041,
0.0013408479280769825,
0.10887819528579712,
0.0864778459072113,
-0.02230302430689335,
0.15682606399059296,
-0.0770745798945427,
0.08519744873046875,
0.24292483925819397,
-0.29275840520858765,
-0.07029061019420624,
0.05330566316843033,
0.04429835081100464,
0.06640390306711197,
-0.08894564211368561,
0.009432191029191017,
0.027441179379820824,
0.024773219600319862,
0.10546471923589706,
-0.03625136613845825,
-0.0982237234711647,
0.01542856264859438,
-0.13602064549922943,
-0.021468279883265495,
0.06402568519115448,
0.03966399282217026,
-0.039184294641017914,
-0.05490587651729584,
-0.08752409368753433,
-0.07976888865232468,
-0.006813173647969961,
-0.06873776018619537,
0.06890638172626495,
-0.04483423009514809,
-0.03979824483394623,
-0.004648296162486076,
-0.06293194741010666,
-0.043363846838474274,
-0.03393332287669182,
0.1216341182589531,
0.016842765733599663,
0.02542482316493988,
-0.03289581462740898,
0.085004061460495,
-0.04571249708533287,
-0.12825046479701996,
-0.018557509407401085,
0.020034288987517357,
-0.12782339751720428,
-0.07106849551200867,
-0.06809528917074203,
-0.07405994087457657,
-0.028175147250294685,
0.19301551580429077,
-0.04192236438393593,
0.08239994198083878,
0.022965876385569572,
-0.015171206556260586,
-0.04286021739244461,
0.15489710867404938,
-0.03063495270907879,
-0.08984404057264328,
-0.030218392610549927,
0.07584488391876221,
0.004389531444758177,
-0.024412300437688828,
-0.03989340737462044,
0.01777898147702217,
0.0664944052696228,
0.027682099491357803,
-0.040647346526384354,
0.023999661207199097,
-0.042246006429195404,
-0.01592387817800045,
-0.06355724483728409,
-0.13470913469791412,
0.06261374801397324,
-0.013075740076601505,
-0.08700843155384064,
-0.01599038578569889,
0.00967093463987112,
0.01415275875478983,
-0.02683297172188759,
0.15239308774471283,
-0.06704447418451309,
0.03419651463627815,
-0.08595829457044601,
-0.11367487162351608,
0.00039016781374812126,
-0.07345972210168839,
-0.02252541296184063,
-0.04716495797038078,
-0.16054004430770874,
-0.06455976516008377,
0.06435831636190414,
-0.06358128041028976,
-0.004927640315145254,
-0.03633136302232742,
-0.05168455094099045,
0.026064511388540268,
-0.024823157116770744,
0.14064021408557892,
-0.05108427256345749,
0.05858822539448738,
0.0228170994669199,
0.05844149738550186,
0.01980329304933548,
0.027672534808516502,
-0.06927069276571274,
0.02835480496287346,
-0.1809743493795395,
0.0770982950925827,
-0.06417248398065567,
0.01354675181210041,
-0.1202901154756546,
-0.08856194466352463,
0.025862662121653557,
-0.017257045954465866,
0.09372348338365555,
0.08602525293827057,
-0.20454509556293488,
-0.02895691618323326,
0.14687591791152954,
-0.05646096542477608,
-0.055551882833242416,
0.07829690724611282,
-0.07294727861881256,
0.01944022998213768,
0.05916443467140198,
0.17278249561786652,
0.0424526073038578,
-0.14636950194835663,
-0.008803032338619232,
-0.0487021841108799,
0.0555841326713562,
0.07850331813097,
0.03686187416315079,
-0.007123315241187811,
0.10353625565767288,
-0.013285554014146328,
-0.09520668536424637,
-0.029545243829488754,
-0.08370392769575119,
-0.07632669806480408,
-0.03333110362291336,
-0.06622330099344254,
0.02081090398132801,
0.03521554917097092,
0.028620146214962006,
-0.07728850841522217,
-0.09616188704967499,
0.1522427499294281,
0.07770126312971115,
-0.04621206596493721,
0.026394443586468697,
-0.07779742777347565,
0.01128020416945219,
-0.0049736835062503815,
-0.030646752566099167,
-0.22738510370254517,
-0.11016666889190674,
0.019289493560791016,
-0.07385239005088806,
0.026215141639113426,
0.046374306082725525,
0.07529426366090775,
0.058043114840984344,
-0.04358853027224541,
0.019496452063322067,
-0.07489442080259323,
-0.01596396043896675,
-0.1209712103009224,
-0.18250896036624908,
-0.06105868145823479,
-0.007545932661741972,
0.17157945036888123,
-0.20819029211997986,
-0.004078192636370659,
-0.027720291167497635,
0.1323522925376892,
0.011602417565882206,
-0.07069411873817444,
-0.06270188838243484,
0.052247464656829834,
-0.011032694950699806,
-0.08123022317886353,
0.04911315441131592,
-0.019776739180088043,
-0.03683536872267723,
-0.08063528686761856,
-0.1636916846036911,
0.015048205852508545,
0.08178295940160751,
-0.027938563376665115,
-0.10555414855480194,
0.03628664091229439,
-0.04354734346270561,
-0.02333632856607437,
-0.10269143432378769,
0.0038919851649552584,
0.20738406479358673,
0.032570626586675644,
0.12028295546770096,
-0.041195351630449295,
-0.06794535368680954,
-0.009306823834776878,
0.002441523829475045,
0.04679068922996521,
0.09128755331039429,
0.08051407337188721,
-0.10582858324050903,
0.061111364513635635,
0.08677854388952255,
-0.03551354631781578,
0.08537183701992035,
-0.028883537277579308,
-0.07905726134777069,
-0.05905045568943024,
-0.002642969135195017,
0.007220105268061161,
0.11124816536903381,
-0.049905188381671906,
-0.009567360393702984,
0.02418660745024681,
0.03538088500499725,
-0.017708495259284973,
-0.1856638789176941,
-0.0010734322713688016,
0.04183546453714371,
-0.031900327652692795,
-0.018603594973683357,
-0.05489487946033478,
0.036284588277339935,
0.1018771305680275,
0.02541862614452839,
-0.030053909868001938,
-0.021240899339318275,
-0.028603916987776756,
-0.08028408139944077,
0.17983049154281616,
-0.07342999428510666,
-0.10777847468852997,
-0.08491232246160507,
-0.03528010472655296,
-0.052273936569690704,
-0.012408281676471233,
0.025725658982992172,
-0.07651548087596893,
-0.06648935377597809,
-0.08772590011358261,
0.0024607751984149218,
-1.7872654467510074e-8,
-0.0026797479949891567,
0.06328950822353363,
-0.021103965118527412,
0.10702728480100632,
-0.13030052185058594,
-0.0009175497107207775,
-0.04773503914475441,
-0.06210803613066673,
0.013854826800525188,
0.10163519531488419,
0.09508771449327469,
0.1238497942686081,
-0.024095579981803894,
0.019357891753315926,
-0.017229722812771797,
0.27437353134155273,
-0.0725213885307312,
0.005090401507914066,
0.1072366014122963,
-0.009964692406356335,
0.053540267050266266,
0.131081685423851,
0.043355464935302734,
-0.12646809220314026,
0.023319264873862267,
0.10607250034809113,
-0.0310137327760458,
-0.242396742105484,
-0.018899474292993546,
-0.030486853793263435,
-0.09943100064992905,
0.09444277733564377,
0.01061953790485859,
-0.04328107088804245,
0.022496651858091354,
0.02533428743481636,
0.05005909875035286,
-0.010733101516962051,
0.06926564872264862,
0.12234003841876984,
0.0512874536216259,
0.1381492167711258,
-0.013891763053834438,
-0.027543535456061363,
0.04897093027830124,
-0.047719355672597885,
0.23276396095752716,
-0.007480147294700146,
0.07751544564962387,
0.0474674366414547,
0.12406417727470398,
-0.045869119465351105,
0.02692430093884468,
0.032233208417892456,
-0.024297000840306282,
-0.006452842149883509,
-0.0566398985683918,
-0.017804421484470367,
0.023737456649541855,
-0.04916530102491379,
0.024950893595814705,
-0.06257527321577072,
0.04055124521255493,
0.047771017998456955,
0.2712963819503784,
0.08360865712165833,
-0.29204288125038147,
-0.05294131860136986,
0.010911188088357449,
-0.037460409104824066,
-0.032163724303245544,
-0.029074179008603096,
0.10044878721237183,
-0.11114896088838577,
0.08445203304290771,
-0.056812677532434464,
0.09804914891719818,
-0.02867758274078369,
0.029809333384037018,
0.0845951959490776,
0.08856946974992752,
0.00047835856094025075,
0.028219914063811302,
-0.2076236605644226,
0.23774370551109314,
0.028493814170360565,
0.09849822521209717,
-0.03533390909433365,
0.029975110664963722,
0.0060877990908920765,
0.016235683113336563,
0.0831189975142479,
0.013515098951756954,
-0.08405598253011703,
-0.18613609671592712,
-0.04546363279223442,
0.021725470200181007,
0.1518017202615738,
-0.054778605699539185,
0.12199359387159348,
-0.04511447623372078,
-0.0040566688403487206,
0.035700567066669464,
0.005983341485261917,
-0.11162595450878143,
-0.07447126507759094,
0.025105822831392288,
0.004626105073839426,
-0.06851617246866226,
-0.04549261927604675,
-0.08534978330135345,
-0.0941363275051117,
0.18050731718540192,
-0.0031896643340587616,
-0.018515178933739662,
-0.13779744505882263,
0.11291675269603729,
0.08901109546422958,
-0.05936744064092636,
0.005900203250348568,
0.04149458929896355,
0.07524694502353668,
0.03628535941243172,
-0.06988299638032913,
0.08678130060434341,
-0.04369143024086952,
-0.14920943975448608,
-0.08694127947092056,
0.11532530188560486,
0.04715798795223236,
0.050878070294857025,
-0.004270497709512711,
0.06788657605648041,
0.03403402119874954,
-0.08709022402763367,
0.021192220970988274,
0.017824092879891396,
0.05641784518957138,
0.03723521530628204,
-0.03244893252849579,
0.03709886968135834,
-0.053550589829683304,
0.0030071930959820747,
0.08996959775686264,
0.28452223539352417,
-0.09023133665323257,
0.04316253215074539,
0.026603491976857185,
-0.08949437737464905,
-0.159991055727005,
0.09270240366458893,
0.10983334481716156,
-0.002513025188818574,
0.0970027819275856,
-0.14116792380809784,
0.1530332714319229,
0.13024890422821045,
-0.010595656931400299,
0.02148275077342987,
-0.3212284743785858,
-0.1535605788230896,
0.0154759231954813,
0.11560490727424622,
0.1013702005147934,
-0.153725728392601,
-0.03964494541287422,
-0.024302439764142036,
-0.21676498651504517,
0.06714130938053131,
-0.07475162297487259,
0.09804894775152206,
0.0235527902841568,
0.06663282960653305,
0.023395130410790443,
-0.03616441413760185,
0.15032139420509338,
0.06712497025728226,
0.09250368177890778,
-0.05989891290664673,
0.01908877305686474,
0.07809562236070633,
-0.06195901706814766,
0.0077270204201340675,
0.004178553819656372,
0.03924165293574333,
-0.1366281807422638,
-0.01800110749900341,
-0.06234465911984444,
0.02170315943658352,
-0.05710125342011452,
-0.056960154324769974,
-0.03644905611872673,
0.026813484728336334,
0.0745711699128151,
-0.022102627903223038,
0.046880051493644714,
-0.0071935164742171764,
0.11206942051649094,
0.09917356818914413,
0.04517931491136551,
-0.03249982371926308,
-0.13158786296844482,
0.019355149939656258,
0.016904979944229126,
0.03173843398690224,
-0.11806856095790863,
0.05094771087169647,
0.1604873687028885,
0.06069961562752724,
0.14004334807395935,
0.04739551991224289,
-0.02923440746963024,
0.012933610938489437,
0.02174942009150982,
-0.1363839954137802,
-0.13315144181251526,
0.0544101782143116,
-0.09302063286304474,
-0.1049959659576416,
-0.0015237764455378056,
0.15708521008491516,
-0.020397666841745377,
-0.0066514769569039345,
-0.002344441134482622,
0.016428222879767418,
-0.021249959245324135,
0.21138501167297363,
-0.009848346002399921,
0.07196555286645889,
-0.08976120501756668,
0.09263177961111069,
0.08795551210641861,
-0.06712500005960464,
0.019429033622145653,
0.08134598284959793,
-0.07863008230924606,
0.0027813680935651064,
0.057760462164878845,
0.1489315927028656,
-0.06264336407184601,
0.008078084327280521,
-0.1190803200006485,
-0.0929560512304306,
0.05435548722743988,
0.13999314606189728,
0.05147559195756912,
-0.029721125960350037,
-0.02386414259672165,
0.017489822581410408,
-0.11068493872880936,
0.06250152736902237,
0.06289642304182053,
0.072780080139637,
-0.09259751439094543,
0.1763858199119568,
-0.0027368981391191483,
0.007921308279037476,
-0.010810486972332,
0.030386365950107574,
-0.10061482340097427,
0.000053295640100259334,
-0.12649978697299957,
-0.015472794882953167,
0.01718783937394619,
-0.00787730049341917,
-0.006545815616846085,
-0.05699644982814789,
-0.05804447829723358,
0.03575475886464119,
-0.08625391125679016,
-0.06572944670915604,
0.02170480228960514,
0.029451079666614532,
-0.16321337223052979,
-0.02568686567246914,
0.030410001054406166,
-0.08761154860258102,
0.06656701862812042,
0.04743694141507149,
0.0214043278247118,
0.029807211831212044,
-0.10188134759664536,
-0.04204695299267769,
0.019374828785657883,
0.025368284434080124,
0.08508113026618958,
-0.10617034137248993,
-0.010002929717302322,
-0.026551470160484314,
0.09109149128198624,
0.036410145461559296,
0.029657604172825813,
-0.10861355066299438,
0.009964554570615292,
-0.058313291519880295,
-0.07028260082006454,
-0.04567563161253929,
0.012568267993628979,
0.0956299751996994,
0.05255897715687752,
0.1544221043586731,
-0.07742674648761749,
0.05368059501051903,
-0.2417854517698288,
-0.04270865023136139,
0.008718917146325111,
-0.04693014919757843,
-0.06482212245464325,
-0.05709806829690933,
0.09639350324869156,
-0.06586731225252151,
0.08849279582500458,
0.02913021109998226,
0.14980483055114746,
0.06531313061714172,
-0.03831600025296211,
-0.06531348079442978,
0.021679095923900604,
0.11937335878610611,
0.0641934722661972,
-0.03071405738592148,
0.08668426424264908,
0.014346563257277012,
0.07208327203989029,
0.08140217512845993,
0.23850516974925995,
0.20007628202438354,
0.019705018028616905,
0.055445875972509384,
0.018832318484783173,
-0.10494831949472427,
-0.18201246857643127,
0.10929428786039352,
-0.07471375167369843,
0.12903937697410583,
-0.055965278297662735,
0.15330713987350464,
0.04193693771958351,
-0.19833649694919586,
0.06295274943113327,
-0.09942562878131866,
-0.09871653467416763,
-0.12688201665878296,
0.019076867029070854,
-0.08326311409473419,
-0.10317137092351913,
0.020133625715970993,
-0.10523022711277008,
0.09425175935029984,
0.14919915795326233,
0.014023895375430584,
0.02311546355485916,
0.11722932755947113,
-0.057878460735082626,
0.0177322831004858,
0.04523034766316414,
0.009528720751404762,
-0.0034548055846244097,
-0.037146665155887604,
-0.05689273774623871,
0.033427443355321884,
0.01879415288567543,
0.07753166556358337,
-0.03926490619778633,
0.004147226456552744,
0.011237469501793385,
-0.016225088387727737,
-0.09193965792655945,
0.023062679916620255,
0.02329391799867153,
0.041260335594415665,
0.021895011886954308,
0.05608895793557167,
0.0391138531267643,
-0.044626958668231964,
0.2879005968570709,
-0.06648281216621399,
-0.08385645598173141,
-0.1260664463043213,
0.2321896106004715,
0.045223258435726166,
0.0305490642786026,
0.044971562922000885,
-0.10843560844659805,
-0.01074714120477438,
0.1540302038192749,
0.0926889106631279,
-0.1074029952287674,
-0.021218959242105484,
-0.015044458210468292,
-0.006774101871997118,
-0.02157733030617237,
0.09992382675409317,
0.08074179291725159,
0.02759571000933647,
-0.05332384258508682,
-0.0038044091779738665,
-0.00933042261749506,
-0.03887401521205902,
-0.04601472243666649,
0.060896921902894974,
0.01925356313586235,
0.034998927265405655,
-0.034508757293224335,
0.06814035028219223,
0.03769765421748161,
-0.24084950983524323,
0.0881219431757927,
-0.21896924078464508,
-0.18966469168663025,
0.004505020100623369,
0.10572094470262527,
-0.02695677988231182,
0.07053103297948837,
-0.01840919628739357,
-0.023921910673379898,
0.12538990378379822,
-0.022172201424837112,
-0.019616566598415375,
-0.13708089292049408,
0.09238196909427643,
-0.1594831943511963,
0.21643400192260742,
-0.01360892690718174,
0.06780292093753815,
0.11173955351114273,
0.02592519484460354,
-0.1031605526804924,
0.03967847675085068,
0.07517416775226593,
-0.1246105208992958,
0.009328811429440975,
0.15872399508953094,
-0.05176406726241112,
0.10127396881580353,
0.06652676314115524,
-0.15451617538928986,
0.006033557467162609,
0.0031309546902775764,
-0.022863490507006645,
-0.06802638620138168,
-0.05013241618871689,
-0.0702115148305893,
0.12899185717105865,
0.22666874527931213,
-0.005347192287445068,
0.050400182604789734,
-0.07629259675741196,
0.005048478953540325,
0.024711135774850845,
0.12512394785881042,
-0.05607731267809868,
-0.21970166265964508,
0.0589885413646698,
0.00949114840477705,
0.019923876971006393,
-0.2150445133447647,
-0.11688315123319626,
0.06761416792869568,
-0.04949921369552612,
-0.02597232721745968,
0.12668377161026,
0.07061871886253357,
0.036514103412628174,
-0.04646659269928932,
-0.2197270542383194,
-0.030409300699830055,
0.17840197682380676,
-0.11323605477809906,
-0.06394200772047043
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-small-pretrained-finetuned-squad
This model is a fine-tuned version of [anas-awadalla/bert-small-pretrained-on-squad](https://huggingface.co/anas-awadalla/bert-small-pretrained-on-squad) on the squad dataset.
- "exact_match": 72.20435193945127
- "f1": 81.31832229156294
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
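A minimal sketch, assuming only the public repository name above, of running the checkpoint without the pipeline helper; the question and context are illustrative:

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/bert-small-pretrained-finetuned-squad"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

# Illustrative inputs.
question = "Where is the Eiffel Tower located?"
context = "The Eiffel Tower is located in Paris."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Decode the span between the most likely start and end tokens.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```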
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-small-pretrained-finetuned-squad", "results": []}]}
|
question-answering
|
anas-awadalla/bert-small-pretrained-finetuned-squad
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# bert-small-pretrained-finetuned-squad
This model is a fine-tuned version of anas-awadalla/bert-small-pretrained-on-squad on the squad dataset.
- "exact_match": 72.20435193945127
- "f1": 81.31832229156294
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert-small-pretrained-finetuned-squad\n\nThis model is a fine-tuned version of anas-awadalla/bert-small-pretrained-on-squad on the squad dataset.\n\n- \"exact_match\": 72.20435193945127\n- \"f1\": 81.31832229156294",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# bert-small-pretrained-finetuned-squad\n\nThis model is a fine-tuned version of anas-awadalla/bert-small-pretrained-on-squad on the squad dataset.\n\n- \"exact_match\": 72.20435193945127\n- \"f1\": 81.31832229156294",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
47,
83,
6,
12,
8,
3,
90,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# bert-small-pretrained-finetuned-squad\n\nThis model is a fine-tuned version of anas-awadalla/bert-small-pretrained-on-squad on the squad dataset.\n\n- \"exact_match\": 72.20435193945127\n- \"f1\": 81.31832229156294## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.11670950055122375,
0.1785510629415512,
-0.0032913656905293465,
0.09355096518993378,
0.12930350005626678,
0.015284237451851368,
0.09072671830654144,
0.14223608374595642,
-0.05523882061243057,
0.09121296554803848,
0.10865246504545212,
0.04218291491270065,
0.059647198766469955,
0.1260816603899002,
-0.03707541525363922,
-0.19072674214839935,
0.006892846431583166,
0.02454482950270176,
-0.004481730982661247,
0.10469619929790497,
0.11524523794651031,
-0.09785117954015732,
0.08106999844312668,
0.02083676867187023,
-0.12530235946178436,
0.0070235892198979855,
-0.01718793995678425,
-0.05185333266854286,
0.08301793038845062,
0.03170431777834892,
0.08778412640094757,
-0.01113979984074831,
0.07969052344560623,
-0.19123734533786774,
-0.004072011914104223,
0.05579715222120285,
0.03301721811294556,
0.088020458817482,
0.030370676890015602,
0.029426954686641693,
0.051149290055036545,
-0.10861935466527939,
0.09212452918291092,
0.022177021950483322,
-0.10581300407648087,
-0.13306012749671936,
-0.08501846343278885,
0.05674603208899498,
0.08459212630987167,
0.10700559616088867,
0.0023347889073193073,
0.17505066096782684,
-0.0304147657006979,
0.06421258300542831,
0.1485280841588974,
-0.27694782614707947,
-0.06074501946568489,
0.00658432999625802,
0.06163568049669266,
0.06666211038827896,
-0.09398398548364639,
-0.021020323038101196,
0.032339125871658325,
0.03720563277602196,
0.09098095446825027,
-0.013719817623496056,
-0.012358911335468292,
-0.02916235476732254,
-0.10808715224266052,
-0.07545290887355804,
0.1898135095834732,
0.06209608539938927,
-0.05812123045325279,
-0.12272266298532486,
-0.04712618514895439,
-0.09848007559776306,
-0.023991374298930168,
-0.03332485631108284,
0.0049111507833004,
-0.049198027700185776,
-0.03813663870096207,
-0.04692479223012924,
-0.07721974700689316,
-0.0375232957303524,
0.012215621769428253,
0.10330269485712051,
0.023158196359872818,
0.021636763587594032,
-0.025917716324329376,
0.07411831617355347,
-0.04433343932032585,
-0.1514737606048584,
-0.01668516919016838,
-0.011645474471151829,
-0.05766008421778679,
-0.05444369465112686,
-0.039977967739105225,
-0.06234199181199074,
-0.014770127832889557,
0.1431100070476532,
-0.023373974487185478,
0.061892736703157425,
0.01726282387971878,
-0.00569780170917511,
-0.017764277756214142,
0.16162501275539398,
-0.02889767475426197,
-0.0606202557682991,
0.010284370742738247,
0.14415355026721954,
0.022976523265242577,
-0.02813597396016121,
-0.0941636860370636,
-0.028401987627148628,
0.088733971118927,
0.039581093937158585,
-0.007274155505001545,
0.016620973125100136,
-0.05488551780581474,
-0.04909825325012207,
0.10189130902290344,
-0.11721171438694,
0.03785090520977974,
-0.023157229647040367,
-0.07264957576990128,
-0.08849567919969559,
0.003842197125777602,
0.0019199175294488668,
-0.046600472182035446,
0.06263397634029388,
-0.09100725501775742,
-0.031717024743556976,
-0.06872738897800446,
-0.07010886073112488,
0.005057021509855986,
-0.07437720149755478,
0.011438317596912384,
-0.08424703776836395,
-0.15167765319347382,
-0.03845898434519768,
0.036953095346689224,
-0.06305024027824402,
-0.0728062093257904,
-0.013737309724092484,
-0.05143902823328972,
0.027941588312387466,
-0.008092778734862804,
0.10190962255001068,
-0.04496922343969345,
0.07342315465211868,
0.04360560327768326,
0.02014368213713169,
-0.004058490041643381,
0.05144022777676582,
-0.08964695781469345,
0.046162333339452744,
-0.08125688135623932,
0.07764634490013123,
-0.0784766674041748,
0.02953028678894043,
-0.12007477134466171,
-0.10270772874355316,
-0.011432358995079994,
-0.038180895149707794,
0.09930293262004852,
0.10310325026512146,
-0.14233040809631348,
-0.01259113010019064,
0.11033942550420761,
-0.05151032283902168,
-0.11581429839134216,
0.10394992679357529,
-0.038577206432819366,
0.010631939396262169,
0.05035192891955376,
0.15793630480766296,
0.15475590527057648,
-0.11660712212324142,
-0.03198111057281494,
0.011081741191446781,
0.07806053757667542,
-0.01716664247214794,
0.0774715393781662,
0.0005241780309006572,
0.030639776960015297,
0.0096462806686759,
-0.052960820496082306,
-0.0006918123690411448,
-0.06620202213525772,
-0.09026272594928741,
-0.04744629189372063,
-0.09413006901741028,
0.03929811343550682,
0.047779131680727005,
0.017710616812109947,
-0.07862114161252975,
-0.12120197713375092,
0.03922916576266289,
0.1066308468580246,
-0.03331504017114639,
-0.003290522377938032,
-0.09748117625713348,
0.09091456234455109,
-0.07345452159643173,
-0.027956025674939156,
-0.20169629156589508,
-0.08896147459745407,
0.06409996747970581,
-0.028248341754078865,
0.01820790395140648,
-0.003279269440099597,
0.06096022203564644,
0.0632275715470314,
-0.03337779641151428,
-0.018833588808774948,
-0.09140784293413162,
-0.008075054734945297,
-0.11977202445268631,
-0.11143679171800613,
-0.07195675373077393,
-0.028707675635814667,
0.16169127821922302,
-0.21295084059238434,
-0.002330475952476263,
0.006603142246603966,
0.11127715557813644,
0.020484965294599533,
-0.07400292903184891,
0.007912582717835903,
0.04415658488869667,
-0.0013478165492415428,
-0.08346520364284515,
0.050612691789865494,
0.015236455015838146,
-0.08170374482870102,
-0.059278689324855804,
-0.15294650197029114,
0.08834679424762726,
0.07608145475387573,
0.0922032967209816,
-0.07094504684209824,
0.0023108224850147963,
-0.05057043582201004,
-0.02425537258386612,
-0.050163384526968,
-0.020403645932674408,
0.19908331334590912,
0.027459679171442986,
0.1374920904636383,
-0.06573290377855301,
-0.051959194242954254,
0.02845734916627407,
0.014384743757545948,
-0.019051359966397285,
0.07240093499422073,
0.011982778087258339,
-0.1496153026819229,
0.08932680636644363,
0.0974433571100235,
-0.020831666886806488,
0.10010939836502075,
-0.0332445427775383,
-0.09536483883857727,
-0.03864763677120209,
0.005658946465700865,
0.011186003684997559,
0.12383298575878143,
-0.0762542188167572,
0.007692473009228706,
0.06306633353233337,
0.009421845898032188,
0.005171394441276789,
-0.15511870384216309,
-0.005786164663732052,
0.05558473989367485,
-0.03016488626599312,
-0.034239448606967926,
-0.02236335165798664,
0.008968062698841095,
0.07636649161577225,
0.04662609100341797,
-0.019889147952198982,
0.017096074298024178,
-0.018762119114398956,
-0.06748782098293304,
0.1554459035396576,
-0.08008000254631042,
-0.18745490908622742,
-0.1551247090101242,
0.026046575978398323,
-0.055414747446775436,
-0.0011522838613018394,
0.019283650442957878,
-0.041378386318683624,
-0.059621661901474,
-0.09305031597614288,
-0.07270720601081848,
-0.04097726196050644,
-0.01915810815989971,
0.06833980232477188,
-0.013456359505653381,
0.1004861369729042,
-0.12710697948932648,
-0.0013114373432472348,
0.005089375656098127,
-0.03230339288711548,
-0.0129188671708107,
0.038662541657686234,
0.12424419075250626,
0.0740010142326355,
-0.028136160224676132,
0.022377921268343925,
-0.03226645290851593,
0.2871969938278198,
-0.08316616714000702,
-0.020611565560102463,
0.13839077949523926,
-0.018654629588127136,
0.07200398296117783,
0.10865479707717896,
0.028260163962841034,
-0.09529304504394531,
0.016278639435768127,
0.03678221255540848,
-0.03130758926272392,
-0.20766355097293854,
-0.03874905779957771,
-0.03844483569264412,
-0.0842168778181076,
0.1399722695350647,
0.03834328055381775,
0.011324204504489899,
0.07208829373121262,
-0.012630382552742958,
0.05869803577661514,
-0.049109503626823425,
0.08776198327541351,
0.11966110020875931,
0.04943421110510826,
0.10926315933465958,
-0.03288349136710167,
-0.024000639095902443,
0.05900822952389717,
0.01838892139494419,
0.2130778580904007,
-0.02509363554418087,
0.1734740436077118,
0.005081551149487495,
0.1592588871717453,
-0.01539965532720089,
0.03818804770708084,
-0.01037687063217163,
0.003337935544550419,
-0.002029571682214737,
-0.057190120220184326,
-0.07884349673986435,
0.02166557125747204,
0.028166767209768295,
0.04776957631111145,
-0.08203594386577606,
0.02378873899579048,
-0.005840483587235212,
0.2443886250257492,
0.05122017860412598,
-0.3310337960720062,
-0.10152608901262283,
0.0034955849405378103,
-0.02254827693104744,
-0.09214899688959122,
-0.006859912071377039,
0.12160021811723709,
-0.11473505944013596,
0.042576756328344345,
-0.052071139216423035,
0.10148423165082932,
-0.060121070593595505,
0.004162895958870649,
0.05098234489560127,
0.07237149775028229,
0.004005229100584984,
0.0889003574848175,
-0.18146733939647675,
0.20765690505504608,
0.028409022837877274,
0.0975889042019844,
-0.07855521142482758,
0.03430357202887535,
-0.011468591168522835,
0.04528481513261795,
0.11664269119501114,
0.003997910302132368,
-0.006296195089817047,
-0.19200752675533295,
-0.10085105895996094,
0.020883891731500626,
0.082643523812294,
-0.041942473500967026,
0.08960657566785812,
-0.04778693616390228,
0.003011047374457121,
0.02787184715270996,
-0.006837458349764347,
-0.0947793647646904,
-0.15041178464889526,
0.0474715530872345,
0.03892866522073746,
-0.04356745630502701,
-0.062187787145376205,
-0.08528672158718109,
-0.0034401393495500088,
0.15350276231765747,
0.03384898602962494,
-0.06235329061746597,
-0.13653193414211273,
0.060019105672836304,
0.1279379427433014,
-0.07090070098638535,
0.016903799027204514,
-0.0028845840133726597,
0.13626046478748322,
0.019727623090147972,
-0.05653424561023712,
0.04701067879796028,
-0.05629201978445053,
-0.12338478863239288,
-0.048173002898693085,
0.1518804430961609,
0.019113963469862938,
0.05912329629063606,
0.01302550733089447,
0.038177452981472015,
-0.008770924061536789,
-0.07143764942884445,
0.030743250623345375,
0.01528329961001873,
0.08692518621683121,
0.0001734120596665889,
-0.024931784719228745,
0.05520237982273102,
-0.06769222021102905,
0.012333949096500874,
0.13770753145217896,
0.21792715787887573,
-0.08276814222335815,
0.05279746279120445,
0.04740848019719124,
-0.060240957885980606,
-0.1462864875793457,
0.013711797073483467,
0.11700359731912613,
0.027328945696353912,
0.05539632961153984,
-0.15465545654296875,
0.05728113278746605,
0.08943761885166168,
-0.021561339497566223,
0.08093144744634628,
-0.2936083674430847,
-0.12185975909233093,
0.06927245110273361,
0.1001158133149147,
0.043205469846725464,
-0.1422473043203354,
-0.06507173180580139,
-0.025725550949573517,
-0.19019000232219696,
0.10527017712593079,
0.019264906644821167,
0.09957398474216461,
-0.010641236789524555,
0.08381002396345139,
0.041538041085004807,
-0.04258083179593086,
0.16154836118221283,
0.011736112646758556,
0.04580182582139969,
-0.06715147942304611,
0.006032160017639399,
0.09970717877149582,
-0.0733640119433403,
0.07075788825750351,
-0.029663139954209328,
0.06616830080747604,
-0.20215381681919098,
-0.02603982761502266,
-0.06119661033153534,
0.031097382307052612,
-0.04261055588722229,
-0.04466073587536812,
-0.026085373014211655,
0.034679848700761795,
0.06761330366134644,
-0.026923974975943565,
0.08242419362068176,
0.02506129816174507,
0.08364955335855484,
0.12283477187156677,
0.0962139368057251,
-0.03013577312231064,
-0.15121625363826752,
-0.0020695962011814117,
-0.01825433410704136,
0.0681658685207367,
-0.10217095166444778,
0.03669210523366928,
0.12683911621570587,
0.03967638313770294,
0.11399029195308685,
0.013625705614686012,
-0.046942900866270065,
0.0013771620579063892,
0.030738959088921547,
-0.10848229378461838,
-0.1595517247915268,
-0.02842579409480095,
0.030141327530145645,
-0.17095936834812164,
-0.044014476239681244,
0.11549940705299377,
-0.0418725349009037,
-0.022492581978440285,
-0.010551610961556435,
-0.0004922195803374052,
0.0023183426819741726,
0.1696818619966507,
0.03948037326335907,
0.07301852852106094,
-0.0689544826745987,
0.08210037648677826,
0.09553376585245132,
-0.08932233601808548,
0.06306366622447968,
0.025742223486304283,
-0.0883498340845108,
-0.02357681468129158,
0.01347633358091116,
0.11491662263870239,
-0.030390268191695213,
-0.0332031324505806,
-0.07242552936077118,
-0.05634821578860283,
0.061165861785411835,
0.05924633890390396,
0.049730222672224045,
-0.02334219217300415,
-0.03538856655359268,
0.0022955185268074274,
-0.12806428968906403,
0.1032966747879982,
0.02127000130712986,
0.07438335567712784,
-0.12792979180812836,
0.04628642275929451,
-0.005140184890478849,
0.036768559366464615,
-0.008427542634308338,
0.008743558079004288,
-0.06107919290661812,
-0.012354426085948944,
-0.12345398962497711,
0.01889774762094021,
-0.03011559508740902,
0.006485919002443552,
-0.0344473235309124,
-0.06741561740636826,
-0.045770157128572464,
0.04029655084013939,
-0.06282714754343033,
-0.07313033193349838,
0.014144014567136765,
0.038608551025390625,
-0.13365215063095093,
-0.021300628781318665,
0.034366775304079056,
-0.0885300487279892,
0.06999541074037552,
0.04128442704677582,
0.01315951719880104,
-0.014316768385469913,
-0.0035807988606393337,
-0.014392546378076077,
0.0037212420720607042,
0.04347503185272217,
0.06741338223218918,
-0.11202236264944077,
0.005676819942891598,
-0.00977611355483532,
0.016326867043972015,
0.019417637959122658,
0.10597691684961319,
-0.13301138579845428,
-0.04326152056455612,
-0.0017494019120931625,
-0.04841151833534241,
-0.058380208909511566,
0.0467471182346344,
0.09883028268814087,
0.03379390016198158,
0.17281736433506012,
-0.053325727581977844,
0.049298517405986786,
-0.19762836396694183,
-0.036774929612874985,
-0.00011879517842317,
-0.0425543338060379,
-0.04340455308556557,
-0.04640069231390953,
0.07450303435325623,
-0.05655129626393318,
0.09579193592071533,
-0.007321928162127733,
0.12933935225009918,
0.04526948556303978,
0.003599601797759533,
0.003495212411507964,
-0.015109593980014324,
0.180931955575943,
0.0657259002327919,
-0.03787200525403023,
0.1033748909831047,
-0.015174247324466705,
0.04887383431196213,
0.07876846939325333,
0.11821587383747101,
0.14296002686023712,
0.032322973012924194,
0.048590440303087234,
0.04674259573221207,
-0.06681804358959198,
-0.18179188668727875,
0.046908505260944366,
-0.028874101117253304,
0.11761602014303207,
-0.02860233187675476,
0.1539555788040161,
0.08105672150850296,
-0.16865399479866028,
0.0706987977027893,
-0.08141694962978363,
-0.10272131860256195,
-0.07433081418275833,
-0.10954530537128448,
-0.07952841371297836,
-0.0766521692276001,
0.01930442452430725,
-0.11721404641866684,
0.03148610517382622,
0.09917861223220825,
-0.01559713389724493,
-0.007877970114350319,
0.16729073226451874,
-0.04662245139479637,
0.012087670154869556,
0.05313970521092415,
0.001231611706316471,
0.0021878574043512344,
-0.05292407423257828,
-0.039609644562006,
0.04886044189333916,
0.017267927527427673,
0.09686756134033203,
-0.033522829413414,
0.009793570265173912,
0.02025916799902916,
0.003875116351991892,
-0.08736136555671692,
0.0013374518603086472,
0.024613499641418457,
0.04095472767949104,
0.04669412225484848,
0.03557993471622467,
0.032122064381837845,
-0.041709937155246735,
0.24308037757873535,
-0.05013219267129898,
-0.06011267751455307,
-0.13308104872703552,
0.15799030661582947,
0.03575575351715088,
-0.019336648285388947,
0.07044472545385361,
-0.09839288890361786,
0.015487448312342167,
0.16089721024036407,
0.12255016714334488,
-0.07898452132940292,
-0.02378566563129425,
-0.000361424230504781,
-0.0034834786783903837,
-0.03960135579109192,
0.10095840692520142,
0.09534551948308945,
0.016997357830405235,
-0.061181049793958664,
-0.010343194007873535,
-0.02980067953467369,
-0.022190449759364128,
-0.05358247458934784,
0.08617785573005676,
0.021429890766739845,
0.0206854697316885,
-0.04493227228522301,
0.05576522275805473,
0.038202252238988876,
-0.1587209850549698,
0.030637966468930244,
-0.19403673708438873,
-0.18902362883090973,
-0.012333609163761139,
0.06804350018501282,
-0.012318823486566544,
0.06356758624315262,
-0.001479469588957727,
0.007856203243136406,
0.10245677083730698,
-0.020128533244132996,
-0.04915156215429306,
-0.08987487852573395,
0.09528133273124695,
-0.07677321881055832,
0.22088110446929932,
0.0036870173644274473,
0.07110115885734558,
0.1092686802148819,
0.024525541812181473,
-0.14934566617012024,
0.03413945436477661,
0.09254482388496399,
-0.026319650933146477,
0.035201799124479294,
0.15544763207435608,
-0.03226269781589508,
0.1041695848107338,
0.04246325045824051,
-0.11900016665458679,
-0.03057916834950447,
-0.04948694258928299,
0.018992731347680092,
-0.06388919055461884,
-0.015495885163545609,
-0.06468131393194199,
0.17339611053466797,
0.18962760269641876,
-0.0471690408885479,
-0.03036576695740223,
-0.06241700425744057,
0.023591969162225723,
0.06061606854200363,
0.09027939289808273,
-0.04998685047030449,
-0.18476983904838562,
0.0036042132414877415,
0.030847355723381042,
0.025982331484556198,
-0.26782217621803284,
-0.086907759308815,
0.0524047389626503,
-0.04708603397011757,
-0.026253074407577515,
0.1042914167046547,
0.032989658415317535,
0.026002854108810425,
-0.050227247178554535,
-0.08009910583496094,
-0.056280769407749176,
0.13190478086471558,
-0.13703131675720215,
-0.052856143563985825
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_small_pretrain_squad
This model is a fine-tuned version of [prajjwal1/bert-small](https://huggingface.co/prajjwal1/bert-small) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1410
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
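Since the card's metadata tags this checkpoint as fill-mask, here is a minimal sketch, assuming the repository name above, of querying it as a masked language model:

```python
from transformers import pipeline

# BERT checkpoints use "[MASK]" as the mask token.
unmasker = pipeline(
    "fill-mask",
    model="anas-awadalla/bert-small-pretrained-on-squad",
)
for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```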
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert_small_pretrain_squad", "results": []}]}
|
fill-mask
|
anas-awadalla/bert-small-pretrained-on-squad
|
[
"transformers",
"pytorch",
"bert",
"fill-mask",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #fill-mask #generated_from_trainer #dataset-squad #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# bert_small_pretrain_squad
This model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1410
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# bert_small_pretrain_squad\n\nThis model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.1410",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #fill-mask #generated_from_trainer #dataset-squad #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert_small_pretrain_squad\n\nThis model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.1410",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
54,
55,
6,
12,
8,
3,
90,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #fill-mask #generated_from_trainer #dataset-squad #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# bert_small_pretrain_squad\n\nThis model is a fine-tuned version of prajjwal1/bert-small on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.1410## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.0942181721329689,
0.12960481643676758,
-0.0023906801361590624,
0.10627450793981552,
0.1645430028438568,
0.031261008232831955,
0.08066241443157196,
0.13322220742702484,
-0.11473816633224487,
0.06875798851251602,
0.09241696447134018,
0.0508587583899498,
0.04859087988734245,
0.14206042885780334,
-0.020979877561330795,
-0.26321646571159363,
0.004650499206036329,
0.0028407967183738947,
-0.05594542622566223,
0.09088215231895447,
0.09222868829965591,
-0.11842761188745499,
0.07965681701898575,
0.006474012974649668,
-0.16230125725269318,
0.024816513061523438,
-0.008494694717228413,
-0.035019710659980774,
0.07435385137796402,
0.01634066365659237,
0.11055087298154831,
-0.008129904046654701,
0.13978946208953857,
-0.19248148798942566,
-0.0048156967386603355,
0.07672657072544098,
0.04214782267808914,
0.0726819559931755,
0.04152785241603851,
0.032048169523477554,
0.10259813815355301,
-0.17635200917720795,
0.08171498030424118,
0.02297123707830906,
-0.07872117310762405,
-0.14735101163387299,
-0.07645164430141449,
0.06803224980831146,
0.07273148000240326,
0.09687112271785736,
0.010510187596082687,
0.14869926869869232,
-0.08985582739114761,
0.07650940865278244,
0.22946521639823914,
-0.27261853218078613,
-0.0687568262219429,
0.04577399790287018,
0.04431171342730522,
0.03523769602179527,
-0.11114342510700226,
-0.015492107719182968,
0.03657212853431702,
0.04132026061415672,
0.11764495819807053,
-0.027307067066431046,
-0.08075910806655884,
-0.007863707840442657,
-0.13193535804748535,
-0.03393252193927765,
0.14978648722171783,
0.025686796754598618,
-0.03668427839875221,
-0.06416823714971542,
-0.06285113841295242,
-0.10208212584257126,
-0.02785041183233261,
-0.028198502957820892,
0.0455843061208725,
-0.05654246360063553,
-0.05634569749236107,
-0.03201303631067276,
-0.07122025638818741,
-0.06588563323020935,
-0.008935394696891308,
0.09794726967811584,
0.05407358705997467,
0.01858559623360634,
-0.035612139850854874,
0.08276841044425964,
-0.012232965789735317,
-0.12573474645614624,
-0.015760714188218117,
-0.0013658985262736678,
-0.0702035054564476,
-0.05997268855571747,
-0.052233342081308365,
-0.046671733260154724,
-0.006667733192443848,
0.10570219159126282,
-0.05592242255806923,
0.05985106900334358,
0.036355629563331604,
0.00808296911418438,
-0.014025718905031681,
0.1387990415096283,
-0.05739179253578186,
-0.02977726422250271,
0.013388865627348423,
0.07745381444692612,
0.015051141381263733,
0.005828578025102615,
-0.07335259020328522,
-0.021055886521935463,
0.09385566413402557,
0.044599857181310654,
-0.05476587265729904,
0.05081579461693764,
-0.02450132928788662,
-0.02115633338689804,
0.009154122322797775,
-0.1290743052959442,
0.05711173266172409,
-0.007529009133577347,
-0.09221208095550537,
0.010529258288443089,
0.03483528643846512,
-0.04560147970914841,
-0.04983842745423317,
0.114311084151268,
-0.09542237222194672,
0.011980172246694565,
-0.09874091297388077,
-0.11002040654420853,
0.01611468568444252,
-0.05842584744095802,
-0.020328344777226448,
-0.0807880088686943,
-0.21054507791996002,
-0.04002198949456215,
0.04592485725879669,
-0.046881694346666336,
-0.02999139577150345,
-0.04605904594063759,
-0.06822061538696289,
0.01152368076145649,
-0.009409582242369652,
0.09643667936325073,
-0.04924825206398964,
0.08632826060056686,
0.000356436416041106,
0.04908743500709534,
0.008381475694477558,
0.05068596079945564,
-0.08536770194768906,
0.015142054297029972,
-0.14822523295879364,
0.044239070266485214,
-0.08196210861206055,
0.016284558922052383,
-0.10155842453241348,
-0.13141174614429474,
0.003580149495974183,
-0.011116966605186462,
0.07011484354734421,
0.12205836921930313,
-0.2012062668800354,
-0.03187011182308197,
0.1634465754032135,
-0.07965829223394394,
-0.030520470812916756,
0.10268718004226685,
-0.050345540046691895,
0.01019340567290783,
0.0721309706568718,
0.13037200272083282,
0.12825888395309448,
-0.14141643047332764,
-0.03193788602948189,
0.006038075778633356,
0.05564849451184273,
-0.028957536444067955,
0.044573135673999786,
0.015402121469378471,
0.00780697213485837,
0.023387687280774117,
-0.06055324152112007,
0.004929238930344582,
-0.09902574867010117,
-0.07040658593177795,
-0.04311465471982956,
-0.09071588516235352,
0.042761608958244324,
0.051195867359638214,
0.056385498493909836,
-0.06521766632795334,
-0.10406450182199478,
0.13528093695640564,
0.1187206357717514,
-0.05173230171203613,
0.010649575851857662,
-0.061737310141325,
0.057483330368995667,
-0.05425596982240677,
-0.02631058357656002,
-0.19356074929237366,
-0.09751823544502258,
0.04895729944109917,
-0.07140830904245377,
0.029800821095705032,
0.04454620182514191,
0.07271510362625122,
0.0703461617231369,
-0.04548149183392525,
-0.0016601610695943236,
-0.07622534781694412,
-0.011798810213804245,
-0.10443686693906784,
-0.17476221919059753,
-0.07065197825431824,
-0.01833786442875862,
0.10206165909767151,
-0.22771593928337097,
0.010132052935659885,
-0.06683341413736343,
0.11381006240844727,
0.016800416633486748,
-0.056920960545539856,
-0.007083360105752945,
0.061711590737104416,
-0.005687065422534943,
-0.081071637570858,
0.055685169994831085,
0.004163632169365883,
-0.024272015318274498,
-0.08689657598733902,
-0.11635582894086838,
0.030392739921808243,
0.07595163583755493,
0.016346408054232597,
-0.10401502996683121,
0.022957488894462585,
-0.060452137142419815,
-0.03406469151377678,
-0.09518607705831528,
-0.001041003386490047,
0.17194372415542603,
0.00219068955630064,
0.13409703969955444,
-0.055771604180336,
-0.05453524366021156,
0.019428247585892677,
0.0003333158965688199,
0.0026655339170247316,
0.0638260692358017,
0.12135513126850128,
-0.0763736292719841,
0.095310278236866,
0.05580757558345795,
-0.07052722573280334,
0.1362713873386383,
-0.029797766357660294,
-0.08377590775489807,
-0.01867884024977684,
-0.0008625423652119935,
-0.011509989388287067,
0.12312780320644379,
-0.07609640061855316,
-0.00742822652682662,
0.030236102640628815,
0.025013556703925133,
0.022190915420651436,
-0.1676693856716156,
-0.001077721593901515,
0.031162425875663757,
-0.040282659232616425,
-0.03617134317755699,
-0.024463597685098648,
0.029286332428455353,
0.09581322222948074,
0.029659057036042213,
-0.024748049676418304,
0.010565273463726044,
-0.002495633438229561,
-0.07225822657346725,
0.1780218631029129,
-0.11370792239904404,
-0.16755767166614532,
-0.1267269253730774,
0.04030636325478554,
-0.06077703461050987,
-0.010960444808006287,
0.023065650835633278,
-0.07239164412021637,
-0.06061285361647606,
-0.0818757489323616,
0.00932666752487421,
-0.03505301848053932,
0.013403378427028656,
0.036611076444387436,
-0.008032813668251038,
0.08135739713907242,
-0.1353226900100708,
-0.004542794544249773,
-0.030675575137138367,
-0.0984150618314743,
-0.0015103894984349608,
0.046972211450338364,
0.09922496229410172,
0.09335824102163315,
-0.03197875991463661,
0.02571319043636322,
-0.03313397616147995,
0.22188912332057953,
-0.03715498372912407,
0.005726546980440617,
0.11619775742292404,
0.014215829782187939,
0.05359343811869621,
0.1105692982673645,
0.02582015097141266,
-0.09228390455245972,
0.02035648189485073,
0.0650375559926033,
-0.01985199563205242,
-0.20076265931129456,
-0.05666989088058472,
-0.04850957170128822,
-0.05586431547999382,
0.11466104537248611,
0.04222871735692024,
-0.03536919131875038,
0.039589811116456985,
0.005595085211098194,
0.07570141553878784,
-0.056680236011743546,
0.08755597472190857,
0.09104307740926743,
0.04467768967151642,
0.09610062837600708,
-0.03207896277308464,
-0.029375679790973663,
0.06486426293849945,
-0.04443125054240227,
0.2797042727470398,
-0.03235166147351265,
0.06901522725820541,
0.046074725687503815,
0.13290973007678986,
-0.017380425706505775,
0.07640422880649567,
0.003836238058283925,
-0.0007520249928347766,
0.001207109191454947,
-0.05951373651623726,
-0.052185408771038055,
-0.004104819614440203,
-0.017251402139663696,
0.07493153214454651,
-0.1330249309539795,
0.021077191457152367,
0.011800793930888176,
0.27195173501968384,
0.053293436765670776,
-0.3051462471485138,
-0.0916316956281662,
-0.003587219165638089,
-0.03836320713162422,
-0.06646815687417984,
0.010466406121850014,
0.14426106214523315,
-0.11739753186702728,
0.05335504189133644,
-0.05097077786922455,
0.0844942107796669,
-0.04497307538986206,
0.015062673948705196,
0.06005235016345978,
0.13244159519672394,
0.003849290544167161,
0.08494526147842407,
-0.20780526101589203,
0.23567229509353638,
0.01730518601834774,
0.12224528938531876,
-0.064432293176651,
0.022183187305927277,
0.017066147178411484,
0.0639442503452301,
0.10361521691083908,
0.012429601512849331,
-0.022274954244494438,
-0.18020497262477875,
-0.07343346625566483,
0.03674799203872681,
0.10177237540483475,
0.007477793376892805,
0.09549777209758759,
-0.043795157223939896,
-0.00704383896663785,
0.044372279196977615,
-0.033931322395801544,
-0.1571243405342102,
-0.09819428622722626,
0.007656520698219538,
0.02428482472896576,
-0.06486523896455765,
-0.05756104364991188,
-0.10892914980649948,
-0.03728931397199631,
0.17618656158447266,
0.0225999616086483,
-0.03739430755376816,
-0.13169455528259277,
0.09760040044784546,
0.11016657948493958,
-0.07930117845535278,
0.01791868358850479,
0.012376378290355206,
0.11406035721302032,
0.03917359933257103,
-0.08642058819532394,
0.0499340295791626,
-0.07184306532144547,
-0.14390814304351807,
-0.059143953025341034,
0.10568668693304062,
0.061152324080467224,
0.06213226914405823,
-0.0015951499808579683,
0.025904173031449318,
0.014053398743271828,
-0.08839450031518936,
-0.008072554133832455,
0.09700420498847961,
0.09434203803539276,
0.06208266690373421,
-0.10756734758615494,
-0.007569457869976759,
-0.008227543905377388,
0.0245591402053833,
0.1332051157951355,
0.20856806635856628,
-0.09554889798164368,
0.06257364898920059,
0.05947255343198776,
-0.09088167548179626,
-0.20300905406475067,
0.06980136781930923,
0.10427998006343842,
0.019574245437979698,
0.03489699959754944,
-0.18182237446308136,
0.12592482566833496,
0.10553205013275146,
-0.009342806413769722,
0.03988390043377876,
-0.32583147287368774,
-0.11210128664970398,
0.06442884355783463,
0.12933427095413208,
0.08156830072402954,
-0.13900350034236908,
-0.018073538318276405,
-0.033432912081480026,
-0.14697717130184174,
0.10915733873844147,
-0.038658905774354935,
0.12550345063209534,
-0.01922045834362507,
0.08419100940227509,
0.02705775946378708,
-0.05324891209602356,
0.11727791279554367,
0.04570971801877022,
0.08164827525615692,
-0.05568447709083557,
-0.001598501461558044,
0.06824829429388046,
-0.06243392825126648,
0.06863357126712799,
-0.0294262133538723,
0.054186899214982986,
-0.16050580143928528,
-0.014648650772869587,
-0.08346783369779587,
0.07405651360750198,
-0.045219387859106064,
-0.05348261445760727,
-0.04447358101606369,
0.06058952212333679,
0.07976564764976501,
-0.022899558767676353,
0.04051173850893974,
0.02324340119957924,
0.09404885768890381,
0.05696331337094307,
0.0706646591424942,
0.0016444866778329015,
-0.1078634262084961,
0.010410553775727749,
-0.00452640512958169,
0.06505227833986282,
-0.10540013760328293,
0.021426787599921227,
0.13797295093536377,
0.04673071950674057,
0.13066089153289795,
0.04696039482951164,
-0.0315687358379364,
-0.006629068870097399,
0.03963576257228851,
-0.12549138069152832,
-0.11989285796880722,
0.008672057650983334,
-0.05155128613114357,
-0.1443580836057663,
0.010005922988057137,
0.08099009096622467,
-0.0837567001581192,
-0.010463719256222248,
-0.019620589911937714,
0.019446037709712982,
-0.03303706273436546,
0.19784747064113617,
0.03043963760137558,
0.057686977088451385,
-0.08543024957180023,
0.10363715142011642,
0.06609802693128586,
-0.05411481112241745,
0.04683275148272514,
0.07393816858530045,
-0.08818051964044571,
-0.017798656597733498,
0.1066955104470253,
0.18758249282836914,
-0.0162295363843441,
-0.024125227704644203,
-0.10060027241706848,
-0.07714284956455231,
0.060057781636714935,
0.13063684105873108,
0.05560232698917389,
-0.021314268931746483,
-0.04831381142139435,
0.033310748636722565,
-0.1580902338027954,
0.07540721446275711,
0.0692208930850029,
0.06740265339612961,
-0.1274479329586029,
0.1610230952501297,
-0.0005771223804913461,
0.05148438736796379,
-0.02235260233283043,
0.030972709879279137,
-0.12002106755971909,
-0.02527831867337227,
-0.11055951565504074,
-0.004413661081343889,
-0.039308812469244,
-0.006307078059762716,
-0.015994267538189888,
-0.0376582071185112,
-0.049216531217098236,
0.041709739714860916,
-0.07651735097169876,
-0.06080877035856247,
0.02459600940346718,
0.052762530744075775,
-0.12974758446216583,
-0.012161016464233398,
0.006171273998916149,
-0.07738509774208069,
0.05732153356075287,
0.05011620372533798,
0.018871784210205078,
0.03699205070734024,
-0.08843053877353668,
-0.04550721496343613,
0.030104495584964752,
0.02429264411330223,
0.08389166742563248,
-0.07202328741550446,
0.01830315962433815,
-0.015213658101856709,
0.05749938264489174,
0.013330177403986454,
0.06930268555879593,
-0.12234562635421753,
0.017443198710680008,
-0.05535939708352089,
-0.0471155159175396,
-0.05186653137207031,
0.03302144631743431,
0.09473538398742676,
0.030494118109345436,
0.17123284935951233,
-0.09491413086652756,
0.04997989907860756,
-0.19872289896011353,
-0.03854350000619888,
-0.0033269007690250874,
-0.05014095827937126,
-0.0746481642127037,
-0.03688279539346695,
0.08854309469461441,
-0.04409398138523102,
0.09596891701221466,
0.03286992385983467,
0.0784212276339531,
0.031590837985277176,
-0.04214213415980339,
-0.03966708108782768,
-0.01246542390435934,
0.14689888060092926,
0.04470163956284523,
-0.04392022639513016,
0.09930658340454102,
0.0102123087272048,
0.07244949787855148,
0.07046875357627869,
0.24362292885780334,
0.15397702157497406,
-0.032784126698970795,
0.06839470565319061,
0.048272207379341125,
-0.10016949474811554,
-0.16405805945396423,
0.05401768162846565,
-0.016657385975122452,
0.13424834609031677,
-0.0560537688434124,
0.16647416353225708,
0.05897602438926697,
-0.17048506438732147,
0.07169532030820847,
-0.06474296003580093,
-0.11128029972314835,
-0.1254585236310959,
-0.05994153767824173,
-0.08204924315214157,
-0.09943760186433792,
0.01719498075544834,
-0.1167827844619751,
0.03176667541265488,
0.07962703704833984,
0.0028786335606127977,
-0.0025912539567798376,
0.16208530962467194,
-0.05597670003771782,
0.025608422234654427,
0.052676863968372345,
0.00910247303545475,
-0.00010041610948974267,
-0.07629787921905518,
-0.04717431962490082,
0.021175669506192207,
0.0036003340501338243,
0.07852353155612946,
-0.036086443811655045,
0.006938312668353319,
0.03170012682676315,
-0.01348194107413292,
-0.07665613293647766,
0.013855382800102234,
0.014948652125895023,
0.04614191874861717,
0.05669433996081352,
0.047372397035360336,
-0.005685428623110056,
-0.04387791082262993,
0.23398970067501068,
-0.07399441301822662,
-0.09493653476238251,
-0.11880367249250412,
0.24210427701473236,
0.06376350671052933,
-0.01990453526377678,
0.060546599328517914,
-0.10617934167385101,
-0.022958017885684967,
0.2094690352678299,
0.16445156931877136,
-0.04301370307803154,
-0.007098956033587456,
-0.01611391082406044,
-0.01415157224982977,
-0.040541719645261765,
0.1461300551891327,
0.10102222859859467,
0.0737023577094078,
-0.04722205549478531,
-0.022262126207351685,
-0.025405239313840866,
-0.01772410422563553,
-0.10005941241979599,
0.036502670496702194,
0.03920130804181099,
0.015163317322731018,
-0.03326767683029175,
0.05055960640311241,
-0.011157134547829628,
-0.19348184764385223,
0.05069497600197792,
-0.16422642767429352,
-0.16669699549674988,
-0.021733365952968597,
0.0715346410870552,
-0.0007113463361747563,
0.07030460238456726,
-0.02840009145438671,
0.00371415913105011,
0.14901527762413025,
-0.02142462134361267,
-0.06198146566748619,
-0.11814189702272415,
0.08649031817913055,
-0.07738897204399109,
0.2264963835477829,
0.00373974721878767,
0.05831548571586609,
0.10668550431728363,
0.027178719639778137,
-0.13078048825263977,
0.04213649034500122,
0.0556427463889122,
-0.07824565470218658,
0.020674988627433777,
0.1641807109117508,
-0.0592210590839386,
0.0867791473865509,
0.01979600079357624,
-0.10855861753225327,
-0.012297251261770725,
-0.052069924771785736,
-0.01703421212732792,
-0.0756978988647461,
0.0029170529451221228,
-0.062404438853263855,
0.14886368811130524,
0.22653579711914062,
-0.019150881096720695,
-0.0011228426592424512,
-0.09979116171598434,
0.03351445123553276,
0.053953103721141815,
0.0936090499162674,
-0.050205297768116,
-0.20366685092449188,
0.011553938500583172,
-0.018460778519511223,
0.033703241497278214,
-0.24095121026039124,
-0.08283066749572754,
0.02503681369125843,
-0.06043732538819313,
-0.06295827776193619,
0.09936409443616867,
0.02971530146896839,
0.043632395565509796,
-0.04056220129132271,
-0.10955049097537994,
-0.04914730414748192,
0.15296010673046112,
-0.16424524784088135,
-0.06899701058864594
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-tiny-finetuned-squad
This model is a fine-tuned version of [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
### Training results
### Framework versions
- Transformers 4.17.0
- Pytorch 1.11.0+cu113
- Datasets 2.0.0
- Tokenizers 0.11.6
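### Example configuration (illustrative)
The hyperparameters above map onto `transformers.TrainingArguments` roughly as follows. This is a minimal sketch for reference, not the original training script; the `output_dir` value is a placeholder, and whether `train_batch_size: 64` was per device or global under the multi-GPU setup is not recorded here.
```python
from transformers import TrainingArguments

# Sketch of the logged configuration. Adam with betas=(0.9, 0.999) and
# epsilon=1e-08 matches the Trainer defaults (adam_beta1, adam_beta2,
# adam_epsilon), so no explicit optimizer arguments are needed.
training_args = TrainingArguments(
    output_dir="bert-tiny-finetuned-squad",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2.0,
)
```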
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-tiny-finetuned-squad", "results": []}]}
|
question-answering
|
anas-awadalla/bert-tiny-finetuned-squad
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# bert-tiny-finetuned-squad
This model is a fine-tuned version of prajjwal1/bert-tiny on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 64
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
### Training results
### Framework versions
- Transformers 4.17.0
- Pytorch 1.11.0+cu113
- Datasets 2.0.0
- Tokenizers 0.11.6
|
[
"# bert-tiny-finetuned-squad\n\nThis model is a fine-tuned version of prajjwal1/bert-tiny on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 64\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0\n- Pytorch 1.11.0+cu113\n- Datasets 2.0.0\n- Tokenizers 0.11.6"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# bert-tiny-finetuned-squad\n\nThis model is a fine-tuned version of prajjwal1/bert-tiny on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 64\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.17.0\n- Pytorch 1.11.0+cu113\n- Datasets 2.0.0\n- Tokenizers 0.11.6"
] |
[
51,
36,
6,
12,
8,
3,
100,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# bert-tiny-finetuned-squad\n\nThis model is a fine-tuned version of prajjwal1/bert-tiny on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 64\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2.0### Training results### Framework versions\n\n- Transformers 4.17.0\n- Pytorch 1.11.0+cu113\n- Datasets 2.0.0\n- Tokenizers 0.11.6"
] |
[
-0.10047151893377304,
0.10441913455724716,
-0.001686163479462266,
0.07668919861316681,
0.1577901393175125,
0.019957955926656723,
0.0929708331823349,
0.12590736150741577,
-0.12909378111362457,
0.06386756896972656,
0.1096818596124649,
0.0654943659901619,
0.03457073122262955,
0.09272246062755585,
-0.016825295984745026,
-0.27704572677612305,
0.006979868281632662,
0.0327601283788681,
-0.08932238817214966,
0.0976361408829689,
0.12062659114599228,
-0.11490932106971741,
0.07004581391811371,
0.03691703826189041,
-0.18554769456386566,
0.01850135438144207,
-0.02964213490486145,
-0.057365696877241135,
0.09336482733488083,
0.037953343242406845,
0.106146901845932,
-0.005511973984539509,
0.1153830960392952,
-0.20429421961307526,
0.005818516947329044,
0.06513635069131851,
0.04424017667770386,
0.08531773090362549,
0.04388860613107681,
0.031325411051511765,
0.09880237281322479,
-0.11771853268146515,
0.07573486119508743,
0.033162884414196014,
-0.07905358821153641,
-0.19511911273002625,
-0.08891517668962479,
0.10705619305372238,
0.08161245286464691,
0.08937803655862808,
-0.00471047917380929,
0.1574348658323288,
-0.10040797293186188,
0.06081047281622887,
0.19342276453971863,
-0.30123603343963623,
-0.09422994405031204,
0.06161249801516533,
0.06983590871095657,
0.03938383609056473,
-0.10608243942260742,
-0.021055253222584724,
0.06551811099052429,
0.0478571318089962,
0.07901550829410553,
-0.012143395841121674,
-0.04367810860276222,
0.013666207902133465,
-0.14044153690338135,
-0.03567369654774666,
0.1910402923822403,
0.04282834380865097,
-0.05834348499774933,
-0.10130579024553299,
-0.029345031827688217,
-0.08366233110427856,
-0.020801154896616936,
-0.046060770750045776,
0.022555818781256676,
-0.042612507939338684,
-0.11239435523748398,
-0.04458718001842499,
-0.10132695734500885,
-0.07505200803279877,
0.0006445765611715615,
0.12050352245569229,
0.056698478758335114,
0.018406691029667854,
-0.05695248022675514,
0.10590437799692154,
-0.007311569061130285,
-0.11745886504650116,
-0.032734304666519165,
-0.012720571830868721,
-0.03058731183409691,
-0.07372423261404037,
-0.0580412782728672,
0.0008378649363294244,
0.015245258808135986,
0.1603892296552658,
-0.045539505779743195,
0.04750559478998184,
0.05468377470970154,
0.011451886966824532,
0.0036877449601888657,
0.14320982992649078,
-0.06504514813423157,
0.0035357919987291098,
-0.008073593489825726,
0.0869024395942688,
-0.011517680250108242,
0.01229250617325306,
-0.07645586878061295,
-0.008018717169761658,
0.08666753768920898,
0.04076065868139267,
-0.06057155504822731,
0.03211414813995361,
-0.017260875552892685,
-0.03743633255362511,
0.013783276081085205,
-0.09853966534137726,
0.03514324873685837,
-0.007521762512624264,
-0.0878591388463974,
0.03731900081038475,
-0.01599697396159172,
0.006796061992645264,
-0.034722644835710526,
0.10344339162111282,
-0.10431092232465744,
-0.011723987758159637,
-0.0940055176615715,
-0.06883648037910461,
0.017004938796162605,
-0.04354260489344597,
0.0029499996453523636,
-0.08869325369596481,
-0.15332767367362976,
-0.03380655124783516,
0.03893473371863365,
-0.05121341347694397,
-0.06920599192380905,
-0.020325472578406334,
-0.08085831999778748,
0.011836778372526169,
-0.014486006461083889,
0.1376689076423645,
-0.03974801301956177,
0.08317997306585312,
0.040696680545806885,
0.027926545590162277,
0.01850993186235428,
0.0373641736805439,
-0.07260533422231674,
0.03194484859704971,
-0.09732279926538467,
0.048734571784734726,
-0.0956282988190651,
0.02957378514111042,
-0.1210988461971283,
-0.12478155642747879,
0.01595746912062168,
0.0019456101581454277,
0.07197257876396179,
0.10067737102508545,
-0.13897064328193665,
-0.0474417619407177,
0.16840127110481262,
-0.07410813868045807,
-0.0995231568813324,
0.11276219040155411,
-0.03931102156639099,
0.04564712196588516,
0.06412792950868607,
0.1044873371720314,
0.1180121973156929,
-0.12717409431934357,
-0.035391923040151596,
0.023550812155008316,
0.08008618652820587,
-0.021461982280015945,
0.08792738616466522,
0.009711452759802341,
0.001730801071971655,
0.024863671511411667,
-0.07684899121522903,
-0.005572714377194643,
-0.11129935830831528,
-0.07825277000665665,
-0.04584409296512604,
-0.09371078759431839,
0.019389333203434944,
0.041469622403383255,
0.07326672971248627,
-0.0793958306312561,
-0.11090944707393646,
0.13134333491325378,
0.14135943353176117,
-0.0457444004714489,
0.002853857818990946,
-0.0954970046877861,
0.062936931848526,
-0.06356889009475708,
-0.031290628015995026,
-0.17979510128498077,
-0.08411945402622223,
0.03462211415171623,
-0.04394686967134476,
0.014799225144088268,
0.05732766166329384,
0.0678999200463295,
0.06076538562774658,
-0.043792735785245895,
-0.008599997498095036,
-0.10932735353708267,
-0.02302391827106476,
-0.10982419550418854,
-0.1587621420621872,
-0.08798643946647644,
-0.03367951139807701,
0.15327851474285126,
-0.23343175649642944,
0.018479473888874054,
-0.011150223203003407,
0.13685978949069977,
0.022016944363713264,
-0.041112739592790604,
-0.03149106353521347,
0.05667312815785408,
-0.019062595441937447,
-0.07096385210752487,
0.059468671679496765,
0.015109418891370296,
-0.04464182257652283,
-0.08781597018241882,
-0.07390519231557846,
0.07154683768749237,
0.09670175611972809,
0.010034395381808281,
-0.08058590441942215,
-0.01930742710828781,
-0.08128098398447037,
-0.028098292648792267,
-0.08515872061252594,
-0.014459917321801186,
0.15661850571632385,
0.003674035659059882,
0.1392173022031784,
-0.07346659153699875,
-0.07386268675327301,
0.019799191504716873,
-0.015860212966799736,
-0.015703653916716576,
0.073992520570755,
0.09854604303836823,
-0.0972679853439331,
0.11201728135347366,
0.07158306241035461,
-0.06486549228429794,
0.14942623674869537,
-0.056378915905952454,
-0.0958557203412056,
-0.02072724513709545,
0.018895868211984634,
-0.005068584810942411,
0.11815597862005234,
-0.1230461522936821,
-0.0178862102329731,
0.039421338587999344,
0.0221762303262949,
0.05171364173293114,
-0.18882067501544952,
-0.016315260902047157,
0.025183988735079765,
-0.02810942940413952,
-0.01999656669795513,
-0.016963204368948936,
0.005551359616219997,
0.08841091394424438,
0.041624341160058975,
0.0004097841738257557,
0.024464042857289314,
0.004103188402950764,
-0.06713756918907166,
0.19259990751743317,
-0.08335786312818527,
-0.14463557302951813,
-0.1460161805152893,
0.045326199382543564,
-0.05150095745921135,
-0.01328320149332285,
0.029604574665427208,
-0.09652481228113174,
-0.041368432343006134,
-0.06541906297206879,
0.026011906564235687,
-0.04906214028596878,
-0.005275583826005459,
0.04885365441441536,
0.014406032860279083,
0.07662234455347061,
-0.14847348630428314,
0.02098758891224861,
-0.029000626876950264,
-0.09956862777471542,
-0.0002906567242462188,
0.0360584631562233,
0.08822765201330185,
0.09859849512577057,
-0.004990830086171627,
0.025604235008358955,
-0.031655605882406235,
0.25342848896980286,
-0.06238308921456337,
-0.036522626876831055,
0.13766533136367798,
0.02130221202969551,
0.03595825657248497,
0.06705817580223083,
0.04153023287653923,
-0.09696418792009354,
0.029504146426916122,
0.07375138998031616,
-0.015164602547883987,
-0.23423974215984344,
-0.03185971453785896,
-0.040306445211172104,
-0.03519342839717865,
0.11869567632675171,
0.047490451484918594,
-0.02149185538291931,
0.08932686597108841,
-0.01937483809888363,
0.06993945688009262,
-0.0649101510643959,
0.09530840069055557,
0.09419593214988708,
0.03341536968946457,
0.11081969738006592,
-0.041482098400592804,
-0.060441698879003525,
0.0629323273897171,
-0.0013542938977479935,
0.24328961968421936,
-0.00903354026377201,
0.10109807550907135,
0.05889474228024483,
0.17138266563415527,
-0.01014798879623413,
0.05422516539692879,
-0.012536614201962948,
-0.027253028005361557,
0.0017296257428824902,
-0.04639890789985657,
-0.04056793823838234,
0.017802562564611435,
-0.0032699101138859987,
0.06747754663228989,
-0.11512669920921326,
0.019144883379340172,
0.013270595110952854,
0.2936836779117584,
0.05427630990743637,
-0.291949063539505,
-0.11515958607196808,
0.01792166195809841,
-0.05530759319663048,
-0.060532622039318085,
0.02729872055351734,
0.1496218591928482,
-0.13123410940170288,
0.020853452384471893,
-0.060727111995220184,
0.10636866092681885,
-0.029859723523259163,
0.007946562953293324,
0.08154062181711197,
0.11118336021900177,
0.006567210890352726,
0.09944523125886917,
-0.21405352652072906,
0.20887257158756256,
0.006960866041481495,
0.12167637050151825,
-0.07145887613296509,
0.03356916084885597,
-0.00009709385631140321,
0.05334391072392464,
0.08203089237213135,
-0.002640239428728819,
-0.01256086491048336,
-0.16748495399951935,
-0.04971173405647278,
0.04978494718670845,
0.10990632325410843,
-0.04423372820019722,
0.10982540994882584,
-0.04804109036922455,
0.021737074479460716,
0.04097054898738861,
-0.008647463284432888,
-0.14050057530403137,
-0.1173412874341011,
0.01071128062903881,
-0.0013241693377494812,
-0.03316045179963112,
-0.0899139791727066,
-0.11108197271823883,
-0.05266008898615837,
0.17257919907569885,
-0.014781692996621132,
-0.027394933626055717,
-0.12663157284259796,
0.10159377753734589,
0.10829430818557739,
-0.06559591740369797,
0.008179237134754658,
0.009185655042529106,
0.13636696338653564,
0.03900834545493126,
-0.0606561042368412,
0.04959578439593315,
-0.07614342868328094,
-0.1829896867275238,
-0.040627822279930115,
0.13531050086021423,
0.04141569882631302,
0.03813159093260765,
0.010676738806068897,
0.01058189943432808,
-0.006336333695799112,
-0.09815045446157455,
0.011043122038245201,
0.034909818321466446,
0.08339400589466095,
0.03371935337781906,
-0.07339594513177872,
0.034390684217214584,
-0.023897508159279823,
-0.006154888309538364,
0.14645908772945404,
0.21954764425754547,
-0.07877229154109955,
0.014031744562089443,
0.0688956081867218,
-0.059673018753528595,
-0.17567089200019836,
0.052674971520900726,
0.11859147995710373,
0.02221887744963169,
0.004744200501590967,
-0.20263518393039703,
0.14721184968948364,
0.13046982884407043,
-0.017617549747228622,
0.04544129967689514,
-0.3205985724925995,
-0.11935626715421677,
0.07238849997520447,
0.11889579147100449,
0.0545276515185833,
-0.1506822258234024,
-0.03541375324130058,
-0.022638263180851936,
-0.19724005460739136,
0.14159797132015228,
-0.12119682133197784,
0.11320210248231888,
-0.006785290781408548,
0.09877757728099823,
0.016603665426373482,
-0.04973028600215912,
0.14991261065006256,
0.0400337390601635,
0.06638757139444351,
-0.04077250137925148,
-0.01842535100877285,
0.08394402265548706,
-0.03645751252770424,
0.04266571253538132,
-0.030732247978448868,
0.0598960816860199,
-0.18218515813350677,
-0.03605147451162338,
-0.07898343354463577,
0.053224921226501465,
-0.06068573147058487,
-0.06781338155269623,
-0.048430994153022766,
0.061872806400060654,
0.027993187308311462,
-0.022038642317056656,
0.1026989221572876,
0.026102453470230103,
0.1415822058916092,
0.0826893076300621,
0.09725984930992126,
-0.011761638335883617,
-0.09778487682342529,
-0.015580051578581333,
-0.015846561640501022,
0.08108493685722351,
-0.12688866257667542,
0.021792391315102577,
0.12475868314504623,
0.05683359131217003,
0.13235270977020264,
0.0542106032371521,
-0.05441436544060707,
0.014355012215673923,
0.04175738990306854,
-0.10679768025875092,
-0.16350823640823364,
-0.009821292944252491,
-0.0073730237782001495,
-0.15173009037971497,
0.053692158311605453,
0.10505750775337219,
-0.06809715926647186,
-0.0314398892223835,
-0.01955125667154789,
-0.006420218385756016,
-0.05219230055809021,
0.19685521721839905,
0.04901653155684471,
0.06899465620517731,
-0.0843261331319809,
0.08609173446893692,
0.05147262290120125,
-0.08588578552007675,
0.024249598383903503,
0.05036197230219841,
-0.07143453508615494,
-0.030139567330479622,
0.06009585037827492,
0.13523687422275543,
-0.05318276211619377,
-0.036529291421175,
-0.09028531610965729,
-0.10431913286447525,
0.0671461820602417,
0.06335604190826416,
0.07404226809740067,
-0.013383669778704643,
-0.04808373376727104,
0.05262446776032448,
-0.1546144038438797,
0.08058803528547287,
0.05663880705833435,
0.07198881357908249,
-0.14643532037734985,
0.14109894633293152,
-0.00386550254188478,
0.044458698481321335,
-0.015866391360759735,
0.010315530933439732,
-0.11918655782938004,
-0.013387413695454597,
-0.1722337156534195,
-0.007801779545843601,
-0.05317562445998192,
0.0036146536003798246,
0.000618124264292419,
-0.053049687296152115,
-0.06294756382703781,
0.0369061678647995,
-0.08654426038265228,
-0.04611532762646675,
0.02691986784338951,
0.0648043304681778,
-0.11969362944364548,
0.026990404352545738,
0.024979956448078156,
-0.09066399186849594,
0.07749155163764954,
0.05154228210449219,
0.029990341514348984,
0.04241200536489487,
-0.052556879818439484,
-0.01995769888162613,
0.015383802354335785,
0.023415159434080124,
0.07236111164093018,
-0.07904442399740219,
0.015331314876675606,
-0.01612759381532669,
0.07091895490884781,
0.019770987331867218,
0.05908619612455368,
-0.12815378606319427,
-0.033742550760507584,
-0.037837136536836624,
-0.06302402168512344,
-0.0710291787981987,
0.030550021678209305,
0.09168583899736404,
0.03835657611489296,
0.182097926735878,
-0.0752282366156578,
0.0237113144248724,
-0.19950032234191895,
-0.01873978041112423,
-0.008306968957185745,
-0.04819261655211449,
-0.0646461546421051,
-0.03088214248418808,
0.06427411735057831,
-0.05464615300297737,
0.1104489266872406,
-0.021356457844376564,
0.0683983713388443,
0.04383634775876999,
-0.05577106028795242,
0.0077316719107329845,
0.01350216381251812,
0.21823348104953766,
0.070076584815979,
-0.010281570255756378,
0.043941155076026917,
0.0009009025525301695,
0.058774709701538086,
0.06011972576379776,
0.1854664385318756,
0.15560905635356903,
-0.04836452379822731,
0.07781139016151428,
0.06938528269529343,
-0.08441388607025146,
-0.14540088176727295,
0.06755581498146057,
-0.02162926085293293,
0.09964369237422943,
-0.051331643015146255,
0.14876681566238403,
0.11657091230154037,
-0.17124149203300476,
0.03287065774202347,
-0.08589164167642593,
-0.09571000933647156,
-0.10849820077419281,
-0.05671299993991852,
-0.08008850365877151,
-0.12512899935245514,
0.030090518295764923,
-0.1382676064968109,
0.0060273706912994385,
0.10161537677049637,
0.013962406665086746,
-0.013236326165497303,
0.1416308581829071,
-0.02227729558944702,
0.005971586797386408,
0.06266531348228455,
0.0018749834271147847,
0.009351130574941635,
-0.08971764892339706,
-0.05092455819249153,
0.023951034992933273,
0.0002411387104075402,
0.08369988203048706,
-0.0498717799782753,
-0.020707931369543076,
0.023725366219878197,
-0.009080296382308006,
-0.06974256783723831,
0.0070305559784173965,
0.0249579269438982,
0.04561770334839821,
0.028047211468219757,
0.0202537402510643,
0.00161594373639673,
-0.03834382817149162,
0.2619609832763672,
-0.07634956389665604,
-0.08141403645277023,
-0.1593601405620575,
0.21497635543346405,
0.026189438998699188,
0.001429842901416123,
0.07731277495622635,
-0.09541086107492447,
-0.019468501210212708,
0.21025799214839935,
0.15848156809806824,
-0.06114349141716957,
-0.02087409980595112,
0.01927962712943554,
-0.017028771340847015,
-0.06395592540502548,
0.13012497127056122,
0.11829783022403717,
0.06271999329328537,
-0.06353291869163513,
-0.028491323813796043,
-0.013532343320548534,
-0.031835418194532394,
-0.0626143217086792,
0.06908062100410461,
0.03333652392029762,
-0.0020415806211531162,
-0.04563062638044357,
0.060455720871686935,
-0.03254379704594612,
-0.14416660368442535,
0.04843306541442871,
-0.12352368980646133,
-0.18787595629692078,
-0.031442612409591675,
0.0653943344950676,
-0.004578650929033756,
0.08548994362354279,
-0.020261777564883232,
-0.009891982190310955,
0.1310425102710724,
-0.011446631513535976,
-0.060581982135772705,
-0.09759306162595749,
0.13280849158763885,
-0.055924251675605774,
0.2105790674686432,
-0.010576268658041954,
0.0738014504313469,
0.1259901523590088,
0.014425008557736874,
-0.13145042955875397,
0.008824467658996582,
0.07280968129634857,
-0.08050195872783661,
0.01771186850965023,
0.1363714337348938,
-0.038136076182127,
0.09179943054914474,
0.038145434111356735,
-0.16790570318698883,
-0.017930755391716957,
-0.0170981977134943,
0.0072108069434762,
-0.10212347656488419,
0.012321507558226585,
-0.08227484673261642,
0.14537295699119568,
0.22169943153858185,
-0.05150991305708885,
-0.020472876727581024,
-0.07448222488164902,
0.06224077194929123,
0.053859893232584,
0.09055310487747192,
-0.03514475002884865,
-0.23567324876785278,
-0.0012208069674670696,
-0.025965271517634392,
-0.0016332302475348115,
-0.26393845677375793,
-0.07183686643838882,
0.050415098667144775,
-0.04290909692645073,
-0.0488017313182354,
0.09230375289916992,
0.07887163758277893,
0.047386594116687775,
-0.04611595720052719,
-0.10144944489002228,
-0.08161783963441849,
0.14580833911895752,
-0.13646799325942993,
-0.06562451273202896
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-1024-finetuned-squad-seed-0
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
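### Example data subsampling (illustrative)
The `few-shot-k-1024 ... seed-0` naming suggests training on a 1024-example subset of SQuAD drawn with seed 0. The exact sampling procedure is not documented here; the snippet below is only one plausible way to draw such a subset with the `datasets` library, shown as an assumption rather than the authors' method.
```python
from datasets import load_dataset

# Hypothetical reconstruction: shuffle SQuAD's training split with the
# seed from the model name and keep the first k = 1024 examples.
k, seed = 1024, 0
train_subset = load_dataset("squad", split="train").shuffle(seed=seed).select(range(k))
print(len(train_subset))  # 1024
```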
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-1024-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-1024-finetuned-squad-seed-0
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-1024-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07839290797710419,
0.10040147602558136,
-0.0026020014192909002,
0.0767892375588417,
0.13187457621097565,
0.0338105745613575,
0.10717862099409103,
0.12349247932434082,
-0.12725725769996643,
0.06837628036737442,
0.09249667078256607,
0.08599116653203964,
0.03349749371409416,
0.1307498663663864,
-0.039742663502693176,
-0.2316565215587616,
0.008515499532222748,
-0.017045199871063232,
-0.0658365935087204,
0.10207400470972061,
0.08847477287054062,
-0.09731566160917282,
0.08143222332000732,
-0.002681369660422206,
-0.1842210441827774,
0.02734282985329628,
-0.02109898068010807,
-0.051270242780447006,
0.09856371581554413,
-0.004617601167410612,
0.0827663242816925,
0.0068703629076480865,
0.1184915155172348,
-0.19260503351688385,
0.013520503416657448,
0.0734999030828476,
0.03256107121706009,
0.09532392770051956,
0.02073756419122219,
-0.003862750716507435,
0.1658039689064026,
-0.1344180405139923,
0.09986226260662079,
0.029588785022497177,
-0.0872376561164856,
-0.16998521983623505,
-0.09676899760961533,
0.01643209531903267,
0.03568662330508232,
0.08813085407018661,
0.008415351621806622,
0.1781684160232544,
-0.09182095527648926,
0.08021606504917145,
0.2389495074748993,
-0.27752459049224854,
-0.07858394086360931,
0.04331827163696289,
0.04961766302585602,
0.07793421298265457,
-0.11904466152191162,
-0.01571093685925007,
0.020882314071059227,
0.029618162661790848,
0.09225510060787201,
-0.024934161454439163,
-0.08464393764734268,
-0.0069521949626505375,
-0.11898276954889297,
0.003873489797115326,
0.11340601742267609,
0.04482133314013481,
-0.05048747360706329,
-0.0506938211619854,
-0.057776231318712234,
-0.0742797702550888,
-0.0342695452272892,
-0.034661486744880676,
0.043262094259262085,
-0.05671168863773346,
-0.12068501114845276,
-0.03356906771659851,
-0.04839744418859482,
-0.07752041518688202,
-0.02054368518292904,
0.20469290018081665,
0.048513151705265045,
0.032817911356687546,
-0.05161401256918907,
0.08198737353086472,
0.011566976085305214,
-0.12916049361228943,
-0.030316518619656563,
0.002740905387327075,
-0.0800829753279686,
-0.0386672206223011,
-0.055249523371458054,
0.015523442067205906,
0.04323913902044296,
0.21398651599884033,
-0.05373941361904144,
0.08355483412742615,
0.0312372874468565,
-0.020061064511537552,
-0.020996445789933205,
0.12509123980998993,
-0.02029295451939106,
-0.07896685600280762,
0.028271574527025223,
0.05778408795595169,
0.025079334154725075,
0.0017964477883651853,
-0.056108973920345306,
-0.0323033332824707,
0.08336780965328217,
0.03228024020791054,
-0.05977940931916237,
0.025930456817150116,
0.002390463137999177,
-0.01937483251094818,
0.006224527955055237,
-0.11398684233427048,
0.017370980232954025,
-0.006425009109079838,
-0.07405875623226166,
-0.014646251685917377,
0.01053900457918644,
-0.015808574855327606,
0.010288916528224945,
0.09980709105730057,
-0.08953619003295898,
-0.022128969430923462,
-0.07584605365991592,
-0.07046966999769211,
-0.0008335163001902401,
-0.14848698675632477,
0.012703810818493366,
-0.07364411652088165,
-0.15463349223136902,
-0.034013886004686356,
0.04277687147259712,
-0.07085391879081726,
-0.0278532262891531,
-0.03859875351190567,
-0.07727445662021637,
0.026186570525169373,
0.003103397088125348,
0.1870780885219574,
-0.0529368557035923,
0.08025332540273666,
0.01931452751159668,
0.05030602589249611,
-0.022266024723649025,
0.03097325749695301,
-0.08863802254199982,
0.004209475591778755,
-0.16744990646839142,
0.061304934322834015,
-0.0758587047457695,
0.01721116714179516,
-0.12939096987247467,
-0.086798295378685,
-0.02688921056687832,
-0.02831815369427204,
0.08337132632732391,
0.10129451006650925,
-0.13877946138381958,
-0.027526771649718285,
0.11442960053682327,
-0.07756934314966202,
-0.05868307873606682,
0.0666746199131012,
-0.06717648357152939,
0.051427073776721954,
0.05753794312477112,
0.1829075664281845,
0.07255235314369202,
-0.11715236306190491,
-0.02298741228878498,
0.0005343612283468246,
0.02835788205265999,
-0.010266796685755253,
0.05067776143550873,
0.009397093206644058,
0.023301174864172935,
0.01602841541171074,
-0.03775850310921669,
0.0031856787391006947,
-0.09744174778461456,
-0.06257680058479309,
-0.05760643631219864,
-0.0817771852016449,
-0.03674999624490738,
0.013284478336572647,
0.037391580641269684,
-0.08115575462579727,
-0.08469657599925995,
0.08174511045217514,
0.1415146440267563,
-0.043168094009160995,
0.021362530067563057,
-0.07324288785457611,
0.02054334431886673,
-0.05285545438528061,
-0.03096715919673443,
-0.2044067680835724,
-0.06543150544166565,
0.03248688578605652,
-0.02665981836616993,
0.056147363036870956,
0.0034906112123280764,
0.07254915684461594,
0.03738412633538246,
-0.03730112686753273,
0.005821467377245426,
-0.09001988172531128,
-0.006163017358630896,
-0.09336762875318527,
-0.22228321433067322,
-0.03849262371659279,
-0.03095233626663685,
0.128301203250885,
-0.16390524804592133,
-0.008262615650892258,
-0.03246210888028145,
0.12040139734745026,
0.030233953148126602,
-0.06365353614091873,
-0.015241095796227455,
0.03037160076200962,
0.0015411644708365202,
-0.09292076528072357,
0.03176833316683769,
0.017264513298869133,
-0.07235458493232727,
-0.057721156626939774,
-0.117962546646595,
0.005702725145965815,
0.079527847468853,
0.06321550905704498,
-0.09785053879022598,
0.006072587799280882,
-0.06501951068639755,
-0.033160123974084854,
-0.05705547705292702,
0.04012204706668854,
0.17604580521583557,
0.006882655434310436,
0.10741596668958664,
-0.07924424111843109,
-0.07390854507684708,
0.021759718656539917,
0.005395143758505583,
0.042550768703222275,
0.09152007848024368,
0.1107153445482254,
-0.1267699897289276,
0.06585091352462769,
0.08175598084926605,
-0.06111617758870125,
0.12543796002864838,
-0.03836941719055176,
-0.07988378405570984,
-0.03504659980535507,
-0.0184178464114666,
-0.013770808465778828,
0.1370290368795395,
-0.05276646837592125,
0.024981295689940453,
0.030507726594805717,
0.03953920677304268,
0.02020210772752762,
-0.15211161971092224,
-0.0029271431267261505,
0.009044129401445389,
-0.04292230308055878,
-0.017225317656993866,
0.02237674966454506,
0.019334381446242332,
0.09739287942647934,
0.03444338217377663,
-0.015596030279994011,
-0.007359963376075029,
-0.004052153322845697,
-0.05270509049296379,
0.1903393268585205,
-0.0906919315457344,
-0.04165501520037651,
-0.07781557738780975,
-0.0010427441447973251,
-0.03762088343501091,
-0.04188108816742897,
0.02803593873977661,
-0.08801589161157608,
-0.03888573870062828,
-0.0739477202296257,
-0.0020199620630592108,
-0.04775642231106758,
0.025886723771691322,
0.030051618814468384,
0.00456268573179841,
0.06108753755688667,
-0.1352079063653946,
0.004410086665302515,
-0.0727168470621109,
-0.10469431430101395,
0.01621825061738491,
0.06353534013032913,
0.0912233293056488,
0.058630481362342834,
-0.028685761615633965,
0.0222913920879364,
-0.031815774738788605,
0.25312361121177673,
-0.05651918426156044,
-0.001512865419499576,
0.10771351307630539,
0.023990575224161148,
0.045452605932950974,
0.09275035560131073,
0.034865766763687134,
-0.10211307555437088,
0.029932651668787003,
0.08538265526294708,
-0.03494918346405029,
-0.239464670419693,
-0.005426750052720308,
-0.03541893884539604,
-0.11337142437696457,
0.08317967504262924,
0.05072414502501488,
-0.039841677993535995,
0.06397836655378342,
0.00841299444437027,
0.023453906178474426,
-0.04973483085632324,
0.09369343519210815,
0.10014273226261139,
0.07370416820049286,
0.10190390795469284,
-0.04784025996923447,
-0.019886046648025513,
0.06975258141756058,
-0.005576256196945906,
0.293157696723938,
-0.02614900842308998,
0.06964129954576492,
0.05347675830125809,
0.13807526230812073,
-0.02193751372396946,
0.036365218460559845,
0.006522177252918482,
-0.004189241677522659,
-0.025410616770386696,
-0.05789407342672348,
-0.02863500639796257,
-0.0013969294959679246,
-0.07810098677873611,
0.05557052791118622,
-0.06140986084938049,
0.06254077702760696,
0.02062060311436653,
0.26135680079460144,
-0.002683020429685712,
-0.28113287687301636,
-0.07921171933412552,
-0.022053753957152367,
-0.03874930366873741,
-0.04552493244409561,
0.012368898838758469,
0.0983218103647232,
-0.1045011579990387,
0.05189075693488121,
-0.056722115725278854,
0.08140844106674194,
-0.02812632918357849,
-0.0046783084981143475,
0.03052421286702156,
0.18210183084011078,
-0.016588857397437096,
0.04909621924161911,
-0.1994953453540802,
0.216246098279953,
0.018201328814029694,
0.13304473459720612,
-0.05033222213387489,
0.008294920437037945,
0.023747557774186134,
-0.0007350771338678896,
0.07633481919765472,
-0.0055730449967086315,
-0.07437612116336823,
-0.1256931722164154,
-0.07464917749166489,
0.07956452667713165,
0.13966140151023865,
-0.016165152192115784,
0.10244238376617432,
-0.049537673592567444,
0.019147425889968872,
0.04044171795248985,
-0.06785240769386292,
-0.15593327581882477,
-0.09747891128063202,
-0.01630968414247036,
0.03697005659341812,
-0.09208932518959045,
-0.04698064178228378,
-0.07538168877363205,
-0.011460105888545513,
0.11596307903528214,
0.02382081001996994,
-0.01926012523472309,
-0.13747833669185638,
0.08681868016719818,
0.14979220926761627,
-0.07303944230079651,
0.024438396096229553,
-0.007483903784304857,
0.06476334482431412,
0.04294167459011078,
-0.09367740899324417,
0.04739735275506973,
-0.05735950917005539,
-0.1611415296792984,
-0.047078315168619156,
0.0910760909318924,
0.07253993302583694,
0.039876457303762436,
-0.0046683745458722115,
0.05084788054227829,
-0.021676640957593918,
-0.09996051341295242,
0.012648474425077438,
0.03754901885986328,
0.05090729892253876,
0.03582929074764252,
-0.08203883469104767,
0.05991366133093834,
-0.033070288598537445,
-0.005869539454579353,
0.11330936849117279,
0.2431897222995758,
-0.08948231488466263,
0.086145780980587,
0.05705966055393219,
-0.06770708411931992,
-0.142137810587883,
0.0629468634724617,
0.10361591726541519,
-0.0008360369829460979,
0.05601397901773453,
-0.1945982724428177,
0.14043597877025604,
0.11480838060379028,
-0.012441698461771011,
0.03872417286038399,
-0.2727086544036865,
-0.11853276193141937,
0.0597168393433094,
0.1316683292388916,
0.11953864246606827,
-0.1341419816017151,
-0.013626452535390854,
-0.01798405312001705,
-0.12732505798339844,
0.10758428275585175,
-0.11352250725030899,
0.1342424899339676,
-0.03370572626590729,
0.10871128737926483,
0.005262936465442181,
-0.03027486428618431,
0.10816767066717148,
0.04951129108667374,
0.09747393429279327,
-0.042596235871315,
0.01111266016960144,
0.060094840824604034,
-0.04864427447319031,
0.01335013099014759,
-0.07194177806377411,
0.082926444709301,
-0.12138836830854416,
-0.007106054108589888,
-0.07866387069225311,
0.05077033117413521,
-0.036933258175849915,
-0.0527067705988884,
-0.053997136652469635,
0.03667420148849487,
0.05647891014814377,
-0.03676113113760948,
0.055555764585733414,
-0.0014194573741406202,
0.09318364411592484,
0.02278883010149002,
0.06814694404602051,
-0.0008320464985445142,
-0.04854697361588478,
0.020318172872066498,
-0.009753894060850143,
0.06053655594587326,
-0.1385958194732666,
0.0054748570546507835,
0.1059432402253151,
0.05221761390566826,
0.09751544892787933,
0.043507181107997894,
-0.04749462753534317,
0.011940151453018188,
0.038127221167087555,
-0.11386335641145706,
-0.10024270415306091,
0.048376038670539856,
-0.040351271629333496,
-0.1387064903974533,
0.04583805426955223,
0.11413039267063141,
-0.048493750393390656,
-0.02259608544409275,
-0.018029820173978806,
0.006375768221914768,
-0.021737467497587204,
0.1845378577709198,
0.04231279715895653,
0.04078283905982971,
-0.10217037796974182,
0.12999697029590607,
0.030068131163716316,
-0.02283567003905773,
0.05932841822504997,
0.08534426987171173,
-0.09471634775400162,
0.0027753233443945646,
0.09347745776176453,
0.1751978099346161,
-0.07129640132188797,
-0.01523328386247158,
-0.10405495017766953,
-0.0705350786447525,
0.061894603073596954,
0.15959006547927856,
0.05591485649347305,
-0.016775265336036682,
-0.04998815059661865,
0.04173224791884422,
-0.14170783758163452,
0.061512961983680725,
0.032565806061029434,
0.06992559134960175,
-0.0873696580529213,
0.05891580134630203,
0.007899190299212933,
0.003625989891588688,
-0.01673797331750393,
0.013953674584627151,
-0.09305468946695328,
-0.02893771231174469,
-0.07604765892028809,
0.010651993565261364,
-0.01336622517555952,
0.016028722748160362,
-0.010175925679504871,
-0.0682191401720047,
-0.06991801410913467,
0.03635216876864433,
-0.07709942758083344,
-0.052755746990442276,
0.012452048249542713,
0.04274865239858627,
-0.13395395874977112,
0.006392099894583225,
0.01594160869717598,
-0.08974496275186539,
0.08561913669109344,
0.08989871293306351,
0.026278268545866013,
0.03349792957305908,
-0.1305071860551834,
-0.03249185532331467,
0.013650506734848022,
0.0017342130886390805,
0.0651046484708786,
-0.09644884616136551,
-0.004054935183376074,
-0.021191006526350975,
0.07527746260166168,
0.011002589017152786,
0.08125048130750656,
-0.13120946288108826,
0.008507528342306614,
-0.08329000324010849,
-0.04329854995012283,
-0.065910704433918,
0.01600133813917637,
0.1024523377418518,
0.053003475069999695,
0.16270853579044342,
-0.07706914097070694,
0.01882566511631012,
-0.20856967568397522,
-0.0276393610984087,
-0.0054622613824903965,
-0.0532669723033905,
-0.13529539108276367,
-0.03993338719010353,
0.07711835205554962,
-0.038582660257816315,
0.10042440891265869,
-0.022091098129749298,
0.060773227363824844,
0.03951001912355423,
-0.03199350833892822,
-0.06351178139448166,
-0.029045049101114273,
0.19824160635471344,
0.07782670855522156,
-0.016047228127717972,
0.1076679602265358,
-0.005533072166144848,
0.05204212665557861,
0.029840441420674324,
0.20375312864780426,
0.20622844994068146,
0.005970880389213562,
0.07020926475524902,
0.06063377112150192,
-0.08151860535144806,
-0.06742750108242035,
0.178653746843338,
-0.02665477991104126,
0.07224993407726288,
-0.02922075428068638,
0.18776385486125946,
0.11384011059999466,
-0.15109406411647797,
0.03179248794913292,
-0.03165822476148605,
-0.07658365368843079,
-0.1406024694442749,
0.001290812506340444,
-0.09622848033905029,
-0.11842317134141922,
0.04512028023600578,
-0.12007013708353043,
0.05638077110052109,
0.08040690422058105,
0.01318612601608038,
0.03384881466627121,
0.12662461400032043,
-0.02631462551653385,
0.004306369461119175,
0.040255725383758545,
0.007467054761946201,
-0.029430389404296875,
-0.04251401871442795,
-0.07689718157052994,
0.05030199885368347,
0.0061344243586063385,
0.08827453851699829,
-0.04416654258966446,
-0.009230561554431915,
0.042057476937770844,
-0.028988240286707878,
-0.07660653442144394,
0.025202617049217224,
0.0367162860929966,
0.05652669817209244,
0.04738342761993408,
0.044507283717393875,
-0.006261172704398632,
-0.03305346891283989,
0.2806588411331177,
-0.05862996727228165,
-0.09665626287460327,
-0.11487144231796265,
0.20474034547805786,
0.04114154353737831,
-0.030015937983989716,
0.03786754980683327,
-0.08355828374624252,
-0.011701710522174835,
0.15623115003108978,
0.15601646900177002,
-0.06086764857172966,
-0.024090010672807693,
-0.011307193897664547,
-0.01660405471920967,
-0.03962250426411629,
0.11397874355316162,
0.09516459703445435,
0.0023644957691431046,
-0.05297841876745224,
-0.02658740244805813,
-0.03549858182668686,
-0.014608890749514103,
-0.041347406804561615,
0.025269929319620132,
0.014371499419212341,
-0.022944319993257523,
-0.034607045352458954,
0.06278715282678604,
-0.0012520031305029988,
-0.2431611269712448,
0.06241032853722572,
-0.14451706409454346,
-0.16823260486125946,
-0.025070631876587868,
0.04852493479847908,
-0.010233016684651375,
0.050207968801259995,
-0.022198928520083427,
-0.004929376300424337,
0.08128216862678528,
-0.01977778598666191,
-0.05674632266163826,
-0.12476897239685059,
0.11163229495286942,
-0.057398516684770584,
0.18130658566951752,
-0.017407190054655075,
0.06820771098136902,
0.1182430237531662,
0.04354972019791603,
-0.1396481841802597,
0.04667516425251961,
0.04729028418660164,
-0.1136208027601242,
0.01824261248111725,
0.1433202475309372,
-0.04624611884355545,
0.08588363975286484,
0.044084563851356506,
-0.0956353172659874,
-0.009871484711766243,
-0.0474247932434082,
-0.026441078633069992,
-0.07093550264835358,
-0.012633057311177254,
-0.06753399968147278,
0.1703486293554306,
0.19655741751194,
-0.02556782215833664,
0.013200763612985611,
-0.09387961030006409,
0.028910862281918526,
0.06859221309423447,
0.03636574000120163,
-0.050534412264823914,
-0.20811986923217773,
0.019408093765378,
0.04999992623925209,
-0.002881812397390604,
-0.23207040131092072,
-0.0784924328327179,
0.04083404690027237,
-0.03424215316772461,
-0.0557892881333828,
0.09604617953300476,
0.03556210547685623,
0.04796294867992401,
-0.03692609816789627,
-0.14837850630283356,
-0.03715681657195091,
0.15512977540493011,
-0.1789735108613968,
-0.04989899322390556
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-1024-finetuned-squad-seed-10
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
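A minimal configuration sketch of the hyperparameters above, assuming the Hugging Face `transformers` Trainer API; the `output_dir` value is a hypothetical placeholder, every other value is copied from the list:

```python
# Sketch only: mirrors the hyperparameter list above using TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./outputs",  # hypothetical placeholder, not from the card
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```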
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-1024-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-1024-finetuned-squad-seed-10
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-1024-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07886964827775955,
0.09789744764566422,
-0.0025399387814104557,
0.07672044634819031,
0.13214616477489471,
0.03407711908221245,
0.10750025510787964,
0.12184839695692062,
-0.12593485414981842,
0.06748748570680618,
0.09142174571752548,
0.0871812030673027,
0.03380478173494339,
0.13115394115447998,
-0.039088040590286255,
-0.23252056539058685,
0.00841349083930254,
-0.01677027903497219,
-0.0646953359246254,
0.1020253449678421,
0.0887673869729042,
-0.09780201315879822,
0.08084610104560852,
-0.002700232435017824,
-0.18440483510494232,
0.027452338486909866,
-0.020800625905394554,
-0.050111137330532074,
0.09892381727695465,
-0.0061830440536141396,
0.08251297473907471,
0.006467795465141535,
0.12049267441034317,
-0.19099564850330353,
0.01356901042163372,
0.07312244921922684,
0.032653260976076126,
0.09488007426261902,
0.02043868415057659,
-0.0028516645543277264,
0.16517946124076843,
-0.1348954439163208,
0.1002219170331955,
0.028959164395928383,
-0.08734767138957977,
-0.16900701820850372,
-0.09689563512802124,
0.01594087854027748,
0.03725761920213699,
0.08720483630895615,
0.008087340742349625,
0.1781577616930008,
-0.092220239341259,
0.08033434301614761,
0.2396310716867447,
-0.2786039412021637,
-0.07852883636951447,
0.0450984463095665,
0.05140208825469017,
0.07881741225719452,
-0.11870085448026657,
-0.01646529510617256,
0.02133561484515667,
0.02938833460211754,
0.09319189190864563,
-0.02537619322538376,
-0.08453480154275894,
-0.007292384747415781,
-0.12033711373806,
0.005550628527998924,
0.1124173030257225,
0.04506835713982582,
-0.050139814615249634,
-0.0506882444024086,
-0.05850130692124367,
-0.07426075637340546,
-0.035605285316705704,
-0.03551280498504639,
0.04370642453432083,
-0.05690903589129448,
-0.11931062489748001,
-0.0332174152135849,
-0.04745646193623543,
-0.07794897258281708,
-0.020334450528025627,
0.20397445559501648,
0.048813872039318085,
0.0324825793504715,
-0.05139657109975815,
0.08121109753847122,
0.008717119693756104,
-0.1292150318622589,
-0.02907995879650116,
0.002974693663418293,
-0.08129677176475525,
-0.03954916074872017,
-0.0559450201690197,
0.015568983741104603,
0.04247326776385307,
0.21387776732444763,
-0.051146626472473145,
0.08380855619907379,
0.0327899344265461,
-0.02030770480632782,
-0.020885419100522995,
0.12541563808918,
-0.020687159150838852,
-0.08069942146539688,
0.028082234784960747,
0.05818125233054161,
0.025163713842630386,
0.0026822153013199568,
-0.05689084529876709,
-0.032361917197704315,
0.08367589861154556,
0.03204019367694855,
-0.06154685840010643,
0.026276694610714912,
0.002702486701309681,
-0.019136451184749603,
0.007479575462639332,
-0.1145835816860199,
0.01795921102166176,
-0.00688618840649724,
-0.07483457028865814,
-0.014447338879108429,
0.008990234695374966,
-0.014919256791472435,
0.010650694370269775,
0.09900926053524017,
-0.08968013525009155,
-0.02163536101579666,
-0.07652842998504639,
-0.07153015583753586,
-0.0008876444189809263,
-0.1509554386138916,
0.013262788765132427,
-0.07251862436532974,
-0.1568448394536972,
-0.034526098519563675,
0.04211416468024254,
-0.07039057463407516,
-0.027888232842087746,
-0.03960621356964111,
-0.07743482291698456,
0.024898186326026917,
0.0038487608544528484,
0.18858453631401062,
-0.05239921063184738,
0.07898475229740143,
0.019165363162755966,
0.051667358726263046,
-0.02291492186486721,
0.031159428879618645,
-0.08784977346658707,
0.004211977124214172,
-0.1682419329881668,
0.06101342290639877,
-0.07606607675552368,
0.019097531214356422,
-0.12870414555072784,
-0.08647723495960236,
-0.0272930096834898,
-0.027974024415016174,
0.08331416547298431,
0.10095565766096115,
-0.1414700150489807,
-0.026450807228684425,
0.1149725466966629,
-0.07727713137865067,
-0.057899534702301025,
0.06526965647935867,
-0.06748280674219131,
0.05215158313512802,
0.05885833874344826,
0.18291467428207397,
0.07267803698778152,
-0.11672844737768173,
-0.02246638387441635,
0.0006177459727041423,
0.028131503611803055,
-0.011892544105648994,
0.05017242580652237,
0.0101384948939085,
0.024392500519752502,
0.01605243608355522,
-0.03660322725772858,
0.0025258956011384726,
-0.09715229272842407,
-0.06245160475373268,
-0.05785951390862465,
-0.08264589309692383,
-0.0368366464972496,
0.013360636308789253,
0.037540607154369354,
-0.08155526220798492,
-0.08400092273950577,
0.08315014839172363,
0.14131231606006622,
-0.04300525784492493,
0.020567970350384712,
-0.07291507720947266,
0.02002118155360222,
-0.053172409534454346,
-0.03111332096159458,
-0.20472225546836853,
-0.06458462029695511,
0.032868318259716034,
-0.027627237141132355,
0.0561288483440876,
0.005799661390483379,
0.07308568060398102,
0.037192750722169876,
-0.03662661463022232,
0.006102686282247305,
-0.08925562351942062,
-0.006001344416290522,
-0.09506011009216309,
-0.22184669971466064,
-0.03899925574660301,
-0.031095784157514572,
0.1276537925004959,
-0.16423749923706055,
-0.007934770546853542,
-0.034725021570920944,
0.12013132870197296,
0.029412241652607918,
-0.06328541785478592,
-0.015457781963050365,
0.029914431273937225,
0.002306628040969372,
-0.09310230612754822,
0.03200488165020943,
0.01605621911585331,
-0.07157707959413528,
-0.059199970215559006,
-0.11899419873952866,
0.003984816838055849,
0.07901924103498459,
0.06377612054347992,
-0.09751187264919281,
0.006960894446820021,
-0.06469567120075226,
-0.032409682869911194,
-0.05591869726777077,
0.039834342896938324,
0.17483210563659668,
0.006259332410991192,
0.10700281709432602,
-0.07946684956550598,
-0.07471539825201035,
0.021610554307699203,
0.006187676917761564,
0.04385941103100777,
0.09138091653585434,
0.11137256026268005,
-0.1263429820537567,
0.06520999222993851,
0.08377623558044434,
-0.06103892996907234,
0.12376735359430313,
-0.038532715290784836,
-0.0801403820514679,
-0.03490410000085831,
-0.01958651840686798,
-0.014654658734798431,
0.13670314848423004,
-0.05344193056225777,
0.023542633280158043,
0.02986280433833599,
0.03909417241811752,
0.0199020653963089,
-0.15289197862148285,
-0.002579356776550412,
0.008502375334501266,
-0.0421573631465435,
-0.018097981810569763,
0.02205611579120159,
0.01851600408554077,
0.09741426259279251,
0.03359677642583847,
-0.016729658469557762,
-0.008081335574388504,
-0.004309884738177061,
-0.05209474265575409,
0.19058793783187866,
-0.09033993631601334,
-0.039433252066373825,
-0.07597069442272186,
-0.0015882749576121569,
-0.03659006208181381,
-0.041937414556741714,
0.02715017832815647,
-0.0880882516503334,
-0.03897020220756531,
-0.07379695028066635,
-0.0024844645522534847,
-0.04713805019855499,
0.02583475038409233,
0.03122669644653797,
0.004446478560566902,
0.060151513665914536,
-0.13540637493133545,
0.004347480367869139,
-0.07385849952697754,
-0.10493774712085724,
0.01562744937837124,
0.06358405947685242,
0.0907042920589447,
0.060233570635318756,
-0.028655802831053734,
0.022216467186808586,
-0.03144388273358345,
0.253728985786438,
-0.05644179508090019,
0.0002390784356975928,
0.10777200013399124,
0.024075781926512718,
0.04543423280119896,
0.09390763938426971,
0.03484807536005974,
-0.1025114357471466,
0.029597973451018333,
0.08478076010942459,
-0.034401942044496536,
-0.23980803787708282,
-0.0055383360013365746,
-0.03506312519311905,
-0.11545806378126144,
0.08293118327856064,
0.05031954497098923,
-0.039541784673929214,
0.06495441496372223,
0.009056911803781986,
0.02436770871281624,
-0.04964751377701759,
0.09289155155420303,
0.09887824207544327,
0.07308845967054367,
0.10234130173921585,
-0.048227228224277496,
-0.020729679614305496,
0.06857620924711227,
-0.0058023324236273766,
0.2939237654209137,
-0.024633698165416718,
0.06867070496082306,
0.05318827927112579,
0.13742657005786896,
-0.022228442132472992,
0.03825140371918678,
0.0069753145799040794,
-0.005087577737867832,
-0.025697333738207817,
-0.05770571529865265,
-0.026878155767917633,
-0.0013775703264400363,
-0.07844909280538559,
0.05538693815469742,
-0.06068265438079834,
0.061828743666410446,
0.0202416330575943,
0.2597048878669739,
-0.0011539163533598185,
-0.2814692556858063,
-0.0779697597026825,
-0.021703019738197327,
-0.03972243517637253,
-0.044945668429136276,
0.01214817725121975,
0.09665703773498535,
-0.10395035892724991,
0.05280974879860878,
-0.05650889128446579,
0.08169639110565186,
-0.027161022648215294,
-0.004029210656881332,
0.02980278618633747,
0.1834140419960022,
-0.016308899968862534,
0.049186527729034424,
-0.2000165432691574,
0.21561941504478455,
0.018247440457344055,
0.13367317616939545,
-0.051016341894865036,
0.007867006585001945,
0.023852385580539703,
-0.0009406033786945045,
0.0764092281460762,
-0.005531142931431532,
-0.07505778968334198,
-0.12572450935840607,
-0.07441407442092896,
0.07952383905649185,
0.14035776257514954,
-0.014877868816256523,
0.10273203998804092,
-0.048931047320365906,
0.018299084156751633,
0.04088890925049782,
-0.06886216253042221,
-0.1570400595664978,
-0.09661497175693512,
-0.017021890729665756,
0.035899631679058075,
-0.09353287518024445,
-0.04631663113832474,
-0.07539840042591095,
-0.01387769915163517,
0.11431840807199478,
0.02585316076874733,
-0.01930190995335579,
-0.13698002696037292,
0.0870438888669014,
0.1503138244152069,
-0.07279729843139648,
0.025299083441495895,
-0.006935546174645424,
0.06500287353992462,
0.04303227365016937,
-0.0935853123664856,
0.04771239683032036,
-0.05738777294754982,
-0.16060896217823029,
-0.04657365009188652,
0.09200119972229004,
0.07304854691028595,
0.03999674692749977,
-0.0034832521341741085,
0.05103779584169388,
-0.02204452082514763,
-0.10012656450271606,
0.01203582901507616,
0.037490881979465485,
0.04990437254309654,
0.03594467416405678,
-0.08294980227947235,
0.0586630143225193,
-0.03300568833947182,
-0.004256852436810732,
0.11288870126008987,
0.2409517914056778,
-0.08943602442741394,
0.08553814888000488,
0.05647725611925125,
-0.06795433163642883,
-0.14206574857234955,
0.06399526447057724,
0.10370459407567978,
-0.000735256529878825,
0.05656209588050842,
-0.1931518018245697,
0.14179910719394684,
0.11466493457555771,
-0.012003546580672264,
0.03841893747448921,
-0.2733202278614044,
-0.11848971992731094,
0.060139670968055725,
0.1321936994791031,
0.11989010125398636,
-0.13375353813171387,
-0.013272294774651527,
-0.019082967191934586,
-0.12735728919506073,
0.10773511230945587,
-0.11450454592704773,
0.13387379050254822,
-0.03349518030881882,
0.10814746469259262,
0.005075610242784023,
-0.030374053865671158,
0.10691282153129578,
0.05157681927084923,
0.09799802303314209,
-0.04283071681857109,
0.010060494765639305,
0.061672404408454895,
-0.04798256233334541,
0.014020097441971302,
-0.07115878909826279,
0.08239042013883591,
-0.11997717618942261,
-0.007066350430250168,
-0.07817395031452179,
0.05006181448698044,
-0.03668747469782829,
-0.05230848863720894,
-0.05372312664985657,
0.036394089460372925,
0.05591845139861107,
-0.03677039220929146,
0.05403253436088562,
-0.00039223581552505493,
0.0926993191242218,
0.019536912441253662,
0.06896616518497467,
-0.0015524134505540133,
-0.04733264818787575,
0.02076529711484909,
-0.009419208392500877,
0.05957625433802605,
-0.13934776186943054,
0.005051913671195507,
0.10608406364917755,
0.052299413830041885,
0.09721596539020538,
0.04358994960784912,
-0.04741239547729492,
0.011341420002281666,
0.03779216483235359,
-0.11237434297800064,
-0.10201580822467804,
0.04932523891329765,
-0.041395511478185654,
-0.13865099847316742,
0.0476180836558342,
0.11253948509693146,
-0.04854816198348999,
-0.023213718086481094,
-0.018888888880610466,
0.006032301113009453,
-0.02173912152647972,
0.18585282564163208,
0.0431993193924427,
0.040677208453416824,
-0.10289355367422104,
0.12925802171230316,
0.029879113659262657,
-0.021399687975645065,
0.058325741440057755,
0.08619214594364166,
-0.09555719792842865,
0.002697213087230921,
0.09501016139984131,
0.17641520500183105,
-0.06954853981733322,
-0.0148512227460742,
-0.10406151413917542,
-0.07131548225879669,
0.062052372843027115,
0.16059863567352295,
0.05660346895456314,
-0.01810990460216999,
-0.04975240305066109,
0.042063143104314804,
-0.14139071106910706,
0.06097641587257385,
0.0326564759016037,
0.07018399983644485,
-0.08700414001941681,
0.05947858840227127,
0.008000952191650867,
0.004074648953974247,
-0.016875404864549637,
0.015114960260689259,
-0.0928681418299675,
-0.029568789526820183,
-0.0759502425789833,
0.009068597108125687,
-0.013958721421658993,
0.01590154506266117,
-0.010473543778061867,
-0.06825003772974014,
-0.06968189030885696,
0.03581995889544487,
-0.07727422565221786,
-0.052918609231710434,
0.012086480855941772,
0.04157959669828415,
-0.13339687883853912,
0.00626583443954587,
0.015236549079418182,
-0.08898791670799255,
0.0851019099354744,
0.08901694416999817,
0.02693389169871807,
0.034661076962947845,
-0.1313468962907791,
-0.03244657814502716,
0.014205421321094036,
0.002208466175943613,
0.06571749597787857,
-0.0953989252448082,
-0.004114123526960611,
-0.02089976705610752,
0.07716094702482224,
0.010358983650803566,
0.07892803847789764,
-0.13024665415287018,
0.00883533526211977,
-0.08437248319387436,
-0.0433768555521965,
-0.06611461937427521,
0.015712114050984383,
0.10162725299596786,
0.05224024876952171,
0.1633007973432541,
-0.07591398060321808,
0.018837006762623787,
-0.20931221544742584,
-0.0279606431722641,
-0.005967502947896719,
-0.053888458758592606,
-0.13498930633068085,
-0.040746282786130905,
0.07758712023496628,
-0.03855239227414131,
0.1019834578037262,
-0.021405883133411407,
0.06164567917585373,
0.03887895867228508,
-0.028006136417388916,
-0.06358363479375839,
-0.02847663126885891,
0.19821695983409882,
0.07781165093183517,
-0.016102932393550873,
0.10689716786146164,
-0.004622564185410738,
0.05220073461532593,
0.0277244932949543,
0.2034175992012024,
0.20599228143692017,
0.004923942498862743,
0.06998598575592041,
0.06143491715192795,
-0.08142592012882233,
-0.06603424996137619,
0.18038028478622437,
-0.02778504602611065,
0.0709189623594284,
-0.02924761176109314,
0.1903221160173416,
0.11244592070579529,
-0.15094660222530365,
0.03215596824884415,
-0.0314013846218586,
-0.07721572369337082,
-0.1400851607322693,
0.0007008836837485433,
-0.09676893800497055,
-0.11804590374231339,
0.04503757879137993,
-0.11983876675367355,
0.05603146553039551,
0.08145666122436523,
0.01301872543990612,
0.03338243439793587,
0.12723490595817566,
-0.02651466801762581,
0.005040737334638834,
0.0401083379983902,
0.007301053963601589,
-0.029029155150055885,
-0.0422995388507843,
-0.07604221999645233,
0.050319869071245193,
0.005616583861410618,
0.08801912516355515,
-0.04547445848584175,
-0.010540798306465149,
0.0415164977312088,
-0.028343580663204193,
-0.0763329342007637,
0.025465311482548714,
0.03638092428445816,
0.05598358064889908,
0.04613450914621353,
0.045072462409734726,
-0.006501555442810059,
-0.03338463231921196,
0.2790498733520508,
-0.05853315815329552,
-0.09822198748588562,
-0.114003986120224,
0.20342442393302917,
0.041487183421850204,
-0.029783114790916443,
0.0381850004196167,
-0.08337179571390152,
-0.010497042909264565,
0.15722213685512543,
0.1562253087759018,
-0.061097871512174606,
-0.02430197410285473,
-0.011403960175812244,
-0.01708560809493065,
-0.04013179615139961,
0.11445114016532898,
0.09565869718790054,
0.0006540176691487432,
-0.052743520587682724,
-0.026324881240725517,
-0.034884121268987656,
-0.01510623749345541,
-0.04213029891252518,
0.024105533957481384,
0.015569915063679218,
-0.022954409942030907,
-0.03300248831510544,
0.06380512565374374,
0.00003708790973178111,
-0.24295677244663239,
0.06054706871509552,
-0.14383935928344727,
-0.16848058998584747,
-0.025475822389125824,
0.04892319440841675,
-0.008748812600970268,
0.05020482465624809,
-0.0224311463534832,
-0.004656652454286814,
0.08098464459180832,
-0.01975669525563717,
-0.05688492953777313,
-0.12508000433444977,
0.11208127439022064,
-0.0598507821559906,
0.180231973528862,
-0.017360106110572815,
0.06909316778182983,
0.11800980567932129,
0.04243599995970726,
-0.13887178897857666,
0.04739275574684143,
0.04692875221371651,
-0.11292597651481628,
0.019341522827744484,
0.14283140003681183,
-0.04573266953229904,
0.08249185979366302,
0.04359188303351402,
-0.09593627601861954,
-0.009197727777063847,
-0.04655349999666214,
-0.026779040694236755,
-0.07129506021738052,
-0.011354190297424793,
-0.06747352331876755,
0.1706400215625763,
0.1965746134519577,
-0.025606833398342133,
0.013848605565726757,
-0.0942525565624237,
0.02800106257200241,
0.06852104514837265,
0.036651939153671265,
-0.051047779619693756,
-0.20847512781620026,
0.019027236849069595,
0.048822756856679916,
-0.0027891728095710278,
-0.2300223410129547,
-0.07771068066358566,
0.039256751537323,
-0.035287629812955856,
-0.05599404498934746,
0.0950988158583641,
0.03670458868145943,
0.048079997301101685,
-0.036737363785505295,
-0.14914315938949585,
-0.03734571114182472,
0.155757874250412,
-0.17948000133037567,
-0.04940410330891609
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-1024-finetuned-squad-seed-2
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
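A minimal usage sketch, assuming the standard `transformers` pipeline API; the model id is taken from this card's metadata, and the question/context pair is an illustrative example, not from the card:

```python
# Sketch only: extractive question answering with this card's checkpoint.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-2",
)
result = qa(
    question="What dataset was the model fine-tuned on?",  # illustrative input
    context="This model is a fine-tuned version of roberta-base on the squad dataset.",
)
print(result["answer"])
```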
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-1024-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-1024-finetuned-squad-seed-2
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-1024-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07802659273147583,
0.09948325157165527,
-0.0026417432818561792,
0.07692567259073257,
0.13131004571914673,
0.03439343720674515,
0.10732249170541763,
0.12274336814880371,
-0.1267111450433731,
0.06776710599660873,
0.0918915644288063,
0.08612226694822311,
0.03334876149892807,
0.13058683276176453,
-0.03928334265947342,
-0.23267784714698792,
0.008252945728600025,
-0.018124006688594818,
-0.06660693883895874,
0.10207771509885788,
0.08936261385679245,
-0.09672965109348297,
0.08088182657957077,
-0.002468029735609889,
-0.18493452668190002,
0.02752065844833851,
-0.019915038719773293,
-0.04980606213212013,
0.09868568927049637,
-0.005334279499948025,
0.08270541578531265,
0.007487569935619831,
0.12023352086544037,
-0.1919022500514984,
0.013257491402328014,
0.07228871434926987,
0.03290898725390434,
0.09517323970794678,
0.021008599549531937,
-0.0027918003033846617,
0.1652510166168213,
-0.13389936089515686,
0.09957735985517502,
0.029086364433169365,
-0.08709680289030075,
-0.1694875806570053,
-0.09713035076856613,
0.0167557280510664,
0.037528205662965775,
0.08634517341852188,
0.008858587592840195,
0.1772051453590393,
-0.09194815158843994,
0.08001895248889923,
0.23877348005771637,
-0.2792491316795349,
-0.07884812355041504,
0.043738625943660736,
0.05030488967895508,
0.07938344031572342,
-0.11780824512243271,
-0.016017910093069077,
0.021166695281863213,
0.029458820819854736,
0.09214577823877335,
-0.025489144027233124,
-0.08462963253259659,
-0.007445469498634338,
-0.12016253173351288,
0.003779831575229764,
0.1127428486943245,
0.04462498053908348,
-0.05116097629070282,
-0.04923730343580246,
-0.058932531625032425,
-0.07209229469299316,
-0.034403786063194275,
-0.03610405698418617,
0.04370134323835373,
-0.05603434890508652,
-0.11916299164295197,
-0.035137202590703964,
-0.04879825562238693,
-0.07852292060852051,
-0.02062852308154106,
0.20532359182834625,
0.04890033230185509,
0.033806268125772476,
-0.051189374178647995,
0.08249678462743759,
0.010766404680907726,
-0.12884247303009033,
-0.029665522277355194,
0.0032149325124919415,
-0.08083701133728027,
-0.03907309100031853,
-0.05551016703248024,
0.016356421634554863,
0.04284336417913437,
0.21330995857715607,
-0.053688421845436096,
0.08324531465768814,
0.03174147009849548,
-0.019519958645105362,
-0.020669864490628242,
0.12504532933235168,
-0.01909446157515049,
-0.07812490314245224,
0.028210395947098732,
0.05773553624749184,
0.02560247853398323,
0.0025931682903319597,
-0.05674811080098152,
-0.03284255415201187,
0.08457718789577484,
0.03309885412454605,
-0.06096986308693886,
0.024288592860102654,
0.0014011182356625795,
-0.019363364204764366,
0.005997275002300739,
-0.11459381878376007,
0.01794389635324478,
-0.0063977534882724285,
-0.07353389263153076,
-0.015591961331665516,
0.009738236665725708,
-0.01481624972075224,
0.011170115321874619,
0.09808921813964844,
-0.08839602023363113,
-0.02093661017715931,
-0.07509704679250717,
-0.06961999088525772,
-0.000883219763636589,
-0.14818260073661804,
0.0127076031640172,
-0.07328303158283234,
-0.15480414032936096,
-0.03311898931860924,
0.042838942259550095,
-0.07176532596349716,
-0.03024153783917427,
-0.03848958760499954,
-0.07641933858394623,
0.02602849155664444,
0.0030591501854360104,
0.18825428187847137,
-0.05244583263993263,
0.07996851205825806,
0.019333837553858757,
0.05134012550115585,
-0.022506214678287506,
0.030262239277362823,
-0.08713372051715851,
0.005121165886521339,
-0.16848385334014893,
0.06152864173054695,
-0.07520923018455505,
0.01599411480128765,
-0.1298762708902359,
-0.08588326722383499,
-0.02618350274860859,
-0.028950631618499756,
0.08253388106822968,
0.10114941745996475,
-0.1396133303642273,
-0.02631058171391487,
0.11417809873819351,
-0.07891474664211273,
-0.057656459510326385,
0.06641745567321777,
-0.06730198860168457,
0.05370006710290909,
0.05710497498512268,
0.18280954658985138,
0.07300063222646713,
-0.1170889362692833,
-0.020671557635068893,
0.0021207842510193586,
0.02783086523413658,
-0.010271639563143253,
0.051687564700841904,
0.009346920065581799,
0.022014783695340157,
0.015791917219758034,
-0.03872443363070488,
0.002432922599837184,
-0.09780319035053253,
-0.06320001184940338,
-0.05781397596001625,
-0.08212366700172424,
-0.03573136404156685,
0.011935168877243996,
0.03785369545221329,
-0.08081723004579544,
-0.08339996635913849,
0.08231621235609055,
0.1418331265449524,
-0.04249882325530052,
0.021749716252088547,
-0.0732247531414032,
0.019994203001260757,
-0.053840428590774536,
-0.03156571835279465,
-0.2045152336359024,
-0.06525731086730957,
0.03347114846110344,
-0.027568470686674118,
0.05612872540950775,
0.00584162212908268,
0.0723409652709961,
0.037432942539453506,
-0.03693942725658417,
0.00582378963008523,
-0.08987151831388474,
-0.006563416216522455,
-0.09450002759695053,
-0.22258634865283966,
-0.03866906091570854,
-0.031136464327573776,
0.12841954827308655,
-0.1637348085641861,
-0.009001918137073517,
-0.03367457538843155,
0.12001819908618927,
0.02972600795328617,
-0.06302861869335175,
-0.01614399440586567,
0.0290105901658535,
0.001725932233966887,
-0.09235428273677826,
0.03194582462310791,
0.017334409058094025,
-0.07222379744052887,
-0.057019930332899094,
-0.11700339615345001,
0.00520941661670804,
0.07765225321054459,
0.06355299055576324,
-0.09770455211400986,
0.006107720546424389,
-0.06503038853406906,
-0.03317127749323845,
-0.05719396099448204,
0.04015791788697243,
0.17536954581737518,
0.0060127307660877705,
0.10723738372325897,
-0.07960927486419678,
-0.07412590086460114,
0.021486077457666397,
0.004318669438362122,
0.04274318739771843,
0.09123505651950836,
0.11114130169153214,
-0.1280432790517807,
0.06457220017910004,
0.08337648957967758,
-0.06102267652750015,
0.12342450022697449,
-0.038531381636857986,
-0.07986900210380554,
-0.03624073043465614,
-0.017334487289190292,
-0.014191524125635624,
0.13631942868232727,
-0.052332159131765366,
0.025403402745723724,
0.030017027631402016,
0.039813145995140076,
0.0197443887591362,
-0.15351322293281555,
-0.002754736226052046,
0.008890864439308643,
-0.04373795911669731,
-0.016170185059309006,
0.021509401500225067,
0.0194665789604187,
0.09785261005163193,
0.034266188740730286,
-0.01640070602297783,
-0.0068381838500499725,
-0.004218742251396179,
-0.053228959441185,
0.18994544446468353,
-0.0901264026761055,
-0.04138310253620148,
-0.07766850292682648,
-0.0009253804455511272,
-0.03658343851566315,
-0.04173652082681656,
0.02773997187614441,
-0.08706299215555191,
-0.03835226222872734,
-0.07399817556142807,
-0.0014599512796849012,
-0.04804607480764389,
0.026451749727129936,
0.03156571835279465,
0.004653341602534056,
0.062270089983940125,
-0.1347445845603943,
0.004428672604262829,
-0.07337348908185959,
-0.10591171681880951,
0.016604505479335785,
0.06389304995536804,
0.09026359021663666,
0.05918121337890625,
-0.027994785457849503,
0.022029154002666473,
-0.03100329264998436,
0.2540542185306549,
-0.0557192862033844,
-0.0006216666079126298,
0.10807774215936661,
0.022746341302990913,
0.046019501984119415,
0.09365236014127731,
0.03419056534767151,
-0.10236196964979172,
0.029806792736053467,
0.08441659808158875,
-0.034795165061950684,
-0.23972494900226593,
-0.005764233414083719,
-0.034380264580249786,
-0.1145855188369751,
0.08287583291530609,
0.05054950714111328,
-0.042758308351039886,
0.06391455978155136,
0.00944790430366993,
0.0233150701969862,
-0.04942190274596214,
0.09291898459196091,
0.09976644068956375,
0.07318339496850967,
0.1019495502114296,
-0.04755783453583717,
-0.0206751711666584,
0.07005389034748077,
-0.004849878139793873,
0.29306063055992126,
-0.02512115240097046,
0.06784813851118088,
0.05346149951219559,
0.13875392079353333,
-0.02239062264561653,
0.036461759358644485,
0.007283901795744896,
-0.004277628380805254,
-0.02596931718289852,
-0.05768788605928421,
-0.027914339676499367,
-0.00038191903149709105,
-0.07737254351377487,
0.055231641978025436,
-0.06155877932906151,
0.06424174457788467,
0.019866332411766052,
0.26151013374328613,
-0.0009659511270001531,
-0.2795619070529938,
-0.07788196951150894,
-0.021099425852298737,
-0.039600178599357605,
-0.04474887251853943,
0.01214990857988596,
0.09876825660467148,
-0.1048407107591629,
0.05156651884317398,
-0.056255411356687546,
0.08093688637018204,
-0.02784736268222332,
-0.004920946899801493,
0.029083307832479477,
0.1815953254699707,
-0.01529714372009039,
0.049699146300554276,
-0.1980976015329361,
0.21595150232315063,
0.01808422990143299,
0.13223662972450256,
-0.04957273602485657,
0.00837748497724533,
0.023099170997738838,
-0.0008508849423378706,
0.07695402950048447,
-0.004628437105566263,
-0.0758441910147667,
-0.12591564655303955,
-0.07572971284389496,
0.07933390140533447,
0.14052142202854156,
-0.01623721793293953,
0.10235770791769028,
-0.049575913697481155,
0.019032299518585205,
0.0403265617787838,
-0.06764915585517883,
-0.15645790100097656,
-0.09604191780090332,
-0.016630547121167183,
0.03473818302154541,
-0.09396521002054214,
-0.04744835942983627,
-0.07546140998601913,
-0.009907849133014679,
0.11636415868997574,
0.024778243154287338,
-0.019490445032715797,
-0.13673391938209534,
0.0870724692940712,
0.14976507425308228,
-0.0736614316701889,
0.024126240983605385,
-0.007161407265812159,
0.06544609367847443,
0.042968787252902985,
-0.09379096329212189,
0.04762304201722145,
-0.05721199885010719,
-0.16195710003376007,
-0.04669087380170822,
0.09255553781986237,
0.07275404781103134,
0.04028678312897682,
-0.0034109936095774174,
0.0504562184214592,
-0.02083183079957962,
-0.09985926747322083,
0.013285416178405285,
0.03776205703616142,
0.04982202127575874,
0.035772524774074554,
-0.08258745819330215,
0.06118517369031906,
-0.032736197113990784,
-0.005627034232020378,
0.1143670529127121,
0.24422261118888855,
-0.08985751867294312,
0.08741088211536407,
0.05641040951013565,
-0.06857693195343018,
-0.1423136293888092,
0.06191030517220497,
0.10542717576026917,
-0.0014638676075264812,
0.05784499645233154,
-0.19381164014339447,
0.14096102118492126,
0.11444994062185287,
-0.013196409679949284,
0.037414681166410446,
-0.2742822468280792,
-0.11854829639196396,
0.05906837061047554,
0.13193871080875397,
0.12156718969345093,
-0.13316768407821655,
-0.014164013788104057,
-0.017588168382644653,
-0.12727373838424683,
0.10681313276290894,
-0.11270380765199661,
0.13390350341796875,
-0.03348156809806824,
0.10883903503417969,
0.005278219003230333,
-0.029670339077711105,
0.10873015224933624,
0.04993932321667671,
0.0965319499373436,
-0.04227552190423012,
0.010681670159101486,
0.05964149162173271,
-0.048812925815582275,
0.01333571132272482,
-0.07086890935897827,
0.08311212807893753,
-0.12101580947637558,
-0.007397642824798822,
-0.07807581126689911,
0.04984113201498985,
-0.03748997300863266,
-0.05225122347474098,
-0.0536331981420517,
0.03615355119109154,
0.056152407079935074,
-0.0366799458861351,
0.05359872430562973,
0.00012496237468440086,
0.09141844511032104,
0.024093717336654663,
0.0678219348192215,
0.00021377357188612223,
-0.04832082986831665,
0.019313860684633255,
-0.009573782794177532,
0.05992387607693672,
-0.13851338624954224,
0.00635835574939847,
0.10532739758491516,
0.051667194813489914,
0.09750866889953613,
0.043254826217889786,
-0.04801870882511139,
0.012377559207379818,
0.03766154870390892,
-0.1135781779885292,
-0.10258205235004425,
0.048629842698574066,
-0.04045303910970688,
-0.13901293277740479,
0.04552011936903,
0.11507067829370499,
-0.047970712184906006,
-0.02316940575838089,
-0.018722044304013252,
0.007023909594863653,
-0.02208271063864231,
0.18469828367233276,
0.041988275945186615,
0.04127528518438339,
-0.10174119472503662,
0.12935154139995575,
0.02990218810737133,
-0.02078438550233841,
0.058355558663606644,
0.08544615656137466,
-0.09459759294986725,
0.003155748127028346,
0.09474769979715347,
0.175324946641922,
-0.07132080942392349,
-0.014544608071446419,
-0.10386443138122559,
-0.07146516442298889,
0.0614682137966156,
0.15905028581619263,
0.05697823315858841,
-0.01723705232143402,
-0.0496511310338974,
0.041724976152181625,
-0.14040927588939667,
0.06161278486251831,
0.03344061225652695,
0.07015740126371384,
-0.0879540741443634,
0.059142742305994034,
0.007603875827044249,
0.0055827125906944275,
-0.01713026873767376,
0.01404655072838068,
-0.09278750419616699,
-0.02947768196463585,
-0.07752519100904465,
0.011235145851969719,
-0.012688078917562962,
0.015871888026595116,
-0.009723012335598469,
-0.06882961839437485,
-0.06962567567825317,
0.036661356687545776,
-0.07720398902893066,
-0.052884653210639954,
0.011407801881432533,
0.04216879606246948,
-0.13352279365062714,
0.0055754538625478745,
0.016389429569244385,
-0.08991430699825287,
0.08633525669574738,
0.08984939008951187,
0.026951342821121216,
0.03407788649201393,
-0.12946216762065887,
-0.0326525941491127,
0.01458264235407114,
0.002129636937752366,
0.06515828520059586,
-0.0971536785364151,
-0.004774930886924267,
-0.020686285570263863,
0.07643413543701172,
0.010532882064580917,
0.08207748085260391,
-0.13104525208473206,
0.00898839347064495,
-0.08473673462867737,
-0.04507572948932648,
-0.06581816077232361,
0.015107019804418087,
0.10106749832630157,
0.053749989718198776,
0.1632676124572754,
-0.07714943587779999,
0.019124427810311317,
-0.20904316008090973,
-0.02771104872226715,
-0.005869786720722914,
-0.05181426927447319,
-0.13553577661514282,
-0.0405513197183609,
0.07651003450155258,
-0.038452982902526855,
0.10039272159337997,
-0.021959902718663216,
0.06039540842175484,
0.03914911672472954,
-0.030839674174785614,
-0.06203388795256615,
-0.02869330905377865,
0.19664721190929413,
0.07789746671915054,
-0.01573123037815094,
0.10733415931463242,
-0.006131739355623722,
0.05287311226129532,
0.0277967881411314,
0.20476557314395905,
0.20609118044376373,
0.005494064651429653,
0.07009027898311615,
0.06131841242313385,
-0.0809139832854271,
-0.06808653473854065,
0.17884854972362518,
-0.027407819405198097,
0.07104161381721497,
-0.02826474979519844,
0.18839699029922485,
0.11349672079086304,
-0.1516745388507843,
0.030876867473125458,
-0.031416598707437515,
-0.07734610140323639,
-0.14135180413722992,
0.0016078782500699162,
-0.09753336012363434,
-0.11833085864782333,
0.04489108920097351,
-0.11979575455188751,
0.05644409731030464,
0.08050879091024399,
0.012350868433713913,
0.034201718866825104,
0.12488394230604172,
-0.025700310245156288,
0.0048421709798276424,
0.03945476934313774,
0.007579747121781111,
-0.028279433026909828,
-0.04069286957383156,
-0.07670695334672928,
0.05022671818733215,
0.0074706049636006355,
0.0877755805850029,
-0.044346291571855545,
-0.009374664165079594,
0.04103432968258858,
-0.02918780967593193,
-0.07665438950061798,
0.02505110576748848,
0.036299072206020355,
0.056202806532382965,
0.04509381949901581,
0.04535121098160744,
-0.006181024946272373,
-0.03296465426683426,
0.28046509623527527,
-0.05830603837966919,
-0.09750723093748093,
-0.11423555016517639,
0.20563556253910065,
0.039903923869132996,
-0.028928032144904137,
0.03926113247871399,
-0.08314531296491623,
-0.01216390822082758,
0.15546146035194397,
0.15552350878715515,
-0.06277225911617279,
-0.024170557036995888,
-0.011684589087963104,
-0.016839945688843727,
-0.039229728281497955,
0.11539053171873093,
0.09488628059625626,
0.0023819045163691044,
-0.05347974970936775,
-0.02690327726304531,
-0.03616868332028389,
-0.014462059363722801,
-0.04236986115574837,
0.025055205449461937,
0.014362344518303871,
-0.021614711731672287,
-0.03441880643367767,
0.06289634108543396,
0.00012635828170459718,
-0.24230161309242249,
0.061140988022089005,
-0.1431395709514618,
-0.16896256804466248,
-0.0244733989238739,
0.049595821648836136,
-0.010618691332638264,
0.050237253308296204,
-0.02337850071489811,
-0.005020121578127146,
0.0813775509595871,
-0.019949812442064285,
-0.057254817336797714,
-0.12404686212539673,
0.11284615099430084,
-0.05915207043290138,
0.18139460682868958,
-0.01649872399866581,
0.06976883113384247,
0.11745277047157288,
0.042971283197402954,
-0.13984254002571106,
0.04624452069401741,
0.04703705385327339,
-0.11316227167844772,
0.018603358417749405,
0.14376148581504822,
-0.04603554680943489,
0.08441166579723358,
0.0450497530400753,
-0.09551502019166946,
-0.010943354107439518,
-0.04637448862195015,
-0.026741761714220047,
-0.07069344818592072,
-0.013242701068520546,
-0.06872305274009705,
0.16966284811496735,
0.19586992263793945,
-0.025861164554953575,
0.01425524614751339,
-0.09374244511127472,
0.028812702745199203,
0.06852016597986221,
0.03822701424360275,
-0.04989440366625786,
-0.2085990309715271,
0.018734874203801155,
0.049791570752859116,
-0.0023839829955250025,
-0.23120947182178497,
-0.07950048893690109,
0.03987240046262741,
-0.03420990705490112,
-0.055800653994083405,
0.0965661108493805,
0.03517165780067444,
0.04750911146402359,
-0.03641407936811447,
-0.1502823531627655,
-0.03791264444589615,
0.15511369705200195,
-0.1789587140083313,
-0.049595046788454056
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-1024-finetuned-squad-seed-4
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
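A minimal loading sketch, assuming the `transformers` auto classes; the model id is taken from this card's metadata:

```python
# Sketch only: load the fine-tuned checkpoint and its tokenizer.
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)
```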
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-1024-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-1024-finetuned-squad-seed-4
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-1024-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07837171107530594,
0.09832243621349335,
-0.0026014791801571846,
0.0778467059135437,
0.1325543224811554,
0.034790925681591034,
0.10770206153392792,
0.12202980369329453,
-0.12676450610160828,
0.0673133134841919,
0.09197584539651871,
0.08613269031047821,
0.03260539844632149,
0.13021376729011536,
-0.03917759284377098,
-0.23263980448246002,
0.007809036877006292,
-0.017510689795017242,
-0.06609777361154556,
0.10211953520774841,
0.08847824484109879,
-0.0973258838057518,
0.08140780031681061,
-0.0027519825380295515,
-0.18590159714221954,
0.02790413796901703,
-0.020524607971310616,
-0.049562208354473114,
0.0989077165722847,
-0.005243822932243347,
0.08310067653656006,
0.0067190020345151424,
0.12008300423622131,
-0.19043362140655518,
0.013535791076719761,
0.07254602015018463,
0.032485660165548325,
0.09476659446954727,
0.020500196143984795,
-0.003182904329150915,
0.1640348881483078,
-0.134297177195549,
0.09948055446147919,
0.028991306200623512,
-0.08735398948192596,
-0.17016993463039398,
-0.09660717844963074,
0.015377836301922798,
0.03654830902814865,
0.08729296177625656,
0.008525155484676361,
0.17666883766651154,
-0.09212631732225418,
0.08039910346269608,
0.2372220903635025,
-0.2799191176891327,
-0.07921633869409561,
0.04425247013568878,
0.05034049227833748,
0.07966476678848267,
-0.11831767112016678,
-0.015234563499689102,
0.021401679143309593,
0.03030155971646309,
0.09242303669452667,
-0.025833332911133766,
-0.08474359661340714,
-0.007001756690442562,
-0.11999210715293884,
0.0053115966729819775,
0.11336710304021835,
0.044810228049755096,
-0.050797391682863235,
-0.049402397125959396,
-0.058226633816957474,
-0.07336375117301941,
-0.03482953459024429,
-0.035133350640535355,
0.04382065311074257,
-0.056918032467365265,
-0.11955226957798004,
-0.03355354443192482,
-0.048367250710725784,
-0.07748103141784668,
-0.020806003361940384,
0.2044505774974823,
0.04869409278035164,
0.03353533893823624,
-0.050971053540706635,
0.0817435085773468,
0.010350840166211128,
-0.12864963710308075,
-0.028627993538975716,
0.002697241259738803,
-0.08039553463459015,
-0.038999296724796295,
-0.056168850511312485,
0.01754661090672016,
0.04305047541856766,
0.21349915862083435,
-0.05376528203487396,
0.08387257158756256,
0.03234371542930603,
-0.020126355811953545,
-0.02066984586417675,
0.12407595664262772,
-0.0205275546759367,
-0.07963395118713379,
0.028292737901210785,
0.05772020295262337,
0.024994194507598877,
0.002393250120803714,
-0.057015299797058105,
-0.032016903162002563,
0.08328940719366074,
0.03259578347206116,
-0.062035221606492996,
0.02625383995473385,
0.0023467272985726595,
-0.01895335502922535,
0.00585538474842906,
-0.11437821388244629,
0.017649473622441292,
-0.006658175960183144,
-0.0736813023686409,
-0.014682683162391186,
0.00928327813744545,
-0.01563011109828949,
0.010881082154810429,
0.09861422330141068,
-0.08929304033517838,
-0.021520964801311493,
-0.07569995522499084,
-0.07065387070178986,
-0.0014657736755907536,
-0.14811864495277405,
0.013698805123567581,
-0.0730532631278038,
-0.1551254689693451,
-0.03430502489209175,
0.04266606271266937,
-0.07156632095575333,
-0.028889916837215424,
-0.0384991392493248,
-0.07725195586681366,
0.025684403255581856,
0.0032325484789907932,
0.18887965381145477,
-0.05282016098499298,
0.07892709970474243,
0.02028687112033367,
0.051317889243364334,
-0.023141218349337578,
0.03040640987455845,
-0.0869247168302536,
0.004440594464540482,
-0.16905897855758667,
0.06078696623444557,
-0.07589387148618698,
0.017839644104242325,
-0.12871819734573364,
-0.08695974946022034,
-0.025295253843069077,
-0.02779713273048401,
0.08219832926988602,
0.1005130261182785,
-0.13942000269889832,
-0.02634311653673649,
0.1134568601846695,
-0.07763715833425522,
-0.057766299694776535,
0.06631287932395935,
-0.06761195510625839,
0.052690885961055756,
0.057505346834659576,
0.18308378756046295,
0.07296900451183319,
-0.11663031578063965,
-0.021480081602931023,
0.0013038866454735398,
0.0281026903539896,
-0.01087472215294838,
0.050126541405916214,
0.010315782390534878,
0.02209552749991417,
0.016313867643475533,
-0.03749043866991997,
0.0024016452953219414,
-0.09738967567682266,
-0.06271200627088547,
-0.0572366863489151,
-0.0820135846734047,
-0.036428917199373245,
0.012395113706588745,
0.037709563970565796,
-0.08173765987157822,
-0.08377842605113983,
0.08167169988155365,
0.14149194955825806,
-0.04295312613248825,
0.020630590617656708,
-0.07348060607910156,
0.02054290845990181,
-0.053784508258104324,
-0.031074220314621925,
-0.20425201952457428,
-0.06612329930067062,
0.032443080097436905,
-0.025468526408076286,
0.05632752552628517,
0.006101751234382391,
0.07290943711996078,
0.037645719945430756,
-0.03720909357070923,
0.005632034037262201,
-0.08861417323350906,
-0.006066279020160437,
-0.09429438412189484,
-0.22302861511707306,
-0.03886876627802849,
-0.030939986929297447,
0.12846402823925018,
-0.16426262259483337,
-0.008670528419315815,
-0.03357967361807823,
0.11949405074119568,
0.02935832180082798,
-0.06244960054755211,
-0.015490229241549969,
0.030487047508358955,
0.002516313688829541,
-0.09234198182821274,
0.03250188007950783,
0.017028585076332092,
-0.07107102125883102,
-0.05834011361002922,
-0.1174655556678772,
0.006526813842356205,
0.07875271886587143,
0.062330834567546844,
-0.09818483889102936,
0.0064653209410607815,
-0.06453391164541245,
-0.0329948328435421,
-0.055449776351451874,
0.04012150689959526,
0.17641788721084595,
0.005455199629068375,
0.10714582353830338,
-0.07891909033060074,
-0.07370468229055405,
0.021631214767694473,
0.005414155311882496,
0.043928537517786026,
0.09120793640613556,
0.11044225841760635,
-0.12644392251968384,
0.06487684696912766,
0.08281741291284561,
-0.061464156955480576,
0.12410265952348709,
-0.03864814341068268,
-0.0792556181550026,
-0.03570995479822159,
-0.018505943939089775,
-0.014572449028491974,
0.1370178759098053,
-0.052791424095630646,
0.024365531280636787,
0.029661521315574646,
0.03938915580511093,
0.020042547956109047,
-0.15258218348026276,
-0.002726297825574875,
0.008166939951479435,
-0.042709607630968094,
-0.01690664142370224,
0.02214112877845764,
0.018947159871459007,
0.09751536697149277,
0.034151818603277206,
-0.01746555231511593,
-0.0068320645950734615,
-0.004108417313545942,
-0.052317071706056595,
0.19028203189373016,
-0.09060418605804443,
-0.04021541774272919,
-0.0773942843079567,
-0.0027910657227039337,
-0.03765596076846123,
-0.04217282310128212,
0.027401618659496307,
-0.08828327059745789,
-0.03870569169521332,
-0.07367262989282608,
-0.0021021224092692137,
-0.047966498881578445,
0.026284897699952126,
0.03093637153506279,
0.004116298630833626,
0.06176205724477768,
-0.13457784056663513,
0.004554091952741146,
-0.07365605235099792,
-0.10591398924589157,
0.01706642657518387,
0.064534492790699,
0.09096077084541321,
0.05883416533470154,
-0.02801605500280857,
0.02208837680518627,
-0.030873622745275497,
0.25489145517349243,
-0.05587515980005264,
-0.00040364006417803466,
0.10785487294197083,
0.023511061444878578,
0.045053403824567795,
0.09372693300247192,
0.035172365605831146,
-0.10276442021131516,
0.029487237334251404,
0.08492723852396011,
-0.03440818563103676,
-0.2396230548620224,
-0.005349263548851013,
-0.03485141694545746,
-0.11474793404340744,
0.08242106437683105,
0.05045672133564949,
-0.042231567203998566,
0.0645127147436142,
0.0103509072214365,
0.024066559970378876,
-0.050202127546072006,
0.09259364753961563,
0.10142830014228821,
0.07271867245435715,
0.10267051309347153,
-0.04789765924215317,
-0.020969165489077568,
0.06938160955905914,
-0.00472659058868885,
0.29375624656677246,
-0.02537846565246582,
0.06756291538476944,
0.0539393424987793,
0.13821136951446533,
-0.02180304378271103,
0.037038654088974,
0.00667544174939394,
-0.004959261976182461,
-0.02595043182373047,
-0.057444535195827484,
-0.026868922635912895,
-0.0013864111388102174,
-0.0788833275437355,
0.05499541759490967,
-0.061533086001873016,
0.06284648925065994,
0.020217837765812874,
0.26075977087020874,
-0.0015637788455933332,
-0.2816687226295471,
-0.07838238030672073,
-0.021803870797157288,
-0.03908461704850197,
-0.04437991604208946,
0.012468870729207993,
0.09816136956214905,
-0.10429411381483078,
0.05220774561166763,
-0.05617605894804001,
0.08081542700529099,
-0.027871206402778625,
-0.0038764735218137503,
0.03104487992823124,
0.18317009508609772,
-0.016251729801297188,
0.04921228811144829,
-0.19876481592655182,
0.21493177115917206,
0.018374161794781685,
0.1328285038471222,
-0.04969325289130211,
0.008151582442224026,
0.023935623466968536,
0.0005833861068822443,
0.07652536779642105,
-0.005220294930040836,
-0.07562398165464401,
-0.12598879635334015,
-0.07458892464637756,
0.08030108362436295,
0.14005599915981293,
-0.015102341771125793,
0.10299085825681686,
-0.04885837808251381,
0.01855727843940258,
0.040218789130449295,
-0.06883063167333603,
-0.15654191374778748,
-0.09663692116737366,
-0.017512260004878044,
0.036228325217962265,
-0.09374281764030457,
-0.04657571390271187,
-0.07575266063213348,
-0.011581354774534702,
0.1155897006392479,
0.026329437270760536,
-0.019544146955013275,
-0.13715502619743347,
0.08616970479488373,
0.14976362884044647,
-0.07278116047382355,
0.024292122572660446,
-0.006805011536926031,
0.0643463134765625,
0.04392950236797333,
-0.09364058077335358,
0.04786597564816475,
-0.05783754587173462,
-0.16063907742500305,
-0.04680079594254494,
0.09157387912273407,
0.07232007384300232,
0.03956719860434532,
-0.0039313253946602345,
0.05048207566142082,
-0.021655134856700897,
-0.10046223551034927,
0.013403463177382946,
0.036238353699445724,
0.05060320347547531,
0.03576631844043732,
-0.0835539922118187,
0.06105545908212662,
-0.03207513689994812,
-0.004739605356007814,
0.11295295506715775,
0.2418098747730255,
-0.08943011611700058,
0.08531766384840012,
0.0564643070101738,
-0.0681108757853508,
-0.14201374351978302,
0.06333836168050766,
0.10392527282238007,
-0.0010979799553751945,
0.05633390322327614,
-0.1944044679403305,
0.14196276664733887,
0.11395719647407532,
-0.01216895878314972,
0.03921902924776077,
-0.27178776264190674,
-0.1178240180015564,
0.05929281562566757,
0.13260534405708313,
0.12270452082157135,
-0.13372722268104553,
-0.01346105057746172,
-0.017794722691178322,
-0.1270129382610321,
0.10619551688432693,
-0.11520305275917053,
0.13410261273384094,
-0.033877331763505936,
0.1094624474644661,
0.0045790765434503555,
-0.029679978266358376,
0.1078375056385994,
0.05079146847128868,
0.09768978506326675,
-0.0427112840116024,
0.011657798662781715,
0.05964370444417,
-0.048064153641462326,
0.012997311539947987,
-0.07163821160793304,
0.08260359615087509,
-0.12121900171041489,
-0.007215667981654406,
-0.0786302387714386,
0.04982674494385719,
-0.03730621561408043,
-0.0520988367497921,
-0.05300908908247948,
0.03628665208816528,
0.0551036112010479,
-0.03693530708551407,
0.052492547780275345,
-0.0007919789641164243,
0.09193731099367142,
0.02279367670416832,
0.06844945251941681,
-0.0013174605555832386,
-0.0486951544880867,
0.020607467740774155,
-0.010092084296047688,
0.059927504509687424,
-0.13819250464439392,
0.005080344155430794,
0.10591371357440948,
0.05097398906946182,
0.097067691385746,
0.04401935264468193,
-0.04710277169942856,
0.01174959260970354,
0.038469187915325165,
-0.11388308554887772,
-0.10152916610240936,
0.048884764313697815,
-0.04355713725090027,
-0.13837221264839172,
0.04681605100631714,
0.11486436426639557,
-0.04777701571583748,
-0.022848477587103844,
-0.018679436296224594,
0.006295592989772558,
-0.02210000716149807,
0.185195192694664,
0.04254329204559326,
0.04073292762041092,
-0.10271178185939789,
0.12899532914161682,
0.029757240787148476,
-0.02120950259268284,
0.05832904204726219,
0.08650463819503784,
-0.09547144919633865,
0.0022455293219536543,
0.09401259571313858,
0.17709305882453918,
-0.07081585377454758,
-0.01522546075284481,
-0.10493292659521103,
-0.07182559370994568,
0.061345212161540985,
0.15866559743881226,
0.056776680052280426,
-0.018432259559631348,
-0.04985366761684418,
0.041276976466178894,
-0.14113959670066833,
0.06124595180153847,
0.03231225907802582,
0.07059107720851898,
-0.08794401586055756,
0.05842875689268112,
0.007462202571332455,
0.004853290971368551,
-0.017095662653446198,
0.014639783650636673,
-0.09338793903589249,
-0.029561862349510193,
-0.07745250314474106,
0.01031596027314663,
-0.012647153809666634,
0.01643478497862816,
-0.01002135593444109,
-0.06785867363214493,
-0.07024963200092316,
0.03674706071615219,
-0.0773310735821724,
-0.05249942094087601,
0.013125881552696228,
0.042631592601537704,
-0.13283590972423553,
0.005875828675925732,
0.01526302844285965,
-0.08950145542621613,
0.0860297903418541,
0.08966095745563507,
0.026650220155715942,
0.03430354595184326,
-0.1286671906709671,
-0.033162012696266174,
0.0140140475705266,
0.0015245453687384725,
0.065943643450737,
-0.0960443988442421,
-0.0042524393647909164,
-0.02096637152135372,
0.07692737877368927,
0.01072115357965231,
0.08122992515563965,
-0.13038946688175201,
0.009611942805349827,
-0.08391276001930237,
-0.043650805950164795,
-0.0664370059967041,
0.015441840514540672,
0.10186775773763657,
0.05332145467400551,
0.16321980953216553,
-0.07710553705692291,
0.01851690001785755,
-0.20903247594833374,
-0.02806982956826687,
-0.006001665256917477,
-0.053328659385442734,
-0.13503585755825043,
-0.040615275502204895,
0.07713141292333603,
-0.03911115229129791,
0.10209358483552933,
-0.021783698350191116,
0.06051107123494148,
0.03870163857936859,
-0.03052501380443573,
-0.06309837847948074,
-0.02839021198451519,
0.19620588421821594,
0.07734153419733047,
-0.01628505066037178,
0.10712514817714691,
-0.004854035098105669,
0.05215137079358101,
0.029021041467785835,
0.20322665572166443,
0.205516055226326,
0.006393582560122013,
0.07007439434528351,
0.061823442578315735,
-0.08134276419878006,
-0.06668591499328613,
0.17979386448860168,
-0.027019783854484558,
0.07158506661653519,
-0.0292730163782835,
0.1871863603591919,
0.11258335411548615,
-0.15062710642814636,
0.031187448650598526,
-0.032352183014154434,
-0.07739948481321335,
-0.14029793441295624,
0.002870141062885523,
-0.0969613566994667,
-0.11888958513736725,
0.04491206631064415,
-0.12023243308067322,
0.05563066527247429,
0.08162978291511536,
0.012860282324254513,
0.033694952726364136,
0.12639780342578888,
-0.024936972185969353,
0.005612930748611689,
0.039668694138526917,
0.007209986913949251,
-0.028704136610031128,
-0.04071710258722305,
-0.07624232023954391,
0.04959488660097122,
0.005999820772558451,
0.08708648383617401,
-0.04484463483095169,
-0.0098541509360075,
0.04161207377910614,
-0.02883152663707733,
-0.07628129422664642,
0.024896850809454918,
0.036896560341119766,
0.0558246374130249,
0.04606591910123825,
0.04494428262114525,
-0.007039562799036503,
-0.03303902596235275,
0.27956390380859375,
-0.058299824595451355,
-0.09650824218988419,
-0.1145726814866066,
0.20528429746627808,
0.039621155709028244,
-0.02944580279290676,
0.03808869794011116,
-0.0823078602552414,
-0.011148780584335327,
0.15664871037006378,
0.15740583837032318,
-0.06310901790857315,
-0.024417787790298462,
-0.011304806917905807,
-0.01684996299445629,
-0.03944046422839165,
0.11582932621240616,
0.09526103734970093,
0.0015456678811460733,
-0.05340665951371193,
-0.026780899614095688,
-0.0356353260576725,
-0.014198093675076962,
-0.042463432997465134,
0.024492036551237106,
0.015405064448714256,
-0.02204938977956772,
-0.03367491438984871,
0.06283008307218552,
-0.000537228537723422,
-0.2422819882631302,
0.06177148222923279,
-0.1429269164800644,
-0.16920150816440582,
-0.0255315862596035,
0.049146492034196854,
-0.009868254885077477,
0.05076372250914574,
-0.023419279605150223,
-0.005228140857070684,
0.0819663479924202,
-0.020357763394713402,
-0.05629755184054375,
-0.12482188642024994,
0.1126413568854332,
-0.05955716222524643,
0.18034496903419495,
-0.017016146332025528,
0.06941591948270798,
0.11808996647596359,
0.04265660047531128,
-0.1395210474729538,
0.04682905599474907,
0.04639768972992897,
-0.11374947428703308,
0.019033094868063927,
0.14281359314918518,
-0.04590432718396187,
0.0837521106004715,
0.04397356137633324,
-0.09469857066869736,
-0.010067093186080456,
-0.04630451649427414,
-0.02669411338865757,
-0.07065777480602264,
-0.013260409235954285,
-0.06852459907531738,
0.17007088661193848,
0.19722911715507507,
-0.02565886825323105,
0.014177229255437851,
-0.0937165841460228,
0.02870573103427887,
0.06899075210094452,
0.036870408803224564,
-0.050676655024290085,
-0.20841248333454132,
0.019081654027104378,
0.05071026459336281,
-0.0027992762625217438,
-0.2320736050605774,
-0.07834679633378983,
0.03972028195858002,
-0.034023065119981766,
-0.05599227920174599,
0.09634395688772202,
0.036142461001873016,
0.04842698574066162,
-0.036872539669275284,
-0.14820757508277893,
-0.03788754343986511,
0.15496376156806946,
-0.1790139228105545,
-0.05022982507944107
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-1024-finetuned-squad-seed-42
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
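For readers who want to approximate this run, the sketch below maps the hyperparameters above onto Hugging Face `TrainingArguments`. It is an illustrative reconstruction rather than the actual training script; the output directory is a placeholder, and the Adam betas/epsilon are left at their defaults, which already match the values listed.
```python
from transformers import TrainingArguments

# Hedged reconstruction of the configuration listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults,
# so they are not set explicitly.
training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-1024-finetuned-squad-seed-42",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
)
```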
### Training results
| Metric      |             Value |
|:------------|------------------:|
| exact_match | 66.90633869441817 |
| f1          | 77.54482247690522 |
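One minimal way to exercise the checkpoint is through the `question-answering` pipeline, as sketched below; the question and context strings are invented for illustration.
```python
from transformers import pipeline

# Load the fine-tuned checkpoint for extractive question answering.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-42",
)

# Invented example inputs.
result = qa(
    question="How many training examples were used?",
    context="The checkpoint was fine-tuned on 1024 examples drawn from SQuAD.",
)
print(result["answer"], result["score"])
```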
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-1024-finetuned-squad-seed-42", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-42
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-1024-finetuned-squad-seed-42
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
{'exact_match': 66.90633869441817, 'f1': 77.54482247690522}
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
[
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10",
"### Training results\n\n{'exact_match': 66.90633869441817, 'f1': 77.54482247690522}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10",
"### Training results\n\n{'exact_match': 66.90633869441817, 'f1': 77.54482247690522}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
52,
46,
6,
12,
8,
3,
105,
35,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-1024-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10### Training results\n\n{'exact_match': 66.90633869441817, 'f1': 77.54482247690522}### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
-0.09826037287712097,
0.09277694672346115,
-0.002391536021605134,
0.07818219810724258,
0.14819078147411346,
0.04446703940629959,
0.12166230380535126,
0.10845138132572174,
-0.09997343271970749,
0.06589320302009583,
0.06757728010416031,
0.07498832792043686,
0.042413946241140366,
0.13263021409511566,
-0.05138673260807991,
-0.23718562722206116,
-0.0009666557889431715,
0.00017652643145993352,
-0.12706655263900757,
0.10548562556505203,
0.10597943514585495,
-0.09409990906715393,
0.08092797547578812,
0.005610298365354538,
-0.19301095604896545,
0.036943744868040085,
-0.0041224597953259945,
-0.03892757371068001,
0.10578668862581253,
0.02249351516366005,
0.10043583065271378,
0.007411791477352381,
0.11492300778627396,
-0.18037787079811096,
0.014257028698921204,
0.09033671766519547,
0.024922911077737808,
0.10212204605340958,
0.04536442458629608,
-0.03944724425673485,
0.12311089038848877,
-0.1203368529677391,
0.07447987794876099,
0.04536781460046768,
-0.09854276478290558,
-0.21042223274707794,
-0.10829571634531021,
0.046919383108615875,
0.047954969108104706,
0.08594821393489838,
0.010013366118073463,
0.1659352332353592,
-0.06413434445858002,
0.07918427884578705,
0.24572736024856567,
-0.28098124265670776,
-0.08473639190196991,
0.05460905656218529,
0.04523221775889397,
0.04270993173122406,
-0.10686107724905014,
-0.0042828405275940895,
0.025974173098802567,
0.028362859040498734,
0.0830388218164444,
-0.0355742871761322,
-0.12136291712522507,
-0.005757697857916355,
-0.11634610593318939,
0.014651340432465076,
0.11478476226329803,
0.04917412996292114,
-0.052220992743968964,
-0.03146963566541672,
-0.07372859865427017,
-0.07349079847335815,
-0.02141503244638443,
-0.03919408097863197,
0.04881913959980011,
-0.06018480658531189,
-0.11020401865243912,
-0.04049227386713028,
-0.05313308537006378,
-0.07885351032018661,
-0.034395188093185425,
0.2268049269914627,
0.029265716671943665,
0.0426003560423851,
-0.05764191970229149,
0.10388302803039551,
-0.02549068070948124,
-0.13558098673820496,
-0.01994481310248375,
-0.002728254534304142,
-0.10008623450994492,
-0.03651630878448486,
-0.05803797394037247,
0.01312266755849123,
0.023947808891534805,
0.23900489509105682,
-0.07497120648622513,
0.08187773823738098,
0.040206264704465866,
-0.01764175482094288,
-0.030803903937339783,
0.1323060244321823,
-0.06182406097650528,
-0.0790335163474083,
0.011890839785337448,
0.0536203496158123,
0.023163367062807083,
0.00023543126008007675,
-0.055843252688646317,
-0.03935522958636284,
0.05840586498379707,
0.0334775485098362,
-0.036187030375003815,
0.03191839158535004,
-0.002377989236265421,
-0.03147229552268982,
0.019957784563302994,
-0.11003002524375916,
0.01726633310317993,
-0.0016726787434890866,
-0.09871319681406021,
-0.02609821781516075,
0.029344316571950912,
0.001002845587208867,
0.0023985847365111113,
0.10361849516630173,
-0.0965307205915451,
-0.007827097550034523,
-0.09040752053260803,
-0.09429434686899185,
-0.0028452470432966948,
-0.13510486483573914,
0.006151446606963873,
-0.05925745889544487,
-0.159810870885849,
-0.038582731038331985,
0.04879963397979736,
-0.06894247233867645,
-0.02522437460720539,
-0.00193815550301224,
-0.08844497054815292,
0.026397718116641045,
-0.00011015603377018124,
0.2012365609407425,
-0.04665372893214226,
0.06451728194952011,
0.031803570687770844,
0.044914428144693375,
-0.03170621767640114,
0.027087755501270294,
-0.081035315990448,
0.018939727917313576,
-0.1730431765317917,
0.05482945591211319,
-0.08880694210529327,
0.010802357457578182,
-0.13099420070648193,
-0.08842309564352036,
-0.0021900602150708437,
-0.016709523275494576,
0.08850026875734329,
0.10615314543247223,
-0.14370110630989075,
-0.02470654621720314,
0.10504423826932907,
-0.07455728948116302,
-0.08728732913732529,
0.05132775381207466,
-0.0501285195350647,
0.04898703098297119,
0.03546430915594101,
0.1455293446779251,
0.09329085052013397,
-0.11991778016090393,
-0.03381279855966568,
0.002423606114462018,
0.03203146159648895,
-0.0015298834769055247,
0.049554355442523956,
-0.003646186552941799,
0.029613588005304337,
0.008640487678349018,
-0.07173831015825272,
-0.0141220111399889,
-0.09898146986961365,
-0.062088433653116226,
-0.05832919478416443,
-0.0867449939250946,
-0.010744280181825161,
0.024520915001630783,
0.028380705043673515,
-0.07373419404029846,
-0.10364947468042374,
0.11235323548316956,
0.13550256192684174,
-0.03878233581781387,
0.0175143051892519,
-0.07951072603464127,
-0.011812358163297176,
-0.02194357104599476,
-0.03398789092898369,
-0.21399104595184326,
-0.11463349312543869,
0.010055997408926487,
-0.04184378683567047,
0.04861443489789963,
-0.019529271870851517,
0.08554946631193161,
0.036013662815093994,
-0.049534622579813004,
0.004397208336740732,
-0.08040058612823486,
-0.014130529947578907,
-0.08982445299625397,
-0.21587032079696655,
-0.08249396830797195,
-0.015750275924801826,
0.17391516268253326,
-0.1769692599773407,
0.006882813759148121,
-0.016853634268045425,
0.145027294754982,
0.024051859974861145,
-0.052660539746284485,
-0.016587017104029655,
0.030812757089734077,
0.002998847048729658,
-0.0962805449962616,
0.027127353474497795,
0.00851449090987444,
-0.07205557078123093,
-0.04541059583425522,
-0.1496613621711731,
0.04362058639526367,
0.08188185095787048,
0.04683569446206093,
-0.09630528092384338,
0.004833058454096317,
-0.06109997630119324,
-0.042818620800971985,
-0.05307940021157265,
0.02314675599336624,
0.1433742344379425,
0.017944637686014175,
0.10291074216365814,
-0.06392772495746613,
-0.06877247989177704,
0.01127247791737318,
0.007309648208320141,
0.041346561163663864,
0.08772262930870056,
0.09488002955913544,
-0.10801605135202408,
0.08042716234922409,
0.06207509711384773,
-0.07586418837308884,
0.1241861879825592,
-0.0377633161842823,
-0.06361082941293716,
-0.0437314547598362,
-0.030096886679530144,
-0.009701848030090332,
0.14657801389694214,
-0.03721651807427406,
0.03218403458595276,
0.038297127932310104,
0.019633663818240166,
0.03245704621076584,
-0.16378498077392578,
-0.0014367165276780725,
0.0036531880032271147,
-0.0380977988243103,
-0.0288544874638319,
0.003169178729876876,
0.03715474531054497,
0.09303693473339081,
0.027056995779275894,
-0.0034400050062686205,
-0.000545389368198812,
-0.0038291537202894688,
-0.06700713187456131,
0.22433724999427795,
-0.0980796292424202,
-0.05328121408820152,
-0.10664643347263336,
0.021555406972765923,
-0.054155319929122925,
-0.04728308692574501,
0.005210948642343283,
-0.08045956492424011,
-0.05590404197573662,
-0.0597061887383461,
-0.0005678175366483629,
-0.013253606855869293,
0.003598528914153576,
0.01891985535621643,
-0.0017002193490043283,
0.07974644750356674,
-0.14440274238586426,
0.014132427982985973,
-0.058003831654787064,
-0.11108400672674179,
0.010439596138894558,
0.07699496299028397,
0.09009288251399994,
0.08790876716375351,
-0.024913519620895386,
0.02368570677936077,
-0.018353542312979698,
0.2523992955684662,
-0.07256507873535156,
0.012780692428350449,
0.13805194199085236,
0.026374686509370804,
0.045272909104824066,
0.09570483863353729,
0.0500328429043293,
-0.08939627557992935,
0.028032343834638596,
0.10442592203617096,
-0.03196367621421814,
-0.26376545429229736,
-0.016707995906472206,
-0.019192999228835106,
-0.10227957367897034,
0.07231421023607254,
0.03824901208281517,
-0.006890513002872467,
0.07473073899745941,
-0.012244329787790775,
0.008642110973596573,
-0.03580465167760849,
0.07752706855535507,
0.09708120673894882,
0.07880837470293045,
0.12290144711732864,
-0.04570909962058067,
-0.006659029982984066,
0.06317982822656631,
0.0026003792881965637,
0.26820775866508484,
-0.035185400396585464,
0.07840146869421005,
0.05154880881309509,
0.1312180459499359,
-0.033707182854413986,
0.05395650863647461,
0.014449147507548332,
-0.01838797703385353,
0.0021574371494352818,
-0.06605303287506104,
0.0021730028092861176,
0.007559353951364756,
-0.07972127199172974,
0.05964566394686699,
-0.05682648718357086,
0.05396118387579918,
0.020051119849085808,
0.27049684524536133,
0.005266278050839901,
-0.2869894206523895,
-0.09031736850738525,
-0.014409102499485016,
-0.009315147064626217,
-0.06353543698787689,
-0.012042602524161339,
0.10289973020553589,
-0.11958523094654083,
0.07766038924455643,
-0.06462620198726654,
0.09232980012893677,
0.005274526309221983,
-0.006379222963005304,
0.07106433063745499,
0.18083566427230835,
-0.022779393941164017,
0.04200347140431404,
-0.19722920656204224,
0.22698736190795898,
0.029524128884077072,
0.12230014055967331,
-0.03521811589598656,
0.014286192134022713,
0.033686913549900055,
0.023558827117085457,
0.04971982538700104,
-0.0035679317079484463,
-0.052314627915620804,
-0.13294434547424316,
-0.047645650804042816,
0.0659237802028656,
0.14007267355918884,
-0.03030550293624401,
0.1040436252951622,
-0.04070301726460457,
0.0035482062958180904,
0.0418923981487751,
-0.05741847679018974,
-0.1542574167251587,
-0.0804755762219429,
0.003004674334079027,
0.015576834790408611,
-0.05045256018638611,
-0.060554251074790955,
-0.08780612051486969,
-0.004503726959228516,
0.12795251607894897,
-0.02774842455983162,
-0.03995838388800621,
-0.14639979600906372,
0.08536767959594727,
0.1630188673734665,
-0.07754526287317276,
0.03221774846315384,
0.0002631166425999254,
0.08576542139053345,
0.03485202416777611,
-0.11076401174068451,
0.06565040349960327,
-0.06606041640043259,
-0.1721700131893158,
-0.026851825416088104,
0.10083867609500885,
0.07120037078857422,
0.03589488938450813,
-0.01822970062494278,
0.05325889587402344,
-0.01717262528836727,
-0.10136089473962784,
0.015457072295248508,
0.003871844382956624,
0.04035552591085434,
0.06048844754695892,
-0.05026514455676079,
0.03375215083360672,
-0.02913854643702507,
0.005738188978284597,
0.08385691046714783,
0.23290245234966278,
-0.09793559461832047,
0.03674681857228279,
0.03695668280124664,
-0.06599779427051544,
-0.16299450397491455,
0.09976579993963242,
0.11988998204469681,
-0.008159693330526352,
0.053201571106910706,
-0.2078145146369934,
0.16611982882022858,
0.13018211722373962,
-0.025446036830544472,
0.06836534291505814,
-0.24922813475131989,
-0.14240190386772156,
0.07402174919843674,
0.11072538793087006,
0.07718050479888916,
-0.153092160820961,
-0.02725799009203911,
-0.03186293691396713,
-0.2037912905216217,
0.1544003039598465,
-0.14562296867370605,
0.10679392516613007,
-0.020613111555576324,
0.09810822457075119,
0.014387661591172218,
-0.030092786997556686,
0.13092100620269775,
0.059743426740169525,
0.11091695725917816,
-0.02954108454287052,
0.00912100076675415,
0.06802086532115936,
-0.0466308668255806,
0.01476253941655159,
-0.039326850324869156,
0.07622141391038895,
-0.12346012890338898,
0.007655580062419176,
-0.09243589639663696,
0.07140180468559265,
-0.05056961625814438,
-0.04800541326403618,
-0.043753258883953094,
0.03896275907754898,
0.01541421003639698,
-0.03773416206240654,
0.08181485533714294,
0.0002752969739958644,
0.14272239804267883,
0.09198659658432007,
0.0832751914858818,
-0.02434844896197319,
-0.06528829783201218,
0.026464195922017097,
-0.011698581278324127,
0.06322738528251648,
-0.13517342507839203,
0.016960734501481056,
0.12109890580177307,
0.07409286499023438,
0.09169391542673111,
0.04910879582166672,
-0.052081748843193054,
0.0012019657297059894,
0.042355503886938095,
-0.1175202876329422,
-0.11480919271707535,
0.0214646328240633,
-0.06160842999815941,
-0.12983490526676178,
0.06013362109661102,
0.12728473544120789,
-0.03366396576166153,
-0.019394634291529655,
-0.008839468471705914,
0.002909056842327118,
-0.024256078526377678,
0.20301851630210876,
0.06235363706946373,
0.0638195052742958,
-0.10940469801425934,
0.12537208199501038,
0.03488650918006897,
-0.04249487817287445,
0.03765164688229561,
0.10179968178272247,
-0.08141886442899704,
0.0011830262374132872,
0.067408986389637,
0.12483926117420197,
-0.1171661913394928,
-0.03427911922335625,
-0.11303478479385376,
-0.09602716565132141,
0.041685800999403,
0.21201829612255096,
0.05375136435031891,
-0.019049782305955887,
-0.01680481806397438,
0.03778994083404541,
-0.13548339903354645,
0.056122079491615295,
0.044270094484090805,
0.08703465759754181,
-0.09630956500768661,
0.1132209450006485,
0.014330744743347168,
0.019643040373921394,
-0.014309671707451344,
0.012808527797460556,
-0.10627207159996033,
-0.03322862088680267,
-0.1232604831457138,
-0.010905159637331963,
-0.010247558355331421,
0.007801351603120565,
-0.019084356725215912,
-0.07383547723293304,
-0.08026184886693954,
0.04839685931801796,
-0.08252149820327759,
-0.05208093672990799,
0.02005528099834919,
0.0171072855591774,
-0.15397557616233826,
0.009881807491183281,
0.028933165594935417,
-0.08585695922374725,
0.08129510283470154,
0.09277230501174927,
0.04062008485198021,
0.032944709062576294,
-0.1256210058927536,
-0.04458450898528099,
-0.003347172401845455,
0.0016070368001237512,
0.0719400942325592,
-0.10452447086572647,
-0.0037128336261957884,
-0.039336685091257095,
0.08418954908847809,
0.006367153022438288,
0.09971246123313904,
-0.12790797650814056,
0.01508051436394453,
-0.06866810470819473,
-0.022994529455900192,
-0.07286015897989273,
0.02962503209710121,
0.11081760376691818,
0.05303482338786125,
0.14714521169662476,
-0.07079111039638519,
0.011018308810889721,
-0.22571709752082825,
-0.02450084686279297,
-0.021002881228923798,
-0.05926521122455597,
-0.10534865409135818,
-0.014180667698383331,
0.08208710700273514,
-0.05004536360502243,
0.09382043778896332,
-0.01668590120971203,
0.07179584354162216,
0.047949258238077164,
-0.016011977568268776,
-0.06822352856397629,
-0.004565382841974497,
0.16999711096286774,
0.0695752501487732,
-0.0013493805890902877,
0.10726470500230789,
0.022072307765483856,
0.03672393038868904,
0.02481946162879467,
0.2202777862548828,
0.16904541850090027,
-0.025584103539586067,
0.06965101510286331,
0.07809031009674072,
-0.11384530365467072,
-0.06948207318782806,
0.1328028291463852,
-0.03140765428543091,
0.07536238431930542,
-0.047690197825431824,
0.15096156299114227,
0.11297659575939178,
-0.15952332317829132,
0.04258184880018234,
-0.04921315610408783,
-0.08215968310832977,
-0.14539143443107605,
0.027066467329859734,
-0.08231515437364578,
-0.12985847890377045,
0.03288416191935539,
-0.13395394384860992,
0.06155357137322426,
0.14062651991844177,
0.008143887855112553,
0.035591017454862595,
0.1426917016506195,
-0.04800545051693916,
-0.0022500574123114347,
0.02902616746723652,
0.014091401360929012,
-0.008911622688174248,
-0.023350294679403305,
-0.06655821949243546,
0.05655466392636299,
0.01176667120307684,
0.07567788660526276,
-0.050938017666339874,
-0.012412980198860168,
0.03919763118028641,
-0.018437305465340614,
-0.08191299438476562,
0.018305214121937752,
0.04462772235274315,
0.05196910724043846,
0.04092561826109886,
0.03349395841360092,
0.015268468298017979,
-0.044211529195308685,
0.31121668219566345,
-0.07133358716964722,
-0.10180628299713135,
-0.13214989006519318,
0.24755129218101501,
0.025720449164509773,
-0.024175520986318588,
0.04959241300821304,
-0.0910671129822731,
-0.020658308640122414,
0.13033893704414368,
0.15830616652965546,
-0.06504015624523163,
-0.023306595161557198,
-0.0037608228158205748,
-0.018102318048477173,
-0.029519587755203247,
0.11973059922456741,
0.0924350693821907,
0.039986204355955124,
-0.06703741103410721,
-0.014133986085653305,
-0.014416846446692944,
-0.028123289346694946,
-0.04905662313103676,
0.060355644673109055,
0.00892766285687685,
-0.015214699320495129,
-0.015567605383694172,
0.067191943526268,
-0.0014521757839247584,
-0.21958476305007935,
0.07212783396244049,
-0.16519853472709656,
-0.18288719654083252,
-0.03663714975118637,
0.04377191141247749,
-0.015093786641955376,
0.06147467717528343,
-0.012756330892443657,
-0.023714669048786163,
0.10183744132518768,
-0.010273636318743229,
-0.055602673441171646,
-0.1499561071395874,
0.11021604388952255,
-0.11563409864902496,
0.19434189796447754,
-0.03016829676926136,
0.06717388331890106,
0.12174658477306366,
0.022745858877897263,
-0.14633429050445557,
0.023514067754149437,
0.05261407047510147,
-0.13858899474143982,
0.016584007069468498,
0.1356945037841797,
-0.04548861086368561,
0.08819715678691864,
0.033382631838321686,
-0.14303068816661835,
-0.015002213418483734,
-0.017565470188856125,
-0.029277615249156952,
-0.07336214184761047,
-0.024541469290852547,
-0.07235131412744522,
0.15302036702632904,
0.2167034149169922,
-0.029646296054124832,
0.01800421252846718,
-0.09373574703931808,
0.021108413115143776,
0.07222285866737366,
0.07503771781921387,
-0.04625643417239189,
-0.21958371996879578,
0.05292560160160065,
0.033587969839572906,
-0.006716154515743256,
-0.23308458924293518,
-0.05535319074988365,
0.04078212007880211,
-0.04808225855231285,
-0.022264936938881874,
0.09504608064889908,
0.06220424920320511,
0.05828741937875748,
-0.03805069997906685,
-0.13129207491874695,
-0.04404667019844055,
0.1627892702817917,
-0.1801033616065979,
-0.057325053960084915
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-1024-finetuned-squad-seed-6
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
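The linear scheduler with `warmup_ratio: 0.1` ramps the learning rate from 0 up to 3e-05 over the first 10% of optimizer steps and then decays it linearly back to 0. A hedged sketch of the equivalent manual setup follows; the step count is an estimate based on the k=1024 training examples implied by the model name, batch size 24, and 10 epochs, assuming no gradient accumulation.
```python
from torch.optim import AdamW
from transformers import AutoModelForQuestionAnswering, get_linear_schedule_with_warmup

model = AutoModelForQuestionAnswering.from_pretrained("roberta-base")

# Estimated: ceil(1024 / 24) steps per epoch * 10 epochs = 430 steps.
num_training_steps = 430
num_warmup_steps = int(0.1 * num_training_steps)  # warmup_ratio = 0.1

optimizer = AdamW(model.parameters(), lr=3e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=num_warmup_steps,
    num_training_steps=num_training_steps,
)
```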
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-1024-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-1024-finetuned-squad-seed-6
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-1024-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07831566035747528,
0.09802919626235962,
-0.0025687559973448515,
0.0775735154747963,
0.13272961974143982,
0.034277863800525665,
0.10747580975294113,
0.12228147685527802,
-0.12623465061187744,
0.0677027478814125,
0.09150905162096024,
0.08669858425855637,
0.03309624269604683,
0.12999247014522552,
-0.03904004395008087,
-0.23222599923610687,
0.008330538868904114,
-0.01731536164879799,
-0.06575145572423935,
0.10199610143899918,
0.08851628005504608,
-0.09755734354257584,
0.0807613879442215,
-0.003091010032221675,
-0.1860070675611496,
0.028167951852083206,
-0.020814722403883934,
-0.049336813390254974,
0.09869252145290375,
-0.0056634871289134026,
0.08268345892429352,
0.006945311091840267,
0.11986391246318817,
-0.1908724009990692,
0.013499009422957897,
0.07290536165237427,
0.03263096138834953,
0.09493016451597214,
0.021434446796774864,
-0.0027443624567240477,
0.1652471274137497,
-0.13380147516727448,
0.09923916310071945,
0.029409348964691162,
-0.08726051449775696,
-0.16855348646640778,
-0.09711141139268875,
0.014816241338849068,
0.037091709673404694,
0.08788561820983887,
0.008047440089285374,
0.17744605243206024,
-0.09272074699401855,
0.08042964339256287,
0.23873503506183624,
-0.27866971492767334,
-0.07914502918720245,
0.04480017349123955,
0.05023091658949852,
0.07877179235219955,
-0.11910542100667953,
-0.016351839527487755,
0.021504312753677368,
0.029964439570903778,
0.09162572771310806,
-0.025206435471773148,
-0.08582647144794464,
-0.007379377260804176,
-0.1203220784664154,
0.004998097196221352,
0.11232495307922363,
0.04495035111904144,
-0.050389889627695084,
-0.049273259937763214,
-0.05855514854192734,
-0.0727735161781311,
-0.0345044881105423,
-0.03500664606690407,
0.04393456131219864,
-0.05677402392029762,
-0.11912011355161667,
-0.03344166651368141,
-0.04831337556242943,
-0.07836679369211197,
-0.02038857527077198,
0.20406293869018555,
0.04879011958837509,
0.03336681053042412,
-0.05150603875517845,
0.08167676627635956,
0.010681210085749626,
-0.12883001565933228,
-0.02949354611337185,
0.003376246662810445,
-0.08046291768550873,
-0.03911083936691284,
-0.05592677742242813,
0.01652236469089985,
0.0425434336066246,
0.21182411909103394,
-0.053522247821092606,
0.0841071680188179,
0.03165014833211899,
-0.020074142143130302,
-0.02127760462462902,
0.12418156862258911,
-0.01957845129072666,
-0.07779166102409363,
0.027620676904916763,
0.05792228877544403,
0.024668985977768898,
0.002793353982269764,
-0.05645624175667763,
-0.032157693058252335,
0.08388832211494446,
0.03227423503994942,
-0.061697158962488174,
0.02571721561253071,
0.0019159362418577075,
-0.01915794052183628,
0.006507304031401873,
-0.11425658315420151,
0.017568621784448624,
-0.0070627727545797825,
-0.07396106421947479,
-0.015482005663216114,
0.009401406161487103,
-0.015871286392211914,
0.010625401511788368,
0.09869785606861115,
-0.08977814763784409,
-0.021486520767211914,
-0.07642559707164764,
-0.0703384131193161,
-0.001091420534066856,
-0.14959336817264557,
0.013343611732125282,
-0.0722474604845047,
-0.15533015131950378,
-0.03433100879192352,
0.04220503941178322,
-0.0715920627117157,
-0.0287161935120821,
-0.03902895748615265,
-0.07771095633506775,
0.025935957208275795,
0.003556953277438879,
0.1895713061094284,
-0.05247248709201813,
0.07958575338125229,
0.020224565640091896,
0.05118978023529053,
-0.022417891770601273,
0.030981460586190224,
-0.08777832984924316,
0.004303798079490662,
-0.16837233304977417,
0.06095125898718834,
-0.07655753940343857,
0.0176909901201725,
-0.12943845987319946,
-0.08636019378900528,
-0.02648247964680195,
-0.028415847569704056,
0.08288168162107468,
0.101236991584301,
-0.13936302065849304,
-0.026110779494047165,
0.11380600929260254,
-0.07840238511562347,
-0.05771040916442871,
0.06543231755495071,
-0.06733570247888565,
0.05239558592438698,
0.05701198801398277,
0.18308240175247192,
0.07247133553028107,
-0.11615002900362015,
-0.022620512172579765,
0.0008236428839154541,
0.02869309112429619,
-0.011450006626546383,
0.05013328418135643,
0.010115932673215866,
0.022925397381186485,
0.01626509428024292,
-0.03757760673761368,
0.0022913143038749695,
-0.09745527058839798,
-0.06210760399699211,
-0.057830121368169785,
-0.08211026340723038,
-0.03675239533185959,
0.013365413062274456,
0.03755959868431091,
-0.0813823938369751,
-0.0832531526684761,
0.0817883163690567,
0.14146754145622253,
-0.042445436120033264,
0.020837079733610153,
-0.07290433347225189,
0.019454048946499825,
-0.0546288937330246,
-0.031127262860536575,
-0.2051810920238495,
-0.06661128997802734,
0.033062148839235306,
-0.02578563243150711,
0.0562206469476223,
0.005890147294849157,
0.07270999252796173,
0.036974355578422546,
-0.03717736527323723,
0.005496557801961899,
-0.08948264271020889,
-0.006661755498498678,
-0.09458411484956741,
-0.22277192771434784,
-0.039091404527425766,
-0.03147432953119278,
0.12664686143398285,
-0.16380637884140015,
-0.008846414275467396,
-0.033559203147888184,
0.11989197134971619,
0.029342077672481537,
-0.06299111247062683,
-0.01546973455697298,
0.02984819933772087,
0.0021454307716339827,
-0.09267579764127731,
0.032377488911151886,
0.01641729287803173,
-0.07071275264024734,
-0.05824364721775055,
-0.11767998337745667,
0.005072372034192085,
0.07844309508800507,
0.06370808184146881,
-0.09821218252182007,
0.006577790714800358,
-0.06467227637767792,
-0.03337443619966507,
-0.05658299848437309,
0.04077354818582535,
0.17576389014720917,
0.005715875420719385,
0.10673356801271439,
-0.0793600007891655,
-0.07397963106632233,
0.022202521562576294,
0.00550835533067584,
0.043296895921230316,
0.09185819327831268,
0.11193765699863434,
-0.12787863612174988,
0.0655319094657898,
0.08339925855398178,
-0.06079655885696411,
0.12447672337293625,
-0.03880874067544937,
-0.07978326827287674,
-0.03468102216720581,
-0.01789737492799759,
-0.014433711767196655,
0.13662520051002502,
-0.052451591938734055,
0.024721957743167877,
0.029559936374425888,
0.03977212682366371,
0.02007845789194107,
-0.15286976099014282,
-0.0028130721766501665,
0.008014791645109653,
-0.04270423948764801,
-0.01732778549194336,
0.022053590044379234,
0.01889151707291603,
0.09763132780790329,
0.033827926963567734,
-0.017191892489790916,
-0.0072258710861206055,
-0.004126593470573425,
-0.05240200459957123,
0.19097702205181122,
-0.0903232991695404,
-0.039318062365055084,
-0.0766068547964096,
-0.002018180675804615,
-0.0374523289501667,
-0.042178936302661896,
0.027541467919945717,
-0.0892564058303833,
-0.038695015013217926,
-0.07363267987966537,
-0.0020975430961698294,
-0.048089273273944855,
0.025543171912431717,
0.030463339760899544,
0.004374985583126545,
0.06121865659952164,
-0.13520821928977966,
0.004608182702213526,
-0.07402762025594711,
-0.10626225173473358,
0.016752559691667557,
0.0640200600028038,
0.09063590317964554,
0.05891536548733711,
-0.028206318616867065,
0.02196338400244713,
-0.03102535754442215,
0.2538876533508301,
-0.056023936718702316,
-0.0008907181909307837,
0.1079995259642601,
0.024387193843722343,
0.0456567108631134,
0.0937986820936203,
0.03446201607584953,
-0.10283436626195908,
0.030045464634895325,
0.08531787246465683,
-0.03468463569879532,
-0.24060697853565216,
-0.005340357776731253,
-0.0350441075861454,
-0.11501707136631012,
0.08278592675924301,
0.05072171986103058,
-0.04239490628242493,
0.06456194818019867,
0.00940057821571827,
0.023100202903151512,
-0.05032946914434433,
0.09305739402770996,
0.09986494481563568,
0.07337409257888794,
0.10259620100259781,
-0.047868113964796066,
-0.020447028800845146,
0.06872943043708801,
-0.004597989842295647,
0.2949173152446747,
-0.025372980162501335,
0.06783269345760345,
0.05357075110077858,
0.13839465379714966,
-0.022259138524532318,
0.03763166442513466,
0.006550253368914127,
-0.004987536929547787,
-0.025867585092782974,
-0.057317283004522324,
-0.027098597958683968,
-0.0014259854797273874,
-0.07939993590116501,
0.05529754236340523,
-0.06157984212040901,
0.06318319588899612,
0.019378170371055603,
0.2613871097564697,
-0.0013915469171479344,
-0.28088417649269104,
-0.07782940566539764,
-0.02199469320476055,
-0.03943922370672226,
-0.04498637095093727,
0.012248540297150612,
0.09790370613336563,
-0.10396456718444824,
0.051648836582899094,
-0.05652175471186638,
0.081486776471138,
-0.026772692799568176,
-0.004546423442661762,
0.030105292797088623,
0.18326228857040405,
-0.016070300713181496,
0.04975070431828499,
-0.19926469027996063,
0.21668195724487305,
0.0182656142860651,
0.13269543647766113,
-0.0499923974275589,
0.00819060392677784,
0.023258518427610397,
-0.0013353109825402498,
0.07684347033500671,
-0.005137294065207243,
-0.07620354741811752,
-0.12557756900787354,
-0.07408318668603897,
0.0800393596291542,
0.14072395861148834,
-0.01548776775598526,
0.1024726927280426,
-0.048969414085149765,
0.018789013847708702,
0.040921855717897415,
-0.06851888447999954,
-0.15677978098392487,
-0.09676206111907959,
-0.017097346484661102,
0.035606320947408676,
-0.09451381117105484,
-0.046515364199876785,
-0.07565054297447205,
-0.010576745495200157,
0.11546829342842102,
0.02636021003127098,
-0.019178904592990875,
-0.13702167570590973,
0.08678080886602402,
0.14998175203800201,
-0.07309350371360779,
0.024267269298434258,
-0.007120953407138586,
0.06427866965532303,
0.04308968782424927,
-0.09386114776134491,
0.048486921936273575,
-0.057841695845127106,
-0.16114991903305054,
-0.046732813119888306,
0.09139132499694824,
0.07268821448087692,
0.039837926626205444,
-0.004076710436493158,
0.050857771188020706,
-0.021601373329758644,
-0.10025738924741745,
0.013649959117174149,
0.036450181156396866,
0.05065341666340828,
0.036058247089385986,
-0.08316148817539215,
0.06015168875455856,
-0.032752860337495804,
-0.005291207227855921,
0.1126965656876564,
0.24284282326698303,
-0.08929266780614853,
0.08561702072620392,
0.05679263547062874,
-0.06840071082115173,
-0.14229600131511688,
0.06368527561426163,
0.10450805723667145,
-0.0014875404303893447,
0.0570504330098629,
-0.19440679252147675,
0.14202407002449036,
0.11394678801298141,
-0.01245474349707365,
0.03839944303035736,
-0.2722442150115967,
-0.11783981323242188,
0.05919967219233513,
0.1324119120836258,
0.12130937725305557,
-0.13340331614017487,
-0.01337104756385088,
-0.017449818551540375,
-0.1264839768409729,
0.10692109167575836,
-0.11357207596302032,
0.13440996408462524,
-0.03421180322766304,
0.10949728637933731,
0.004712828900665045,
-0.030068721622228622,
0.10686063021421432,
0.05097272992134094,
0.09775908291339874,
-0.04233326017856598,
0.01139714103192091,
0.059877097606658936,
-0.04820217937231064,
0.013178929686546326,
-0.07178054004907608,
0.08301492035388947,
-0.12040592730045319,
-0.006885865703225136,
-0.0791037455201149,
0.05013106018304825,
-0.03725204989314079,
-0.05197904258966446,
-0.05306514725089073,
0.03648946061730385,
0.055632371455430984,
-0.037124138325452805,
0.05358012393116951,
-0.0004107025742996484,
0.09323256462812424,
0.023126639425754547,
0.06874963641166687,
-0.0007835288415662944,
-0.04789804294705391,
0.019830184057354927,
-0.009422542527318,
0.060023363679647446,
-0.1389552503824234,
0.004861701745539904,
0.10577105730772018,
0.05177406966686249,
0.09698210656642914,
0.04445236548781395,
-0.047739528119564056,
0.01177914347499609,
0.03766288235783577,
-0.11329739540815353,
-0.10228176414966583,
0.04903896525502205,
-0.04026925563812256,
-0.13904337584972382,
0.046980686485767365,
0.11355968564748764,
-0.048611707985401154,
-0.022956322878599167,
-0.018593376502394676,
0.0066942791454494,
-0.021528473123908043,
0.18583343923091888,
0.042500089854002,
0.04129355773329735,
-0.10249876976013184,
0.12946563959121704,
0.029507864266633987,
-0.021826516836881638,
0.05846009775996208,
0.08621029555797577,
-0.09492150694131851,
0.0024302073288708925,
0.09509160369634628,
0.17584849894046783,
-0.07057375460863113,
-0.014038711786270142,
-0.10414604097604752,
-0.07146887481212616,
0.06167963892221451,
0.15958821773529053,
0.056664276868104935,
-0.018209706991910934,
-0.04966399446129799,
0.04167281091213226,
-0.14155468344688416,
0.06139229238033295,
0.03203078359365463,
0.07073517143726349,
-0.08758485317230225,
0.05720153823494911,
0.00770128658041358,
0.0050706625916063786,
-0.01687709055840969,
0.014931363984942436,
-0.09306154400110245,
-0.029673617333173752,
-0.07546992599964142,
0.010216325521469116,
-0.012684658169746399,
0.015909982845187187,
-0.010430222377181053,
-0.0681195929646492,
-0.06943226605653763,
0.03685726597905159,
-0.07754578441381454,
-0.05297556519508362,
0.012548168189823627,
0.04236100986599922,
-0.13314253091812134,
0.006033703219145536,
0.015590718016028404,
-0.08927087485790253,
0.08524539321660995,
0.08921214938163757,
0.026970788836479187,
0.034523606300354004,
-0.13032546639442444,
-0.03284066915512085,
0.014284864999353886,
0.0016177636571228504,
0.06566763669252396,
-0.09562709927558899,
-0.004565778188407421,
-0.021019861102104187,
0.07697679847478867,
0.010341066867113113,
0.08049681782722473,
-0.13096970319747925,
0.008751634508371353,
-0.08490004390478134,
-0.04428362846374512,
-0.06586670875549316,
0.01572478748857975,
0.10199844092130661,
0.05345754325389862,
0.1630898267030716,
-0.07682804763317108,
0.01913268305361271,
-0.20918233692646027,
-0.02790805697441101,
-0.005891728214919567,
-0.053279899060726166,
-0.13540878891944885,
-0.03997969999909401,
0.0772811770439148,
-0.03892729803919792,
0.09991627186536789,
-0.021686341613531113,
0.06101437658071518,
0.03883754089474678,
-0.030144235119223595,
-0.06326649338006973,
-0.02856556326150894,
0.19633163511753082,
0.07723144441843033,
-0.015841910615563393,
0.10857177525758743,
-0.004853746388107538,
0.0520513579249382,
0.03000856749713421,
0.2047373205423355,
0.20634417235851288,
0.005157736595720053,
0.07032521814107895,
0.06175218150019646,
-0.0818057656288147,
-0.06666165590286255,
0.17986688017845154,
-0.026439229026436806,
0.07145323604345322,
-0.029314210638403893,
0.18767252564430237,
0.11275763809680939,
-0.15050667524337769,
0.03159407153725624,
-0.03252135589718819,
-0.07708431780338287,
-0.14034223556518555,
0.0029039077926427126,
-0.09687582403421402,
-0.11882486194372177,
0.045360881835222244,
-0.12056321650743484,
0.05612574890255928,
0.08192005753517151,
0.012763641774654388,
0.033816978335380554,
0.12701451778411865,
-0.024375855922698975,
0.005420073866844177,
0.03988061100244522,
0.007147608790546656,
-0.02881692722439766,
-0.040976494550704956,
-0.0761134997010231,
0.05074235796928406,
0.006026634480804205,
0.08728731423616409,
-0.04480172321200371,
-0.010498834773898125,
0.0413215272128582,
-0.028653105720877647,
-0.0764695256948471,
0.025065172463655472,
0.03700675070285797,
0.055837295949459076,
0.04663587361574173,
0.04479648172855377,
-0.006393906194716692,
-0.033171284943819046,
0.2805931866168976,
-0.058562975376844406,
-0.09600777179002762,
-0.11370038241147995,
0.20569975674152374,
0.040586069226264954,
-0.02944309078156948,
0.03807602450251579,
-0.08294986933469772,
-0.011472459882497787,
0.1553265005350113,
0.15553715825080872,
-0.061941374093294144,
-0.024198953062295914,
-0.011564145796000957,
-0.01683427393436432,
-0.039195410907268524,
0.11564141511917114,
0.09553518146276474,
0.001909524668008089,
-0.0536511056125164,
-0.026714632287621498,
-0.035639408975839615,
-0.014777567237615585,
-0.04167455434799194,
0.024376273155212402,
0.015621259808540344,
-0.022324202582240105,
-0.0337594710290432,
0.06279456615447998,
-0.00041767285438254476,
-0.24176448583602905,
0.061086978763341904,
-0.14368580281734467,
-0.1687733680009842,
-0.025401625782251358,
0.04890485480427742,
-0.00992540642619133,
0.050772298127412796,
-0.023376859724521637,
-0.00466610211879015,
0.08039263635873795,
-0.020244203507900238,
-0.056709565222263336,
-0.1255219429731369,
0.11210471391677856,
-0.05993294343352318,
0.18063822388648987,
-0.017082586884498596,
0.06872937083244324,
0.11798236519098282,
0.042911332100629807,
-0.14019882678985596,
0.046598922461271286,
0.046598825603723526,
-0.11393415182828903,
0.018825631588697433,
0.1436590552330017,
-0.045820023864507675,
0.08394169062376022,
0.04363667964935303,
-0.09631800651550293,
-0.00985125731676817,
-0.04690048471093178,
-0.026020633056759834,
-0.07113640755414963,
-0.011812569573521614,
-0.06844735890626907,
0.17021243274211884,
0.19743359088897705,
-0.02562999166548252,
0.013572380878031254,
-0.09399967640638351,
0.028515366837382317,
0.0681905746459961,
0.03770916536450386,
-0.05036025866866112,
-0.20847313106060028,
0.0192379392683506,
0.050537023693323135,
-0.0029716496355831623,
-0.2319372445344925,
-0.07808493077754974,
0.0399833545088768,
-0.03497854247689247,
-0.05615091696381569,
0.09571909159421921,
0.036007825285196304,
0.0479174368083477,
-0.03689778596162796,
-0.14981454610824585,
-0.03775971755385399,
0.1553279459476471,
-0.1791277527809143,
-0.04985203221440315
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-1024-finetuned-squad-seed-8
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
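The `few-shot-k-1024` and `seed-8` parts of the model name suggest the training set was a fixed subsample of 1024 SQuAD examples drawn with data seed 8 (the optimizer seed above stays at 42). The exact sampling procedure is not documented in this card, so the sketch below is only an assumed reconstruction using the `datasets` library.
```python
from datasets import load_dataset

# Assumed reconstruction of the few-shot split: k=1024 training
# examples sampled from SQuAD with data seed 8, as the checkpoint
# name suggests. The actual sampling code is not documented here.
squad = load_dataset("squad")
few_shot_train = squad["train"].shuffle(seed=8).select(range(1024))
print(few_shot_train)
```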
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-1024-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-1024-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-1024-finetuned-squad-seed-8
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-1024-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-1024-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.0788477435708046,
0.09874749183654785,
-0.0025827717036008835,
0.07803294062614441,
0.1329260915517807,
0.03475825488567352,
0.10673870891332626,
0.12214697897434235,
-0.12631137669086456,
0.0674501359462738,
0.09161190688610077,
0.0862944945693016,
0.03298954665660858,
0.12941649556159973,
-0.03939071297645569,
-0.2321317046880722,
0.007660465780645609,
-0.017204558476805687,
-0.06445813179016113,
0.10158168524503708,
0.08855926990509033,
-0.09773233532905579,
0.0807623565196991,
-0.003468698589131236,
-0.18561968207359314,
0.02794070914387703,
-0.020440781489014626,
-0.049408890306949615,
0.09896913915872574,
-0.005088993813842535,
0.08267581462860107,
0.006168887950479984,
0.12011684477329254,
-0.19135180115699768,
0.0133528346195817,
0.07314825057983398,
0.0326194129884243,
0.09492836892604828,
0.020532840862870216,
-0.0020381449721753597,
0.16477730870246887,
-0.13420411944389343,
0.0992715060710907,
0.029181156307458878,
-0.08704214543104172,
-0.1689000129699707,
-0.09662237763404846,
0.015683284029364586,
0.03692193701863289,
0.08728374540805817,
0.008898542262613773,
0.17795392870903015,
-0.09171251207590103,
0.08101323992013931,
0.23914757370948792,
-0.2781316637992859,
-0.07850367575883865,
0.044266704469919205,
0.05009093135595322,
0.07993417978286743,
-0.11831086128950119,
-0.016225306317210197,
0.02130574733018875,
0.02973584085702896,
0.09204764664173126,
-0.025890791788697243,
-0.08741962164640427,
-0.007446248549968004,
-0.12007807195186615,
0.004855351522564888,
0.11263331770896912,
0.04526689648628235,
-0.050514496862888336,
-0.048749327659606934,
-0.058965958654880524,
-0.07343024015426636,
-0.034948863089084625,
-0.03503921627998352,
0.04362070560455322,
-0.05659949779510498,
-0.1181129664182663,
-0.033702217042446136,
-0.04782206192612648,
-0.07748883217573166,
-0.02002144046127796,
0.20396563410758972,
0.04903654381632805,
0.03321133181452751,
-0.05076614022254944,
0.08142726123332977,
0.009140056557953358,
-0.12868723273277283,
-0.029095256701111794,
0.003828716930001974,
-0.0805981457233429,
-0.0393165722489357,
-0.05597672611474991,
0.015540230087935925,
0.042458005249500275,
0.21280239522457123,
-0.05320827662944794,
0.08396938443183899,
0.03184402734041214,
-0.019493594765663147,
-0.02120467834174633,
0.12497565895318985,
-0.02010856568813324,
-0.07896187156438828,
0.028606737032532692,
0.057649169117212296,
0.02522038109600544,
0.0021864569280296564,
-0.05745413154363632,
-0.0325099378824234,
0.08400581032037735,
0.032015345990657806,
-0.061647966504096985,
0.025256995111703873,
0.0018544035265222192,
-0.019282257184386253,
0.006807961966842413,
-0.11429063975811005,
0.017800476402044296,
-0.007042379584163427,
-0.07383071631193161,
-0.015067007392644882,
0.009989583864808083,
-0.015182310715317726,
0.010561949573457241,
0.09817496687173843,
-0.08899553120136261,
-0.02095624804496765,
-0.0763622522354126,
-0.07052211463451385,
-0.0009272310417145491,
-0.1491001844406128,
0.013849182985723019,
-0.07322296500205994,
-0.15555936098098755,
-0.03422931209206581,
0.04233555868268013,
-0.07109656184911728,
-0.028549516573548317,
-0.03841940313577652,
-0.07686097919940948,
0.025496216490864754,
0.0033539212308824062,
0.18849802017211914,
-0.05265706405043602,
0.07918931543827057,
0.02002781257033348,
0.05079328641295433,
-0.023534296080470085,
0.03120042011141777,
-0.08705678582191467,
0.004215538036078215,
-0.16861236095428467,
0.060996800661087036,
-0.0760820209980011,
0.018274223431944847,
-0.12843157351016998,
-0.08619680255651474,
-0.0253300741314888,
-0.027978233993053436,
0.08328187465667725,
0.10030945390462875,
-0.1395987570285797,
-0.025858471170067787,
0.1132524311542511,
-0.07770803570747375,
-0.057821039110422134,
0.06682613492012024,
-0.0674605742096901,
0.05203895643353462,
0.05750390142202377,
0.1829068660736084,
0.07340425252914429,
-0.11590076237916946,
-0.021382033824920654,
0.0019391898531466722,
0.02931755781173706,
-0.012373732402920723,
0.04992020130157471,
0.010259607806801796,
0.02308143675327301,
0.016441047191619873,
-0.03677910566329956,
0.002879892010241747,
-0.09750735014677048,
-0.062432412058115005,
-0.05804600939154625,
-0.08210718631744385,
-0.03752776235342026,
0.013699358329176903,
0.037542909383773804,
-0.08116606622934341,
-0.08348053693771362,
0.08281989395618439,
0.14143267273902893,
-0.04267947003245354,
0.021035365760326385,
-0.0723952203989029,
0.019978441298007965,
-0.053469642996788025,
-0.03114156797528267,
-0.20486052334308624,
-0.06539332121610641,
0.033028390258550644,
-0.025823917239904404,
0.05588797479867935,
0.00606633210554719,
0.07182866334915161,
0.0374307781457901,
-0.0367109477519989,
0.006413315422832966,
-0.08886074274778366,
-0.006295406259596348,
-0.09542272239923477,
-0.22225643694400787,
-0.039068713784217834,
-0.0310573261231184,
0.12707379460334778,
-0.1649017632007599,
-0.008521904237568378,
-0.034422826021909714,
0.11939491331577301,
0.028915995731949806,
-0.06292065232992172,
-0.015709511935710907,
0.03054598532617092,
0.0019657504744827747,
-0.09288350492715836,
0.032407376915216446,
0.016990626230835915,
-0.07146691530942917,
-0.05930573120713234,
-0.11763735860586166,
0.005574549548327923,
0.07798702269792557,
0.06229938194155693,
-0.09827669709920883,
0.006065301597118378,
-0.0641942024230957,
-0.03329283744096756,
-0.055950138717889786,
0.0399942547082901,
0.17636710405349731,
0.005910596344619989,
0.10798900574445724,
-0.07909733802080154,
-0.07353495061397552,
0.021989336237311363,
0.005503733642399311,
0.04407428205013275,
0.0916265919804573,
0.11174028366804123,
-0.12607695162296295,
0.06493926793336868,
0.08258871734142303,
-0.061866581439971924,
0.12355971336364746,
-0.03858426958322525,
-0.07944390177726746,
-0.03486551344394684,
-0.018058929592370987,
-0.01437944546341896,
0.1363837867975235,
-0.05370122566819191,
0.0240446999669075,
0.029689494520425797,
0.03937177360057831,
0.02024945802986622,
-0.1529252529144287,
-0.0027967768255621195,
0.008656404912471771,
-0.042293086647987366,
-0.016177885234355927,
0.021329371258616447,
0.018653908744454384,
0.0973050519824028,
0.03363645821809769,
-0.017573298886418343,
-0.006749571301043034,
-0.004078228492289782,
-0.052570343017578125,
0.19055378437042236,
-0.09042137861251831,
-0.040197793394327164,
-0.07763804495334625,
-0.002001674845814705,
-0.03748088702559471,
-0.042400941252708435,
0.027822768315672874,
-0.08781997859477997,
-0.03871475160121918,
-0.07403404265642166,
-0.003444843227043748,
-0.047965507954359055,
0.025296969339251518,
0.03058822639286518,
0.004023936111479998,
0.06160326302051544,
-0.1350710242986679,
0.004513579420745373,
-0.07347474992275238,
-0.10571800917387009,
0.017002282664179802,
0.06415195763111115,
0.09097550809383392,
0.059628404676914215,
-0.02868855930864811,
0.02160600945353508,
-0.030601318925619125,
0.2536318600177765,
-0.055555786937475204,
-0.0005750462878495455,
0.10805761069059372,
0.023391535505652428,
0.04587757587432861,
0.09324193000793457,
0.03490735962986946,
-0.10283582657575607,
0.029787909239530563,
0.08471280336380005,
-0.03504220396280289,
-0.24011410772800446,
-0.005378181114792824,
-0.035280924290418625,
-0.11489317566156387,
0.0824776217341423,
0.050724953413009644,
-0.04190400615334511,
0.064642034471035,
0.009816144593060017,
0.024448592215776443,
-0.051135387271642685,
0.09279137849807739,
0.10036744922399521,
0.07294690608978271,
0.10228876024484634,
-0.0475630983710289,
-0.020221905782818794,
0.0690780058503151,
-0.005420295055955648,
0.29351410269737244,
-0.025514541193842888,
0.0684117004275322,
0.05299689620733261,
0.1388196051120758,
-0.022432012483477592,
0.037108127027750015,
0.006210753694176674,
-0.005185981746762991,
-0.02602616511285305,
-0.057318974286317825,
-0.02799103409051895,
-0.0009785351576283574,
-0.08003076165914536,
0.055904168635606766,
-0.06155721843242645,
0.0637412965297699,
0.019133534282445908,
0.2612374722957611,
-0.0016170523595064878,
-0.2806655466556549,
-0.0779598131775856,
-0.022146202623844147,
-0.03961458057165146,
-0.04529758542776108,
0.0123123899102211,
0.09866500645875931,
-0.10385487973690033,
0.050937872380018234,
-0.05585892125964165,
0.08171965181827545,
-0.027832936495542526,
-0.00421377457678318,
0.029690155759453773,
0.18371297419071198,
-0.015968572348356247,
0.04986977204680443,
-0.2003004401922226,
0.21562156081199646,
0.018478523939847946,
0.13280418515205383,
-0.04994985833764076,
0.008618691936135292,
0.023281648755073547,
0.00027818852686323225,
0.07606366276741028,
-0.004781847819685936,
-0.07528109848499298,
-0.12682193517684937,
-0.07434903085231781,
0.07993001490831375,
0.13977569341659546,
-0.014033795334398746,
0.10207563638687134,
-0.04924054816365242,
0.018846523016691208,
0.040953975170850754,
-0.06790005415678024,
-0.15673504769802094,
-0.09701173007488251,
-0.017518367618322372,
0.036452315747737885,
-0.09422564506530762,
-0.04638311639428139,
-0.07527434825897217,
-0.012131608091294765,
0.1157456561923027,
0.026850156486034393,
-0.019208932295441628,
-0.13699010014533997,
0.08751803636550903,
0.14925317466259003,
-0.07333194464445114,
0.024418437853455544,
-0.007222621701657772,
0.06410028785467148,
0.04318402707576752,
-0.09309528768062592,
0.04848342761397362,
-0.05771861970424652,
-0.16067799925804138,
-0.04703214019536972,
0.0907154455780983,
0.07264801859855652,
0.03995087370276451,
-0.0038118967786431313,
0.0506092794239521,
-0.021943381056189537,
-0.1002620980143547,
0.01252483855932951,
0.03733768314123154,
0.050236884504556656,
0.036411792039871216,
-0.08341002464294434,
0.06091758608818054,
-0.03249240294098854,
-0.005588397849351168,
0.1137382984161377,
0.24248924851417542,
-0.08964341133832932,
0.08476002514362335,
0.05714899301528931,
-0.06856410205364227,
-0.14178188145160675,
0.0637555941939354,
0.10407271236181259,
-0.0014090348267927766,
0.05725586414337158,
-0.1938856542110443,
0.14226531982421875,
0.11500643938779831,
-0.012153931893408298,
0.038489293307065964,
-0.27233415842056274,
-0.11800922453403473,
0.05989134684205055,
0.13243839144706726,
0.12274657934904099,
-0.13411276042461395,
-0.013364946469664574,
-0.018066972494125366,
-0.12750355899333954,
0.10578128695487976,
-0.11300302296876907,
0.13421815633773804,
-0.03404507040977478,
0.10836630314588547,
0.00477339094504714,
-0.03012019954621792,
0.10725041478872299,
0.05155828967690468,
0.09777958691120148,
-0.04274721071124077,
0.011599461548030376,
0.059307970106601715,
-0.04836873710155487,
0.013510160148143768,
-0.0714912861585617,
0.08309661597013474,
-0.12205720692873001,
-0.006927581038326025,
-0.078008271753788,
0.05024342983961105,
-0.03711957111954689,
-0.05199649557471275,
-0.05305707827210426,
0.035769373178482056,
0.055488184094429016,
-0.03690236061811447,
0.054474860429763794,
-0.0003605110105127096,
0.09272866696119308,
0.023919565603137016,
0.06756506860256195,
-0.0032531896140426397,
-0.04789679870009422,
0.019755670800805092,
-0.009466108866035938,
0.05994079262018204,
-0.13892152905464172,
0.0054094367660582066,
0.10613074153661728,
0.05193204805254936,
0.09760132431983948,
0.04350200667977333,
-0.04712708666920662,
0.012001554481685162,
0.0372493639588356,
-0.11287205666303635,
-0.10142184048891068,
0.04871909320354462,
-0.040989384055137634,
-0.13869619369506836,
0.04616706445813179,
0.11379214376211166,
-0.049267951399087906,
-0.0224833395332098,
-0.018858816474676132,
0.00587654672563076,
-0.021645450964570045,
0.18521326780319214,
0.04337342083454132,
0.04100431501865387,
-0.10239682346582413,
0.12907744944095612,
0.029903709888458252,
-0.020780105143785477,
0.05854310095310211,
0.08629349619150162,
-0.09501231461763382,
0.002180113922804594,
0.09521647542715073,
0.17592152953147888,
-0.07127823680639267,
-0.014873229898512363,
-0.10454479604959488,
-0.07061723619699478,
0.06139064207673073,
0.1589186042547226,
0.05696690455079079,
-0.018495410680770874,
-0.04983767122030258,
0.041257958859205246,
-0.14158612489700317,
0.061252981424331665,
0.03196365758776665,
0.07106827944517136,
-0.08770886808633804,
0.05885685980319977,
0.008188015781342983,
0.00518425926566124,
-0.016767125576734543,
0.01468848530203104,
-0.09309954941272736,
-0.029425084590911865,
-0.077218197286129,
0.010440973564982414,
-0.012353869155049324,
0.0162087120115757,
-0.010462849400937557,
-0.06803881376981735,
-0.06928158551454544,
0.036905042827129364,
-0.07711786776781082,
-0.05289449915289879,
0.012870410457253456,
0.0422929972410202,
-0.13257986307144165,
0.005694922525435686,
0.015217742882668972,
-0.08888446539640427,
0.08550813794136047,
0.0885268896818161,
0.026654133573174477,
0.03427280858159065,
-0.1309066116809845,
-0.03259309381246567,
0.014246788807213306,
0.00232084933668375,
0.06583035737276077,
-0.09426984190940857,
-0.004050717689096928,
-0.020831184461712837,
0.07683662325143814,
0.010098869912326336,
0.08059211075305939,
-0.13116179406642914,
0.008904542773962021,
-0.08489199727773666,
-0.044321753084659576,
-0.06620205193758011,
0.015577920712530613,
0.1014697328209877,
0.052955351769924164,
0.16279444098472595,
-0.07670027762651443,
0.018993590027093887,
-0.2091914415359497,
-0.02812574990093708,
-0.005755698774009943,
-0.05310884490609169,
-0.1354321986436844,
-0.04094310477375984,
0.07715609669685364,
-0.03859417140483856,
0.10144021362066269,
-0.02144559472799301,
0.06127215921878815,
0.03859877958893776,
-0.030751753598451614,
-0.06310815364122391,
-0.02900119125843048,
0.19651557505130768,
0.07783355563879013,
-0.015675364062190056,
0.10782550275325775,
-0.004633908625692129,
0.052943672984838486,
0.02950296550989151,
0.20297357439994812,
0.206537663936615,
0.0047024874947965145,
0.0700855702161789,
0.06148204952478409,
-0.08149174600839615,
-0.06680824607610703,
0.17944207787513733,
-0.026594536378979683,
0.07212284952402115,
-0.029444027692079544,
0.187968909740448,
0.11241555213928223,
-0.15040738880634308,
0.03145967423915863,
-0.03198906034231186,
-0.07729937136173248,
-0.14043621718883514,
0.003652291838079691,
-0.09697864204645157,
-0.11869587749242783,
0.04479619488120079,
-0.12010679394006729,
0.05634472519159317,
0.08261468261480331,
0.012506779283285141,
0.03363654762506485,
0.12605313956737518,
-0.024269025772809982,
0.005599082447588444,
0.03988198935985565,
0.007077632937580347,
-0.028493870049715042,
-0.04208867624402046,
-0.07678233832120895,
0.050452254712581635,
0.006217234302312136,
0.08772008121013641,
-0.04489525035023689,
-0.00915710348635912,
0.04173300415277481,
-0.028542565181851387,
-0.07676918059587479,
0.02515791356563568,
0.036397043615579605,
0.05572612211108208,
0.04550198093056679,
0.04507437348365784,
-0.006339129526168108,
-0.0332443043589592,
0.2798646092414856,
-0.05810375511646271,
-0.096404530107975,
-0.11343052238225937,
0.20486688613891602,
0.04080010578036308,
-0.029415877535939217,
0.037957966327667236,
-0.08303162455558777,
-0.010624325834214687,
0.15582755208015442,
0.15483656525611877,
-0.06287740916013718,
-0.024179331958293915,
-0.011234700679779053,
-0.016965901479125023,
-0.04002588242292404,
0.1160912960767746,
0.09579779952764511,
0.0006981123005971313,
-0.05272721126675606,
-0.02691478654742241,
-0.035780347883701324,
-0.014712967909872532,
-0.042904287576675415,
0.02360580675303936,
0.01565578393638134,
-0.02174801379442215,
-0.03330935910344124,
0.06241992488503456,
0.00011194615944987163,
-0.24271564185619354,
0.061096321791410446,
-0.14365264773368835,
-0.168901726603508,
-0.02549566887319088,
0.049111735075712204,
-0.009270327165722847,
0.05005626007914543,
-0.02327127754688263,
-0.004250696860253811,
0.08124497532844543,
-0.020626604557037354,
-0.056797608733177185,
-0.12498236447572708,
0.11171916872262955,
-0.059341203421354294,
0.18049277365207672,
-0.01704486832022667,
0.0689782127737999,
0.11772063374519348,
0.04310264810919762,
-0.13935531675815582,
0.04683578386902809,
0.04630478471517563,
-0.11337748914957047,
0.01879100501537323,
0.1424664556980133,
-0.045783787965774536,
0.08378108590841293,
0.044049546122550964,
-0.09580733627080917,
-0.01030724123120308,
-0.04773576930165291,
-0.026472236961126328,
-0.07086555659770966,
-0.012933239340782166,
-0.06814952939748764,
0.17039532959461212,
0.19711092114448547,
-0.025489242747426033,
0.013291622512042522,
-0.09417272359132767,
0.028039876371622086,
0.06851852685213089,
0.037935350090265274,
-0.05071847513318062,
-0.208385169506073,
0.019811542704701424,
0.05019861087203026,
-0.0027787326835095882,
-0.23131322860717773,
-0.07848234474658966,
0.039771053940057755,
-0.035099610686302185,
-0.05587464198470116,
0.09584721177816391,
0.036295074969530106,
0.04809622839093208,
-0.03697903826832771,
-0.15011775493621826,
-0.03820544481277466,
0.15545344352722168,
-0.17899608612060547,
-0.05008389800786972
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-128-finetuned-squad-seed-0

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed
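
As a minimal usage sketch, the checkpoint can be loaded with the standard `transformers` question-answering pipeline. The repository id below is the one shown in this card; the question and context are made-up examples.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for extractive question answering.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-0",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of roberta-base on the squad dataset.",
)
print(result["answer"], result["score"])
```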
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-128-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-128-finetuned-squad-seed-0

This model is a fine-tuned version of roberta-base on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-128-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-128-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-128-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08095823228359222,
0.0852503702044487,
-0.002478597220033407,
0.08370379358530045,
0.13653603196144104,
0.03219567984342575,
0.10040277987718582,
0.13986769318580627,
-0.1149863675236702,
0.04331861063838005,
0.08906418830156326,
0.08170413225889206,
0.0315076969563961,
0.13667702674865723,
-0.03383384644985199,
-0.25429388880729675,
-0.0055573927238583565,
-0.01906982809305191,
-0.09605054557323456,
0.10952239483594894,
0.09837917983531952,
-0.10029860585927963,
0.072549968957901,
-0.014115771278738976,
-0.17804011702537537,
0.016671529039740562,
-0.012697580270469189,
-0.052256107330322266,
0.11855349689722061,
-0.00900032464414835,
0.07657024264335632,
0.009751135483384132,
0.12193533778190613,
-0.19133062660694122,
0.01639603264629841,
0.07613987475633621,
0.04642181843519211,
0.09744089096784592,
0.008602033369243145,
-0.011816460639238358,
0.1323339343070984,
-0.12792237102985382,
0.09998342394828796,
0.032076455652713776,
-0.09528183937072754,
-0.20624785125255585,
-0.09538143873214722,
0.00823470763862133,
0.04330406337976456,
0.08529260754585266,
0.010156798176467419,
0.15995191037654877,
-0.09822437912225723,
0.08252694457769394,
0.23087851703166962,
-0.2740166187286377,
-0.0794408842921257,
0.05319635570049286,
0.05968784540891647,
0.08120010793209076,
-0.12685273587703705,
-0.009277819655835629,
0.006361129228025675,
0.02562081068754196,
0.1039217859506607,
-0.03182261437177658,
-0.08747614175081253,
0.002601564396172762,
-0.10461250692605972,
0.0057657030411064625,
0.10998530685901642,
0.03404351323843002,
-0.05263500288128853,
-0.07243534177541733,
-0.042769625782966614,
-0.05434262007474899,
-0.03434319794178009,
-0.01969732716679573,
0.038408972322940826,
-0.05957089737057686,
-0.1397208422422409,
-0.04809824004769325,
-0.049616869539022446,
-0.08978229761123657,
-0.0063020773231983185,
0.21865972876548767,
0.036064259707927704,
0.03096810169517994,
-0.05221787095069885,
0.10313740372657776,
0.013654351234436035,
-0.1277635097503662,
-0.030032966285943985,
-0.005357202608138323,
-0.09058087319135666,
-0.03594774752855301,
-0.05922117829322815,
0.019577305763959885,
0.035217877477407455,
0.21511228382587433,
-0.0433029904961586,
0.07785136252641678,
0.03261028230190277,
-0.017322368919849396,
-0.027273597195744514,
0.13863533735275269,
-0.021548008546233177,
-0.07378096878528595,
0.011756801046431065,
0.06334684044122696,
0.009661135263741016,
-0.005500479135662317,
-0.06254652887582779,
-0.04012309014797211,
0.06448879837989807,
0.04753383621573448,
-0.05954708158969879,
0.031022384762763977,
-0.006863785907626152,
-0.023707980290055275,
0.0012270172592252493,
-0.1149839237332344,
0.015538770705461502,
-0.00870039127767086,
-0.08201029896736145,
-0.04557465389370918,
0.008318141102790833,
-0.011314325965940952,
0.011262951418757439,
0.0974297747015953,
-0.07351227104663849,
-0.02468218095600605,
-0.07990054041147232,
-0.07519999891519547,
-0.01643318124115467,
-0.1564638465642929,
0.019107429310679436,
-0.06359101086854935,
-0.15938003361225128,
-0.030466653406620026,
0.056008946150541306,
-0.08094269782304764,
-0.027619250118732452,
-0.03244781494140625,
-0.07873354107141495,
0.02144445665180683,
0.0017616543918848038,
0.21539518237113953,
-0.049829430878162384,
0.08924830704927444,
0.008334198035299778,
0.05671153962612152,
-0.008405941538512707,
0.03628399223089218,
-0.08756622672080994,
0.008990226313471794,
-0.17665508389472961,
0.07539930939674377,
-0.07984726876020432,
0.017370112240314484,
-0.13992124795913696,
-0.08602981269359589,
-0.011312154121696949,
-0.02194075472652912,
0.0802985429763794,
0.10438566654920578,
-0.142584428191185,
-0.022303259000182152,
0.11679136753082275,
-0.06490267813205719,
-0.05628107860684395,
0.057226646691560745,
-0.07605793327093124,
0.08552774786949158,
0.05455183610320091,
0.19150251150131226,
0.0876104012131691,
-0.10707733780145645,
0.013314320705831051,
0.013192584738135338,
0.03466062247753143,
0.00342720584012568,
0.05016563832759857,
0.0050756074488162994,
0.02994326502084732,
0.01634981669485569,
-0.07197511941194534,
0.008280882611870766,
-0.09122844785451889,
-0.05960732325911522,
-0.04861357808113098,
-0.0837659016251564,
-0.006014547310769558,
0.013478567823767662,
0.03546413779258728,
-0.07994074374437332,
-0.08456949144601822,
0.07544022053480148,
0.13661707937717438,
-0.0476268008351326,
0.018381979316473007,
-0.07580608129501343,
-0.0026550055481493473,
-0.03061908483505249,
-0.023881638422608376,
-0.20368380844593048,
-0.057126909494400024,
0.03230473771691322,
-0.00738324411213398,
0.04519279673695564,
0.0006871952209621668,
0.08454346656799316,
0.02621583454310894,
-0.054867688566446304,
-0.0006956983706913888,
-0.08664927631616592,
-0.007071345113217831,
-0.09379097819328308,
-0.22039794921875,
-0.05268796160817146,
-0.03856198862195015,
0.1365993320941925,
-0.16605034470558167,
-0.003546495223417878,
-0.018034610897302628,
0.11621010303497314,
0.04326123744249344,
-0.05195290595293045,
-0.0045325844548642635,
0.029146384447813034,
0.012388844974339008,
-0.09764739871025085,
0.03818521276116371,
0.016307853162288666,
-0.09428181499242783,
-0.027953360229730606,
-0.10361451655626297,
-0.010292758233845234,
0.07104083895683289,
0.06985977292060852,
-0.10235398262739182,
-0.014747144654393196,
-0.06288262456655502,
-0.028446635231375694,
-0.05442212522029877,
0.038782838732004166,
0.18155723810195923,
0.018718171864748,
0.1096113994717598,
-0.07467129826545715,
-0.08310586214065552,
0.01899777539074421,
0.008759203366935253,
0.061090774834156036,
0.10229378193616867,
0.07492205500602722,
-0.10942256450653076,
0.058127231895923615,
0.09146293252706528,
-0.052153266966342926,
0.13778063654899597,
-0.04670077562332153,
-0.07328125834465027,
-0.03144922852516174,
-0.00887386966496706,
-0.006462298333644867,
0.15071037411689758,
-0.03723939135670662,
0.017590006813406944,
0.034319981932640076,
0.03753789886832237,
0.004545844160020351,
-0.161299467086792,
-0.015177879482507706,
0.014783020131289959,
-0.047942742705345154,
-0.02313872240483761,
0.01543796993792057,
0.01641768217086792,
0.095157690346241,
0.04206511378288269,
-0.0033865810837596655,
0.007083449512720108,
-0.010120537132024765,
-0.04305266961455345,
0.20053794980049133,
-0.09128046035766602,
-0.04423923045396805,
-0.07586602121591568,
-0.000568892399314791,
-0.02370358258485794,
-0.04024490714073181,
0.015841709449887276,
-0.0948447659611702,
-0.025517968460917473,
-0.07208289206027985,
-0.00016685928858350962,
-0.04109245911240578,
0.015024910680949688,
0.0031516142189502716,
0.014564864337444305,
0.05890187621116638,
-0.13236182928085327,
0.010078764520585537,
-0.0669173151254654,
-0.11026014387607574,
0.02848133258521557,
0.060453400015830994,
0.08059130609035492,
0.057913556694984436,
-0.032633502036333084,
0.019088292494416237,
-0.04212802276015282,
0.23077392578125,
-0.07851270586252213,
0.011940324679017067,
0.12324845790863037,
0.027101047337055206,
0.03953488543629646,
0.10454342514276505,
0.033322207629680634,
-0.0993834137916565,
0.03909729793667793,
0.07743887603282928,
-0.041357140988111496,
-0.24464742839336395,
0.0077258930541574955,
-0.03883965685963631,
-0.09445704519748688,
0.08902652561664581,
0.050784170627593994,
-0.0423780120909214,
0.06417225301265717,
0.009767544455826283,
0.010278169997036457,
-0.023704467341303825,
0.08875417709350586,
0.09020736068487167,
0.06663074344396591,
0.10733974725008011,
-0.0384654738008976,
-0.01694338768720627,
0.06422758847475052,
0.028278814628720284,
0.30333587527275085,
-0.04957485571503639,
0.08711323887109756,
0.049211811274290085,
0.1395425945520401,
-0.021617397665977478,
0.04508736729621887,
0.0077449544332921505,
-0.006143317092210054,
-0.029535649344325066,
-0.05429606884717941,
-0.02041984722018242,
0.0020403617527335882,
-0.07774460315704346,
0.04575258120894432,
-0.05243568867444992,
0.04983409494161606,
0.019865624606609344,
0.2913056015968323,
0.0016796966083347797,
-0.26306575536727905,
-0.09510429203510284,
-0.01572701334953308,
-0.03698642551898956,
-0.05263872444629669,
0.011812643148005009,
0.12151811271905899,
-0.12400896102190018,
0.03698783740401268,
-0.07146015018224716,
0.08059949427843094,
-0.029708176851272583,
-0.0020154512021690607,
0.04259783774614334,
0.1713908165693283,
-0.022030629217624664,
0.05531204864382744,
-0.22390080988407135,
0.22865378856658936,
0.014392255805432796,
0.12873980402946472,
-0.05900200456380844,
0.008759508840739727,
0.026431048288941383,
0.004178130067884922,
0.08851506561040878,
-0.00450399424880743,
-0.0644308477640152,
-0.13581861555576324,
-0.052247971296310425,
0.07353987544775009,
0.14238326251506805,
-0.04222971573472023,
0.09688957035541534,
-0.05814274773001671,
0.014881191775202751,
0.037001486867666245,
-0.0879499688744545,
-0.13331928849220276,
-0.0985792726278305,
-0.02204338274896145,
0.015399351716041565,
-0.06953247636556625,
-0.057013142853975296,
-0.06891966611146927,
0.03110099770128727,
0.1049962267279625,
0.013779657892882824,
-0.031415704637765884,
-0.1477304995059967,
0.07736589014530182,
0.15617778897285461,
-0.06713826209306717,
0.02596461772918701,
-0.003938506823033094,
0.07528161257505417,
0.03835450857877731,
-0.0826021283864975,
0.060412630438804626,
-0.06466817855834961,
-0.17910973727703094,
-0.04872662201523781,
0.09425555914640427,
0.07004373520612717,
0.04183795675635338,
-0.0035818787291646004,
0.05249081179499626,
-0.025966696441173553,
-0.09308037906885147,
0.02405882440507412,
0.02137126959860325,
0.03668910637497902,
0.036263227462768555,
-0.08473791927099228,
0.07607156783342361,
-0.03818168863654137,
-0.014459338039159775,
0.11402036994695663,
0.23188956081867218,
-0.10134202986955643,
0.09859848022460938,
0.06175022944808006,
-0.06121717020869255,
-0.16073216497898102,
0.07137687504291534,
0.10298056155443192,
0.007597919087857008,
0.07035559415817261,
-0.2106252759695053,
0.12813325226306915,
0.10327299684286118,
-0.01649526320397854,
0.04077823832631111,
-0.27267563343048096,
-0.12035831063985825,
0.044819388538599014,
0.12851905822753906,
0.09442409127950668,
-0.12538081407546997,
-0.013945778831839561,
-0.016130661591887474,
-0.11606726795434952,
0.09493045508861542,
-0.11325689405202866,
0.13565649092197418,
-0.028716688975691795,
0.11081702262163162,
0.011080297641456127,
-0.026294857263565063,
0.10320524126291275,
0.04636731371283531,
0.10212793201208115,
-0.042454179376363754,
0.0039532920345664024,
0.06309807300567627,
-0.04868278279900551,
0.0027359493542462587,
-0.07619061321020126,
0.08765789866447449,
-0.13158512115478516,
-0.003410520264878869,
-0.09181852638721466,
0.046593427658081055,
-0.03871740400791168,
-0.06928528845310211,
-0.04224119707942009,
0.05715607851743698,
0.04894071817398071,
-0.03579717501997948,
0.04852161929011345,
-0.01932714134454727,
0.10401628911495209,
0.03695176914334297,
0.08637471497058868,
0.013925721868872643,
-0.04607369005680084,
0.023602476343512535,
-0.010158000513911247,
0.06457143276929855,
-0.1650364100933075,
0.011646240018308163,
0.09984758496284485,
0.060739196836948395,
0.09762772917747498,
0.04399627074599266,
-0.04551352933049202,
0.017362361773848534,
0.030052348971366882,
-0.10147471725940704,
-0.10529332607984543,
0.04666883498430252,
-0.02841067686676979,
-0.13962435722351074,
0.04910619929432869,
0.12123649567365646,
-0.04412786662578583,
-0.026923341676592827,
-0.016769476234912872,
0.0043762424029409885,
-0.02338845282793045,
0.1818738579750061,
0.04929890111088753,
0.05533160641789436,
-0.10168193280696869,
0.12871797382831573,
0.029161937534809113,
-0.021933823823928833,
0.052248161286115646,
0.08272708207368851,
-0.10293660312891006,
-0.0029495309572666883,
0.08251453191041946,
0.13715827465057373,
-0.05848625674843788,
-0.006042656023055315,
-0.10440012812614441,
-0.08331286162137985,
0.05051104724407196,
0.14396649599075317,
0.04843546077609062,
-0.015678133815526962,
-0.05397672578692436,
0.03999381884932518,
-0.14141549170017242,
0.07155430316925049,
0.024843472987413406,
0.06333314627408981,
-0.07643458992242813,
0.060966894030570984,
0.007595478091388941,
0.012818891555070877,
-0.01660645753145218,
0.007630022708326578,
-0.09344477206468582,
-0.015684930607676506,
-0.08032657206058502,
-0.0023927611764520407,
-0.00043427638593129814,
0.016741523519158363,
-0.020739661529660225,
-0.07109616696834564,
-0.04842740669846535,
0.036737456917762756,
-0.0875508189201355,
-0.04996496066451073,
0.009295403957366943,
0.04104652628302574,
-0.12388423085212708,
-0.005305162630975246,
0.02182283066213131,
-0.09319493174552917,
0.09487314522266388,
0.07326332479715347,
0.016644945368170738,
0.030086608603596687,
-0.12482309341430664,
-0.033201996237039566,
-0.010821599513292313,
-0.00879092887043953,
0.06188134476542473,
-0.09510772675275803,
-0.008890880271792412,
-0.038043178617954254,
0.06944078207015991,
0.01351337693631649,
0.0688156709074974,
-0.13427932560443878,
0.019245052710175514,
-0.07465685158967972,
-0.04651060700416565,
-0.07517395913600922,
0.03585714474320412,
0.09772498160600662,
0.05966666713356972,
0.1511869579553604,
-0.07693839073181152,
0.02400226891040802,
-0.2058669626712799,
-0.03469616919755936,
-0.006113504525274038,
-0.060908228158950806,
-0.15145350992679596,
-0.047067031264305115,
0.0811309888958931,
-0.03781253099441528,
0.09137352555990219,
-0.01846296340227127,
0.07636621594429016,
0.038264378905296326,
-0.048044655472040176,
-0.05174947530031204,
-0.015910066664218903,
0.1995052993297577,
0.07288957387208939,
-0.01652732491493225,
0.11152162402868271,
0.0013617894146591425,
0.029438909143209457,
0.052794020622968674,
0.18159228563308716,
0.21378140151500702,
0.03279925882816315,
0.0554632768034935,
0.06356524676084518,
-0.07464095205068588,
-0.07097627967596054,
0.1782524734735489,
-0.015132625587284565,
0.07095585763454437,
-0.04905132204294205,
0.19711408019065857,
0.1090121865272522,
-0.1679404228925705,
0.04597318544983864,
-0.04340926557779312,
-0.08064094185829163,
-0.12563632428646088,
-0.012409058399498463,
-0.08549658209085464,
-0.12586553394794464,
0.03741913288831711,
-0.11669890582561493,
0.05339465290307999,
0.10847759991884232,
0.013587219640612602,
0.036781784147024155,
0.12704282999038696,
-0.020726744085550308,
0.002262219786643982,
0.06488431990146637,
0.00550650293007493,
-0.01296313013881445,
-0.03749856352806091,
-0.07930111885070801,
0.04977920651435852,
0.0015500931767746806,
0.07851526886224747,
-0.046550873667001724,
-0.015125835314393044,
0.026483403518795967,
-0.027776362374424934,
-0.08056288212537766,
0.027689212933182716,
0.04100954905152321,
0.057035256177186966,
0.049449384212493896,
0.04607293754816055,
-0.008641136810183525,
-0.0331542044878006,
0.32023948431015015,
-0.06681720167398453,
-0.10049169510602951,
-0.12236709147691727,
0.22080953419208527,
0.030223077163100243,
-0.031145691871643066,
0.03338417783379555,
-0.08338841795921326,
-0.010287458077073097,
0.15907816588878632,
0.16706517338752747,
-0.0711793303489685,
-0.022459857165813446,
-0.00555562786757946,
-0.017750250175595284,
-0.03692977502942085,
0.1266910284757614,
0.08883703500032425,
-0.023500697687268257,
-0.06269393861293793,
-0.012493819929659367,
-0.018938560038805008,
-0.031691618263721466,
-0.0399896539747715,
0.04233792796730995,
0.014681038446724415,
-0.025823522359132767,
-0.0423959344625473,
0.07395746558904648,
0.004233479965478182,
-0.2577477693557739,
0.06770122796297073,
-0.15843477845191956,
-0.17050766944885254,
-0.04391898587346077,
0.03659236803650856,
-0.0008409243891946971,
0.05695297196507454,
-0.01644243113696575,
0.00857994519174099,
0.07899037003517151,
-0.017058394849300385,
-0.031080879271030426,
-0.12505985796451569,
0.12387683987617493,
-0.06383154541254044,
0.1719907969236374,
-0.02869318053126335,
0.04842078313231468,
0.11582093685865402,
0.029199903830885887,
-0.13581791520118713,
0.0414654016494751,
0.05448148399591446,
-0.10574173182249069,
0.015234604477882385,
0.15180076658725739,
-0.04647952318191528,
0.0945357233285904,
0.042757242918014526,
-0.10840967297554016,
0.005115070380270481,
-0.057209793478250504,
-0.03401236608624458,
-0.08099089562892914,
-0.013946090824902058,
-0.05925055220723152,
0.16887108981609344,
0.21919117867946625,
-0.030093105509877205,
0.011119619943201542,
-0.10189121961593628,
0.016565322875976562,
0.07000552862882614,
0.027987420558929443,
-0.057372383773326874,
-0.18844082951545715,
0.011936072260141373,
0.06828322261571884,
-0.007183293346315622,
-0.24170345067977905,
-0.07593516260385513,
0.04023966193199158,
-0.03626246005296707,
-0.040022414177656174,
0.10365518927574158,
0.04013961926102638,
0.05204091966152191,
-0.03027372620999813,
-0.15854360163211823,
-0.0310093741863966,
0.15423890948295593,
-0.17482605576515198,
-0.03625543788075447
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-128-finetuned-squad-seed-10

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
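
Note that this card reports a fixed step budget rather than an epoch count. In the `transformers` API that corresponds to `max_steps`, as in the hedged sketch below; the `output_dir` name is assumed and the actual training script is not included here.

```python
from transformers import TrainingArguments

# "training_steps: 200" above maps to max_steps, which, when set,
# overrides any epoch-based stopping criterion.
training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-128-finetuned-squad-seed-10",  # assumed name
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```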
### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-128-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-128-finetuned-squad-seed-10

This model is a fine-tuned version of roberta-base on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-128-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-128-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-128-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08136138319969177,
0.08274868130683899,
-0.002413117792457342,
0.08361946791410446,
0.13672661781311035,
0.03251417353749275,
0.10084348171949387,
0.13830195367336273,
-0.11379057914018631,
0.04250169172883034,
0.08806810528039932,
0.08288288861513138,
0.03181825205683708,
0.13705673813819885,
-0.033244431018829346,
-0.25504064559936523,
-0.005587031599134207,
-0.01875055395066738,
-0.09472031891345978,
0.10939588397741318,
0.09858385473489761,
-0.10073868185281754,
0.07218112796545029,
-0.014091232791543007,
-0.17821873724460602,
0.01688745804131031,
-0.012415391393005848,
-0.05115538090467453,
0.11885421723127365,
-0.010624135844409466,
0.07635244727134705,
0.009239030070602894,
0.12391789257526398,
-0.1896999180316925,
0.016437392681837082,
0.0756903737783432,
0.0464383140206337,
0.09698348492383957,
0.008293251506984234,
-0.010720590129494667,
0.13206207752227783,
-0.1286015808582306,
0.10034335404634476,
0.03138351067900658,
-0.0953255370259285,
-0.20502769947052002,
-0.095509372651577,
0.007672055624425411,
0.044757626950740814,
0.08430822193622589,
0.009866621345281601,
0.1601210981607437,
-0.09851747751235962,
0.08271186798810959,
0.2316460758447647,
-0.27531641721725464,
-0.07939004898071289,
0.05493041127920151,
0.061307329684495926,
0.08208290487527847,
-0.12651115655899048,
-0.010042914189398289,
0.006811671890318394,
0.02542901784181595,
0.1047973781824112,
-0.032267920672893524,
-0.08739472180604935,
0.0023070513270795345,
-0.10599867254495621,
0.007485856767743826,
0.10906953364610672,
0.03437679260969162,
-0.052339162677526474,
-0.07229606807231903,
-0.04355653002858162,
-0.05451027303934097,
-0.03558994457125664,
-0.020641667768359184,
0.038873061537742615,
-0.059744007885456085,
-0.13844694197177887,
-0.04777944087982178,
-0.04870585352182388,
-0.09025370329618454,
-0.006224858108907938,
0.21797394752502441,
0.03642785921692848,
0.030717749148607254,
-0.052055079489946365,
0.10228107124567032,
0.010801447555422783,
-0.12772926688194275,
-0.02883133664727211,
-0.005054600071161985,
-0.09174267202615738,
-0.03688443824648857,
-0.05979686975479126,
0.019352249801158905,
0.034479301422834396,
0.21494139730930328,
-0.040929265320301056,
0.07814819365739822,
0.03411456197500229,
-0.01758589595556259,
-0.02714073657989502,
0.1388525515794754,
-0.021895339712500572,
-0.07554920762777328,
0.011706176213920116,
0.06369058787822723,
0.00998377613723278,
-0.004569927230477333,
-0.06330785900354385,
-0.04026985913515091,
0.06499031186103821,
0.04713743180036545,
-0.06147978454828262,
0.031321652233600616,
-0.006476237438619137,
-0.02336227335035801,
0.0023677607532590628,
-0.11554437130689621,
0.01619100011885166,
-0.009203549474477768,
-0.0827273577451706,
-0.04510030895471573,
0.006851537618786097,
-0.010472445748746395,
0.011682385578751564,
0.09684080630540848,
-0.07374944537878036,
-0.024160509929060936,
-0.08059658110141754,
-0.07620645314455032,
-0.016381924971938133,
-0.15873374044895172,
0.019581496715545654,
-0.06248875707387924,
-0.16159644722938538,
-0.030920173972845078,
0.05532504990696907,
-0.0804520845413208,
-0.027534890919923782,
-0.033560242503881454,
-0.07889936119318008,
0.020262937992811203,
0.0025876611471176147,
0.21670924127101898,
-0.04934156686067581,
0.08803420513868332,
0.008269852958619595,
0.05809815227985382,
-0.009272602386772633,
0.03633994981646538,
-0.08681482076644897,
0.00890921801328659,
-0.1775384396314621,
0.07496356964111328,
-0.0799441859126091,
0.01917969435453415,
-0.1391465812921524,
-0.08575311303138733,
-0.012006823904812336,
-0.021584782749414444,
0.08026260882616043,
0.10405522584915161,
-0.14527137577533722,
-0.021262871101498604,
0.11738644540309906,
-0.06480128318071365,
-0.05543096736073494,
0.05601724237203598,
-0.07631008327007294,
0.08615382760763168,
0.055948518216609955,
0.191459521651268,
0.08753730356693268,
-0.10680587589740753,
0.013528046198189259,
0.013190699741244316,
0.0342412032186985,
0.0015559401363134384,
0.04962844401597977,
0.005912043619900942,
0.031016385182738304,
0.016308056190609932,
-0.070571169257164,
0.007621516939252615,
-0.09090153127908707,
-0.05952886492013931,
-0.04895945265889168,
-0.08460160344839096,
-0.006314801052212715,
0.013517247512936592,
0.03576068952679634,
-0.0804046094417572,
-0.0839066356420517,
0.07681199908256531,
0.13650310039520264,
-0.04744613915681839,
0.017698684707283974,
-0.07553120702505112,
-0.003106052055954933,
-0.031075546517968178,
-0.02405659481883049,
-0.20391406118869781,
-0.05621521547436714,
0.032654792070388794,
-0.008561506867408752,
0.04531129449605942,
0.0031101058702915907,
0.08512113988399506,
0.02605542540550232,
-0.05418434739112854,
-0.0003222624945919961,
-0.08595762401819229,
-0.006937407422810793,
-0.09540990740060806,
-0.2199777513742447,
-0.05306578800082207,
-0.03870250657200813,
0.13569873571395874,
-0.1663251668214798,
-0.003260185942053795,
-0.020210007205605507,
0.11592700332403183,
0.04248986020684242,
-0.05167526751756668,
-0.0047813705168664455,
0.028791479766368866,
0.013157297857105732,
-0.09778349846601486,
0.03838646784424782,
0.01516887079924345,
-0.09339305758476257,
-0.029720086604356766,
-0.10468502342700958,
-0.011946658603847027,
0.07056394219398499,
0.07044614106416702,
-0.10207076370716095,
-0.013827095739543438,
-0.06252526491880417,
-0.027790691703557968,
-0.05333998426795006,
0.03843402490019798,
0.18034754693508148,
0.01811290718615055,
0.10917336493730545,
-0.07489129155874252,
-0.08386705070734024,
0.018847482278943062,
0.009574277326464653,
0.062290310859680176,
0.1020030826330185,
0.0757269486784935,
-0.10925256460905075,
0.057462178170681,
0.0933358371257782,
-0.05233778432011604,
0.1363227665424347,
-0.046797726303339005,
-0.07356557995080948,
-0.03137887641787529,
-0.00994966272264719,
-0.007373030297458172,
0.1503981649875641,
-0.037984851747751236,
0.01615222543478012,
0.033616699278354645,
0.037099260836839676,
0.004342611879110336,
-0.16202601790428162,
-0.014737778343260288,
0.014181491918861866,
-0.04720836132764816,
-0.02384069189429283,
0.015337248332798481,
0.015604358166456223,
0.09513881802558899,
0.04120736941695213,
-0.004615143407136202,
0.006373300217092037,
-0.010246001183986664,
-0.0424414686858654,
0.20065699517726898,
-0.09091050177812576,
-0.04194079712033272,
-0.07420198619365692,
-0.0009864873718470335,
-0.022828437387943268,
-0.040450651198625565,
0.015000776387751102,
-0.09491436183452606,
-0.025634994730353355,
-0.07192624360322952,
-0.0006869949284009635,
-0.04056943207979202,
0.015105812810361385,
0.0044341618195176125,
0.014395400881767273,
0.05792495980858803,
-0.1325955092906952,
0.0099378926679492,
-0.0680755004286766,
-0.11051038652658463,
0.027816712856292725,
0.060539960861206055,
0.08004540950059891,
0.059296734631061554,
-0.03270459920167923,
0.01910031959414482,
-0.04173143580555916,
0.2314155399799347,
-0.07818403840065002,
0.013576720841228962,
0.12311871349811554,
0.027294861152768135,
0.03961695358157158,
0.10549671947956085,
0.03337223455309868,
-0.09975527971982956,
0.038698505610227585,
0.07696481794118881,
-0.040744829922914505,
-0.24487042427062988,
0.007671289145946503,
-0.038462862372398376,
-0.09658867120742798,
0.08875170350074768,
0.05046717822551727,
-0.04216613247990608,
0.06512273848056793,
0.010511981323361397,
0.011252688243985176,
-0.02378501184284687,
0.08809782564640045,
0.08903464674949646,
0.06615845859050751,
0.10767868161201477,
-0.038873787969350815,
-0.01788594201207161,
0.06310216337442398,
0.027996646240353584,
0.3039777874946594,
-0.04791533574461937,
0.08596964925527573,
0.048798657953739166,
0.13881686329841614,
-0.021937374025583267,
0.047039344906806946,
0.00803367793560028,
-0.007008370477706194,
-0.029855867847800255,
-0.054186221212148666,
-0.018728259950876236,
0.0021413867361843586,
-0.0782441571354866,
0.04571637883782387,
-0.051647886633872986,
0.04937337711453438,
0.019498925656080246,
0.2896275222301483,
0.00310305692255497,
-0.2634669840335846,
-0.09374789893627167,
-0.015444695949554443,
-0.0380854494869709,
-0.052039485424757004,
0.011612205766141415,
0.11986389756202698,
-0.12329290807247162,
0.03793516755104065,
-0.07132340967655182,
0.08076813817024231,
-0.028902221471071243,
-0.001465434324927628,
0.04172397404909134,
0.1728760451078415,
-0.021752336993813515,
0.05532163009047508,
-0.2242121547460556,
0.22802262008190155,
0.014559837989509106,
0.12954701483249664,
-0.059700366109609604,
0.008336193859577179,
0.02657352201640606,
0.003916772548109293,
0.08858856558799744,
-0.004498685244470835,
-0.06527900695800781,
-0.13577882945537567,
-0.05214359983801842,
0.07354793697595596,
0.1430199146270752,
-0.0405765064060688,
0.09712395071983337,
-0.057475533336400986,
0.014069855213165283,
0.0374179482460022,
-0.08882810920476913,
-0.1345830112695694,
-0.09760195761919022,
-0.022732073441147804,
0.014730810187757015,
-0.0711069330573082,
-0.05629372596740723,
-0.06901716440916061,
0.028638619929552078,
0.10338177531957626,
0.01593562588095665,
-0.031387317925691605,
-0.1470964103937149,
0.07756661623716354,
0.15660536289215088,
-0.06690123677253723,
0.026729729026556015,
-0.0034123060759156942,
0.07542634010314941,
0.03838157281279564,
-0.08255929499864578,
0.06062404438853264,
-0.06457819789648056,
-0.17836590111255646,
-0.048194192349910736,
0.09506811201572418,
0.07062337547540665,
0.04187173768877983,
-0.0024398821406066418,
0.05261203646659851,
-0.026297559961676598,
-0.09330984950065613,
0.023374799638986588,
0.021362442523241043,
0.03579393029212952,
0.036316972225904465,
-0.08570144325494766,
0.07492458820343018,
-0.03798452392220497,
-0.012794366106390953,
0.11361829191446304,
0.22976264357566833,
-0.10124022513628006,
0.09789448231458664,
0.06115515157580376,
-0.06140134483575821,
-0.16044972836971283,
0.07234013825654984,
0.10301761329174042,
0.007702952716499567,
0.07083037495613098,
-0.209148570895195,
0.1294567734003067,
0.10321428626775742,
-0.016070643439888954,
0.040425486862659454,
-0.2731599509716034,
-0.12024188786745071,
0.04534771665930748,
0.12909169495105743,
0.09518931806087494,
-0.1250169277191162,
-0.013549371622502804,
-0.017243262380361557,
-0.11600102484226227,
0.0951790139079094,
-0.11429838091135025,
0.13535526394844055,
-0.028637805953621864,
0.11048349738121033,
0.01086809579282999,
-0.026430461555719376,
0.1018804982304573,
0.048376213759183884,
0.10272979736328125,
-0.042717330157756805,
0.0029387930408120155,
0.06468772888183594,
-0.04814442619681358,
0.003522759536281228,
-0.07554860413074493,
0.08708541840314865,
-0.13031615316867828,
-0.0033029592595994473,
-0.09127399325370789,
0.0459546223282814,
-0.03839462250471115,
-0.06886868923902512,
-0.04210793599486351,
0.05679035931825638,
0.0483221635222435,
-0.03584905341267586,
0.047048117965459824,
-0.018147967755794525,
0.10341579467058182,
0.033650897443294525,
0.08717432618141174,
0.013074728660285473,
-0.044797156006097794,
0.0241396427154541,
-0.00983980018645525,
0.06368503719568253,
-0.16553007066249847,
0.011121327057480812,
0.09998676180839539,
0.06073905900120735,
0.09734933823347092,
0.04404287412762642,
-0.045499298721551895,
0.01670643500983715,
0.02985098399221897,
-0.1001780703663826,
-0.1067822203040123,
0.047614935785532,
-0.02943071722984314,
-0.1395198553800583,
0.050909895449876785,
0.11967428028583527,
-0.04423443600535393,
-0.027509981766343117,
-0.01754351146519184,
0.0040188138373196125,
-0.023476392030715942,
0.1831013262271881,
0.050091713666915894,
0.055143047124147415,
-0.10227568447589874,
0.1279734969139099,
0.028847938403487206,
-0.020383434370160103,
0.05142940580844879,
0.08358006924390793,
-0.10371987521648407,
-0.0029243240132927895,
0.08401773124933243,
0.13889065384864807,
-0.056911177933216095,
-0.005755973514169455,
-0.10443279892206192,
-0.08380689471960068,
0.05059569329023361,
0.1450759470462799,
0.049124203622341156,
-0.016942746937274933,
-0.05380028858780861,
0.04036799445748329,
-0.14106707274913788,
0.07090406119823456,
0.024959245696663857,
0.06356938183307648,
-0.07620730996131897,
0.06154727190732956,
0.007602136582136154,
0.013143882155418396,
-0.01668383553624153,
0.008876100182533264,
-0.0933002457022667,
-0.016432160511612892,
-0.07999801635742188,
-0.0039061028510332108,
-0.0011444415431469679,
0.01664557494223118,
-0.02096310630440712,
-0.07113566249608994,
-0.048365168273448944,
0.03615792095661163,
-0.08770056813955307,
-0.050068456679582596,
0.009042125195264816,
0.039882395416498184,
-0.12335674464702606,
-0.00537294102832675,
0.021030226722359657,
-0.0924212709069252,
0.09420781582593918,
0.0724504142999649,
0.01735473796725273,
0.031266797333955765,
-0.12557029724121094,
-0.03306672349572182,
-0.010080105625092983,
-0.008386013098061085,
0.062417760491371155,
-0.09412368386983871,
-0.008959447965025902,
-0.03765074908733368,
0.07144768536090851,
0.012908002361655235,
0.06671763956546783,
-0.13327686488628387,
0.019523752853274345,
-0.07585511356592178,
-0.046598855406045914,
-0.07535544782876968,
0.03549492731690407,
0.09714967012405396,
0.05890417471528053,
0.1518380492925644,
-0.07589755952358246,
0.023898225277662277,
-0.20654623210430145,
-0.03491424396634102,
-0.006521587260067463,
-0.061454012989997864,
-0.15103286504745483,
-0.047825928777456284,
0.08152977377176285,
-0.037827782332897186,
0.09292730689048767,
-0.017796803265810013,
0.07695619761943817,
0.037672221660614014,
-0.044257745146751404,
-0.05210644751787186,
-0.015511893667280674,
0.19951796531677246,
0.0728764608502388,
-0.016605518758296967,
0.11077996343374252,
0.0021902238950133324,
0.02967253513634205,
0.05070333927869797,
0.18143121898174286,
0.21343250572681427,
0.03150881826877594,
0.055515799671411514,
0.06443288922309875,
-0.07449449598789215,
-0.06952434778213501,
0.18006284534931183,
-0.01617536135017872,
0.06955260783433914,
-0.04901772737503052,
0.1997033953666687,
0.10782360285520554,
-0.1676589399576187,
0.046169862151145935,
-0.042929235845804214,
-0.08129660785198212,
-0.1252882331609726,
-0.013046779669821262,
-0.08606735616922379,
-0.1254584789276123,
0.037457793951034546,
-0.11643977463245392,
0.05314028263092041,
0.10908154398202896,
0.01346628088504076,
0.036302804946899414,
0.12767945230007172,
-0.021049946546554565,
0.0030664599034935236,
0.06466837972402573,
0.00537356361746788,
-0.012660798616707325,
-0.03719473257660866,
-0.07853289693593979,
0.049729228019714355,
0.0010896536987274885,
0.07831095904111862,
-0.047836363315582275,
-0.016265252605080605,
0.025954488664865494,
-0.027081286534667015,
-0.0802159234881401,
0.02790476754307747,
0.04072609916329384,
0.056532859802246094,
0.04816709831357002,
0.046625904738903046,
-0.008959450758993626,
-0.03350451961159706,
0.31833890080451965,
-0.06676765531301498,
-0.10200120508670807,
-0.12148986756801605,
0.21931298077106476,
0.030575083568692207,
-0.03093426488339901,
0.033595554530620575,
-0.08321797847747803,
-0.009089463390409946,
0.1599670797586441,
0.16711175441741943,
-0.0712156891822815,
-0.02267272025346756,
-0.005738536827266216,
-0.01821937784552574,
-0.03748960793018341,
0.1269543170928955,
0.08917223662137985,
-0.025172783061861992,
-0.0624174140393734,
-0.012292678467929363,
-0.01845906674861908,
-0.03202506899833679,
-0.040741246193647385,
0.04113396257162094,
0.015842346474528313,
-0.02587086893618107,
-0.04083295911550522,
0.0748528465628624,
0.005662040784955025,
-0.25751402974128723,
0.06588057428598404,
-0.15762412548065186,
-0.17062599956989288,
-0.044115930795669556,
0.036959435790777206,
0.0004790982056874782,
0.056915033608675,
-0.016750715672969818,
0.00875411368906498,
0.07853339612483978,
-0.01706628128886223,
-0.03145662695169449,
-0.1254425346851349,
0.12417581677436829,
-0.06610187143087387,
0.17088927328586578,
-0.028518136590719223,
0.049410734325647354,
0.1156260147690773,
0.02815369889140129,
-0.13503694534301758,
0.04226095974445343,
0.053948868066072464,
-0.10519395023584366,
0.01621348224580288,
0.15136153995990753,
-0.04602030664682388,
0.09118463844060898,
0.042147669941186905,
-0.10855890810489655,
0.005658193491399288,
-0.05628965422511101,
-0.03431947901844978,
-0.0813540443778038,
-0.012651403434574604,
-0.05921993777155876,
0.1692480593919754,
0.21906527876853943,
-0.03001358173787594,
0.01179471705108881,
-0.10225692391395569,
0.015794022008776665,
0.06997158378362656,
0.027994032949209213,
-0.057895876467227936,
-0.18892468512058258,
0.011535167694091797,
0.06714348495006561,
-0.007096181157976389,
-0.2395968735218048,
-0.07521307468414307,
0.038740191608667374,
-0.03716933727264404,
-0.04038168117403984,
0.10260829329490662,
0.04123678803443909,
0.052171822637319565,
-0.030073339119553566,
-0.15922857820987701,
-0.031165074557065964,
0.1548350751399994,
-0.1754707545042038,
-0.03582720458507538
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-128-finetuned-squad-seed-2
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
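A minimal sketch of how the hyperparameters above could be expressed as a Hugging Face `TrainingArguments` configuration. Only the hyperparameter values come from this card; the checkpoint loading and output directory name are illustrative assumptions, and the SQuAD preprocessing needed for extractive QA is omitted.

```python
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    TrainingArguments,
)

# Base checkpoint named in the card.
model = AutoModelForQuestionAnswering.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# Values mirror the card. The Trainer's default optimizer already uses
# betas=(0.9, 0.999) and epsilon=1e-08, so only non-defaults are set here.
training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-128-finetuned-squad-seed-2",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```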
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-128-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-128-finetuned-squad-seed-2
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-128-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-128-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-128-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08036793023347855,
0.08442381769418716,
-0.00251922314055264,
0.08376204967498779,
0.13586518168449402,
0.0328560397028923,
0.10077176988124847,
0.13914747536182404,
-0.11453327536582947,
0.04278719797730446,
0.08845371752977371,
0.08181539177894592,
0.03142099827528,
0.13642512261867523,
-0.033323608338832855,
-0.2552852928638458,
-0.005683843977749348,
-0.020081622526049614,
-0.09653037041425705,
0.10947085171937943,
0.09916021674871445,
-0.09970349818468094,
0.07215513288974762,
-0.013807184062898159,
-0.17864809930324554,
0.016939451918005943,
-0.011581978760659695,
-0.05085508152842522,
0.11858944594860077,
-0.00979677215218544,
0.07657603919506073,
0.010316815227270126,
0.12360503524541855,
-0.19073013961315155,
0.01612808369100094,
0.07494032382965088,
0.04664304107427597,
0.09726399928331375,
0.008871646597981453,
-0.010718279518187046,
0.13213267922401428,
-0.12762778997421265,
0.09981464594602585,
0.03152845799922943,
-0.09506993740797043,
-0.20551301538944244,
-0.09575606882572174,
0.008551735430955887,
0.0449751541018486,
0.08350509405136108,
0.010630988515913486,
0.1592433750629425,
-0.09837347269058228,
0.08232248574495316,
0.23073206841945648,
-0.27597153186798096,
-0.0796658992767334,
0.053483910858631134,
0.06025843322277069,
0.08262106776237488,
-0.12555727362632751,
-0.009569748304784298,
0.006613634992390871,
0.025542350485920906,
0.10379723459482193,
-0.03234497085213661,
-0.08751419186592102,
0.0021484007593244314,
-0.105800099670887,
0.005769337527453899,
0.10933361202478409,
0.033826783299446106,
-0.053289853036403656,
-0.07094087451696396,
-0.04394926503300667,
-0.05216675251722336,
-0.034425366669893265,
-0.021258344873785973,
0.03884708136320114,
-0.05890249088406563,
-0.13821811974048615,
-0.04967014491558075,
-0.049944087862968445,
-0.09074096381664276,
-0.006550842896103859,
0.21942280232906342,
0.03646666556596756,
0.03204931318759918,
-0.05179537460207939,
0.10348029434680939,
0.012786949053406715,
-0.12734763324260712,
-0.02936858870089054,
-0.004905286710709333,
-0.09125993400812149,
-0.03636965900659561,
-0.05935971066355705,
0.020136334002017975,
0.03486897051334381,
0.21440498530864716,
-0.043360717594623566,
0.07760154455900192,
0.03302523493766785,
-0.016832612454891205,
-0.026892127469182014,
0.13843271136283875,
-0.020362596958875656,
-0.07295691221952438,
0.011762568727135658,
0.06329265981912613,
0.010470516048371792,
-0.004782109521329403,
-0.06316500157117844,
-0.04066194221377373,
0.06573694944381714,
0.048211824148893356,
-0.06075281277298927,
0.029374592006206512,
-0.007783893030136824,
-0.02366509661078453,
0.0010907823452726007,
-0.1155444011092186,
0.0161940548568964,
-0.008681202307343483,
-0.08148074150085449,
-0.04630640894174576,
0.007524465210735798,
-0.010303881950676441,
0.012152258306741714,
0.09596533328294754,
-0.07246820628643036,
-0.023465396836400032,
-0.07923293858766556,
-0.0743575245141983,
-0.016412418335676193,
-0.15599532425403595,
0.019041163846850395,
-0.0632551908493042,
-0.15961122512817383,
-0.029524734243750572,
0.05605785548686981,
-0.08182164281606674,
-0.029846707358956337,
-0.03246622532606125,
-0.07786004990339279,
0.021330401301383972,
0.0017785187810659409,
0.21625694632530212,
-0.04943973198533058,
0.08889704197645187,
0.008291754871606827,
0.05777299031615257,
-0.00891124177724123,
0.035475678741931915,
-0.08610052615404129,
0.009823059663176537,
-0.1777302473783493,
0.07544763386249542,
-0.07916548103094101,
0.016022788360714912,
-0.14024101197719574,
-0.08522962778806686,
-0.010892022401094437,
-0.02253025770187378,
0.07950705289840698,
0.10408683121204376,
-0.14355234801769257,
-0.021191319450736046,
0.11655651777982712,
-0.06642767041921616,
-0.055282194167375565,
0.05711393803358078,
-0.07611484080553055,
0.08752870559692383,
0.05417867377400398,
0.1914793998003006,
0.0880097895860672,
-0.10718470066785812,
0.015295171178877354,
0.014590106904506683,
0.03397996351122856,
0.0033137549180537462,
0.051136162132024765,
0.005146474111825228,
0.028756991028785706,
0.016081485897302628,
-0.07260282337665558,
0.0075436849147081375,
-0.0914965495467186,
-0.06023184210062027,
-0.04886017367243767,
-0.08404982835054398,
-0.005161067936569452,
0.012169872410595417,
0.03597075492143631,
-0.07968801259994507,
-0.0833730399608612,
0.07614871859550476,
0.13694141805171967,
-0.046925146132707596,
0.018840765580534935,
-0.07588057219982147,
-0.003079795278608799,
-0.03163595125079155,
-0.02447926439344883,
-0.20365838706493378,
-0.05702550709247589,
0.03325433284044266,
-0.008446712046861649,
0.0453522615134716,
0.0030474599916487932,
0.08436237275600433,
0.026294726878404617,
-0.05448785796761513,
-0.0006170691922307014,
-0.08655352145433426,
-0.007445221301168203,
-0.0947599709033966,
-0.22070148587226868,
-0.05273428186774254,
-0.038712047040462494,
0.1365828812122345,
-0.16575177013874054,
-0.004294774495065212,
-0.019145822152495384,
0.11592117697000504,
0.04278390854597092,
-0.05139638110995293,
-0.005466389004141092,
0.027919655665755272,
0.012609634548425674,
-0.09699322283267975,
0.038407836109399796,
0.016417372971773148,
-0.09400127828121185,
-0.02748848870396614,
-0.10291873663663864,
-0.010791580192744732,
0.06929057091474533,
0.07018031179904938,
-0.10210313647985458,
-0.014632653445005417,
-0.06286809593439102,
-0.028493337333202362,
-0.05454454943537712,
0.03880387172102928,
0.18093045055866241,
0.01784905232489109,
0.10933525115251541,
-0.07501532137393951,
-0.08331865072250366,
0.018717559054493904,
0.007723816204816103,
0.06115902215242386,
0.10180024802684784,
0.07545936107635498,
-0.11082277446985245,
0.05686619132757187,
0.09300719946622849,
-0.052258990705013275,
0.135982945561409,
-0.04674520343542099,
-0.07326225191354752,
-0.03267751261591911,
-0.007910053245723248,
-0.006893314886838198,
0.1500592827796936,
-0.03689815476536751,
0.018017850816249847,
0.03377394750714302,
0.03782409429550171,
0.0042396062053740025,
-0.1626286506652832,
-0.014888142235577106,
0.014530826359987259,
-0.04870142042636871,
-0.022098876535892487,
0.01469701062887907,
0.01657727360725403,
0.09557640552520752,
0.041946906596422195,
-0.004229498561471701,
0.007523389533162117,
-0.010245030745863914,
-0.04359593242406845,
0.199918732047081,
-0.09079619497060776,
-0.04388128221035004,
-0.0757785215973854,
-0.00032868204289115965,
-0.02289132960140705,
-0.040249474346637726,
0.015571856871247292,
-0.09386356174945831,
-0.025085728615522385,
-0.07213576883077621,
0.0002897153317462653,
-0.04141562432050705,
0.0156391728669405,
0.004827980417758226,
0.01461368054151535,
0.059986136853694916,
-0.13193681836128235,
0.010002389550209045,
-0.06760627031326294,
-0.11140576750040054,
0.02876059152185917,
0.06087033823132515,
0.07967337220907211,
0.05842379480600357,
-0.032026153057813644,
0.018926236778497696,
-0.04125381261110306,
0.23169419169425964,
-0.07757783681154251,
0.01272303145378828,
0.12339796870946884,
0.025894850492477417,
0.040158532559871674,
0.10531653463840485,
0.03272208198904991,
-0.09959761798381805,
0.03891780227422714,
0.07662145793437958,
-0.04114655777812004,
-0.24476996064186096,
0.007306466344743967,
-0.03781146556138992,
-0.09564635157585144,
0.08860115706920624,
0.05065475404262543,
-0.04530469700694084,
0.06403883546590805,
0.0108658317476511,
0.010109980590641499,
-0.023387234658002853,
0.08806439489126205,
0.08996714651584625,
0.0662628561258316,
0.10731890052556992,
-0.03826656937599182,
-0.017765086144208908,
0.06453990191221237,
0.028912698850035667,
0.30296432971954346,
-0.04843399301171303,
0.08512575924396515,
0.049160875380039215,
0.14004483819007874,
-0.02206607721745968,
0.045214492827653885,
0.008351773954927921,
-0.006190972402691841,
-0.030062422156333923,
-0.054161280393600464,
-0.019820384681224823,
0.0031566075049340725,
-0.07706419378519058,
0.0455053336918354,
-0.05247354507446289,
0.051700592041015625,
0.019277270883321762,
0.2913191616535187,
0.0032912632450461388,
-0.261576384305954,
-0.09366296976804733,
-0.014778800308704376,
-0.0378149077296257,
-0.05178208276629448,
0.011635564267635345,
0.1218528226017952,
-0.12413815408945084,
0.036892376840114594,
-0.07102862745523453,
0.0800555944442749,
-0.029525509104132652,
-0.002296567428857088,
0.041139476001262665,
0.17108118534088135,
-0.020725157111883163,
0.05588226020336151,
-0.22232311964035034,
0.228200763463974,
0.014406584203243256,
0.12812019884586334,
-0.058298259973526,
0.008814780041575432,
0.025808259844779968,
0.004061133600771427,
0.08910536020994186,
-0.003643434029072523,
-0.06604847311973572,
-0.13593851029872894,
-0.05341634899377823,
0.07339812815189362,
0.14329424500465393,
-0.04188959673047066,
0.0967777892947197,
-0.0581628754734993,
0.014805231243371964,
0.03689415007829666,
-0.08766045421361923,
-0.13403359055519104,
-0.09717490524053574,
-0.022336158901453018,
0.013490368612110615,
-0.07159730046987534,
-0.05739928036928177,
-0.06905075162649155,
0.032410718500614166,
0.10547307133674622,
0.014813525602221489,
-0.031583379954099655,
-0.14686639606952667,
0.0774509608745575,
0.15600137412548065,
-0.06769603490829468,
0.02556207962334156,
-0.0036114929243922234,
0.07593603432178497,
0.038328975439071655,
-0.08281015604734421,
0.06057175248861313,
-0.06442318111658096,
-0.17970705032348633,
-0.04829491674900055,
0.09558619558811188,
0.07033094763755798,
0.04220890998840332,
-0.002411414170637727,
0.05212469398975372,
-0.025033650919795036,
-0.09304404258728027,
0.024641485884785652,
0.021723495796322823,
0.03565921261906624,
0.036076415330171585,
-0.08533140271902084,
0.07732797414064407,
-0.037748415023088455,
-0.014043043367564678,
0.11504954099655151,
0.23287785053253174,
-0.10165679454803467,
0.09991221874952316,
0.06117568165063858,
-0.06200939416885376,
-0.16083554923534393,
0.07025858759880066,
0.10458908975124359,
0.006942973006516695,
0.07210785150527954,
-0.20973658561706543,
0.12863096594810486,
0.10299041122198105,
-0.017228800803422928,
0.039344701915979385,
-0.2740517854690552,
-0.12033510953187943,
0.044239018112421036,
0.1288098841905594,
0.09657342731952667,
-0.12443362921476364,
-0.014456371776759624,
-0.015741858631372452,
-0.11598136276006699,
0.0943882018327713,
-0.1126767173409462,
0.13529439270496368,
-0.028571898117661476,
0.11101685464382172,
0.011079246178269386,
-0.025742188096046448,
0.10369422286748886,
0.046651389449834824,
0.10117540508508682,
-0.04224015399813652,
0.003565044840797782,
0.06273769587278366,
-0.04891897737979889,
0.0028416900895535946,
-0.07518292218446732,
0.0877695083618164,
-0.13122063875198364,
-0.0036498638801276684,
-0.09117268770933151,
0.045683640986680984,
-0.03917238861322403,
-0.06884686648845673,
-0.0419640839099884,
0.05664055794477463,
0.04866357892751694,
-0.035751670598983765,
0.04679747298359871,
-0.01769115962088108,
0.1022709310054779,
0.03821327164769173,
0.08620823174715042,
0.014696680009365082,
-0.04561031237244606,
0.022735238075256348,
-0.0100207244977355,
0.06402010470628738,
-0.16467735171318054,
0.012490272521972656,
0.09927155822515488,
0.060096003115177155,
0.09755797684192657,
0.043780095875263214,
-0.04615079239010811,
0.017792075872421265,
0.029722725972533226,
-0.10138200223445892,
-0.10748333483934402,
0.04695213586091995,
-0.028471192345023155,
-0.1398153156042099,
0.04880871623754501,
0.12216707319021225,
-0.04362540692090988,
-0.027454126626253128,
-0.01731676422059536,
0.0050117927603423595,
-0.023852547630667686,
0.18187400698661804,
0.048906415700912476,
0.05570165812969208,
-0.1011452004313469,
0.12810823321342468,
0.02892952784895897,
-0.01987319439649582,
0.051457058638334274,
0.08284848928451538,
-0.10284868627786636,
-0.00250235921703279,
0.08376608043909073,
0.13768331706523895,
-0.05849640443921089,
-0.005568890366703272,
-0.10427384078502655,
-0.08408339321613312,
0.05002332851290703,
0.14352160692214966,
0.04943959414958954,
-0.016154594719409943,
-0.05363137274980545,
0.0401146300137043,
-0.14010171592235565,
0.07159310579299927,
0.025556284934282303,
0.0635019838809967,
-0.07709931582212448,
0.06109646335244179,
0.007311682682484388,
0.014663529582321644,
-0.01701565831899643,
0.007818805053830147,
-0.09325123578310013,
-0.016288207843899727,
-0.08167166262865067,
-0.0018522734753787518,
0.00002916994708357379,
0.016643265262246132,
-0.02027549035847187,
-0.07170619070529938,
-0.04836980625987053,
0.03698519617319107,
-0.08757635205984116,
-0.05000489577651024,
0.008323724381625652,
0.04045139253139496,
-0.12356487661600113,
-0.006054968573153019,
0.022166995331645012,
-0.09329936653375626,
0.0954023227095604,
0.07324419915676117,
0.017389880493283272,
0.030769014731049538,
-0.12372957915067673,
-0.03333901986479759,
-0.009762306697666645,
-0.0084049291908741,
0.06194501742720604,
-0.09583289921283722,
-0.009633657522499561,
-0.0374574288725853,
0.07056912034749985,
0.013117731548845768,
0.06980269402265549,
-0.13397149741649628,
0.019773948937654495,
-0.07613646239042282,
-0.048344455659389496,
-0.07503608614206314,
0.03493291139602661,
0.09651761502027512,
0.060348957777023315,
0.15180256962776184,
-0.07709058374166489,
0.02420066110789776,
-0.20625700056552887,
-0.03467579558491707,
-0.0064045279286801815,
-0.05938934534788132,
-0.15152297914028168,
-0.047597359865903854,
0.080476313829422,
-0.03772468492388725,
0.09148633480072021,
-0.0182869341224432,
0.07578102499246597,
0.03799417242407799,
-0.046885084360837936,
-0.050654035061597824,
-0.015652311965823174,
0.19783620536327362,
0.07293852418661118,
-0.016256511211395264,
0.11116141825914383,
0.0006428438355214894,
0.030231988057494164,
0.05065201222896576,
0.18275512754917145,
0.21347163617610931,
0.0322367399930954,
0.05547664314508438,
0.06440349668264389,
-0.07401344180107117,
-0.0716339647769928,
0.17859584093093872,
-0.015899980440735817,
0.06969013065099716,
-0.04808983951807022,
0.1977710872888565,
0.10882502794265747,
-0.1683979630470276,
0.04498067870736122,
-0.0430128239095211,
-0.08135267347097397,
-0.12653562426567078,
-0.012185215950012207,
-0.08679132908582687,
-0.12580221891403198,
0.03726973012089729,
-0.11641374230384827,
0.05346818268299103,
0.10830950736999512,
0.01274713221937418,
0.03715694695711136,
0.1253347247838974,
-0.020351707935333252,
0.0029728219378739595,
0.06395316869020462,
0.0056567504070699215,
-0.011940021067857742,
-0.035716764628887177,
-0.07910808175802231,
0.04958048090338707,
0.0028821390587836504,
0.0780431404709816,
-0.0466858446598053,
-0.015121664851903915,
0.025448501110076904,
-0.027909018099308014,
-0.08054274320602417,
0.02753717266023159,
0.040607936680316925,
0.056714218109846115,
0.047094929963350296,
0.04690570384263992,
-0.008655322715640068,
-0.03305778652429581,
0.3197321593761444,
-0.06650818139314651,
-0.10129676014184952,
-0.12176346778869629,
0.22162508964538574,
0.028963133692741394,
-0.03000848926603794,
0.034726582467556,
-0.083041712641716,
-0.010770957916975021,
0.15821944177150726,
0.1665867269039154,
-0.07295717298984528,
-0.022495510056614876,
-0.005963696166872978,
-0.017947839573025703,
-0.03662748262286186,
0.1279257833957672,
0.08850197494029999,
-0.02336038462817669,
-0.0631180852651596,
-0.012886998243629932,
-0.019755858927965164,
-0.03145479038357735,
-0.04093458876013756,
0.042046211659908295,
0.014695245772600174,
-0.024538323283195496,
-0.04223066195845604,
0.07395751029253006,
0.005547812674194574,
-0.2567841112613678,
0.06644336879253387,
-0.1570926010608673,
-0.17112509906291962,
-0.043166663497686386,
0.037767741829156876,
-0.0014516511000692844,
0.05702150613069534,
-0.017651628702878952,
0.008355730213224888,
0.07896365225315094,
-0.017261475324630737,
-0.03172576427459717,
-0.12432236224412918,
0.12491635233163834,
-0.06544945389032364,
0.17214339971542358,
-0.027728095650672913,
0.05002523586153984,
0.11509889364242554,
0.02870142087340355,
-0.13594533503055573,
0.04110647737979889,
0.054139021784067154,
-0.10539022833108902,
0.015392377972602844,
0.15223707258701324,
-0.046329058706760406,
0.09312380850315094,
0.04373233765363693,
-0.10817231237888336,
0.0039956532418727875,
-0.056152649223804474,
-0.03435251861810684,
-0.08070684969425201,
-0.014433850534260273,
-0.060467205941677094,
0.16824349761009216,
0.21831093728542328,
-0.030248064547777176,
0.012174106203019619,
-0.10177476704120636,
0.01656036265194416,
0.06990941613912582,
0.029780343174934387,
-0.05672059208154678,
-0.1890006959438324,
0.01128421351313591,
0.06806330382823944,
-0.006683614570647478,
-0.24087941646575928,
-0.07697436958551407,
0.039352353662252426,
-0.03614848479628563,
-0.04015108197927475,
0.1040765792131424,
0.03976983204483986,
0.051575712859630585,
-0.029850447550415993,
-0.16043800115585327,
-0.03172304853796959,
0.15414893627166748,
-0.17487895488739014,
-0.036013275384902954
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-128-finetuned-squad-seed-4
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
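The "k-128" and "seed-4" parts of the model name suggest training on a 128-example subsample of SQuAD drawn with data seed 4. The card does not spell out the sampling procedure, so the following is a hypothetical sketch of how such a few-shot subset could be drawn with the `datasets` library.

```python
from datasets import load_dataset

# Hypothetical few-shot subsampling: draw k=128 training examples from
# SQuAD with data seed 4. The exact procedure behind this checkpoint is
# an assumption inferred from the model name, not stated in the card.
k, data_seed = 128, 4
train_subset = (
    load_dataset("squad", split="train")
    .shuffle(seed=data_seed)
    .select(range(k))
)
print(len(train_subset))  # 128
```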
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-128-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-128-finetuned-squad-seed-4
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-128-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-128-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-128-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08074939996004105,
0.08326296508312225,
-0.0024755450431257486,
0.08468569815158844,
0.1370745301246643,
0.03320882096886635,
0.10110161453485489,
0.13839173316955566,
-0.1145801693201065,
0.0423574298620224,
0.08859833329916,
0.08181970566511154,
0.030597276985645294,
0.13613656163215637,
-0.033255744725465775,
-0.25512999296188354,
-0.006136881187558174,
-0.019502591341733932,
-0.09599366784095764,
0.10943789035081863,
0.09826112538576126,
-0.1003284901380539,
0.07271284610033035,
-0.014062217436730862,
-0.17969666421413422,
0.017333999276161194,
-0.01219107024371624,
-0.050604112446308136,
0.11880922317504883,
-0.00968946237117052,
0.07688413560390472,
0.009527619928121567,
0.12345702201128006,
-0.18914823234081268,
0.016408242285251617,
0.07509555667638779,
0.046227436512708664,
0.096894271671772,
0.008347020484507084,
-0.01099756546318531,
0.13096001744270325,
-0.12796783447265625,
0.09961962699890137,
0.03139353170990944,
-0.095280222594738,
-0.2061282992362976,
-0.09522518515586853,
0.007178287487477064,
0.044009167701005936,
0.08433897793292999,
0.010335221886634827,
0.15875642001628876,
-0.09843296557664871,
0.08275066316127777,
0.22921593487262726,
-0.2766142785549164,
-0.08004350960254669,
0.05405723676085472,
0.06025722995400429,
0.08291245251893997,
-0.1260690987110138,
-0.008795286528766155,
0.006920092739164829,
0.026343848556280136,
0.10406723618507385,
-0.03273389860987663,
-0.08765582740306854,
0.002548362361267209,
-0.10567785054445267,
0.0072780391201376915,
0.1100628525018692,
0.03404795750975609,
-0.05297374352812767,
-0.0710364431142807,
-0.043321896344423294,
-0.05342941731214523,
-0.034857217222452164,
-0.020324530079960823,
0.039010077714920044,
-0.05974253639578819,
-0.13860197365283966,
-0.04808507859706879,
-0.0495477057993412,
-0.08967549353837967,
-0.006725548766553402,
0.21854399144649506,
0.036312248557806015,
0.03176305070519447,
-0.0516253300011158,
0.10270503163337708,
0.012389776296913624,
-0.1271803379058838,
-0.02834862843155861,
-0.005309786647558212,
-0.09073520451784134,
-0.036356475204229355,
-0.06003131344914436,
0.021384933963418007,
0.03517116606235504,
0.21467989683151245,
-0.04345494881272316,
0.07821563631296158,
0.03360079973936081,
-0.017441438511013985,
-0.026900405064225197,
0.1374734789133072,
-0.021668797358870506,
-0.07450512051582336,
0.011963210999965668,
0.06325535476207733,
0.009860082529485226,
-0.004868195857852697,
-0.06345732510089874,
-0.03983859717845917,
0.06451790779829025,
0.047682806849479675,
-0.06189672648906708,
0.031282659620046616,
-0.006821638438850641,
-0.023227466270327568,
0.0009266905835829675,
-0.11534861475229263,
0.01583823375403881,
-0.008986865170300007,
-0.0816149041056633,
-0.04535554721951485,
0.007046493701636791,
-0.01113093551248312,
0.01191942859441042,
0.09637846052646637,
-0.073377825319767,
-0.024038638919591904,
-0.07981617748737335,
-0.07533371448516846,
-0.016949161887168884,
-0.1559145152568817,
0.020052578300237656,
-0.06305055320262909,
-0.1599096804857254,
-0.030660033226013184,
0.05590242147445679,
-0.08156206458806992,
-0.028577759861946106,
-0.03246879577636719,
-0.0786357969045639,
0.0209896769374609,
0.0019851403776556253,
0.2168775051832199,
-0.049795087426900864,
0.08784905076026917,
0.009335079230368137,
0.05781036615371704,
-0.009551582857966423,
0.03556150197982788,
-0.08588714152574539,
0.009131981991231441,
-0.17832109332084656,
0.07470405101776123,
-0.079792320728302,
0.01789913885295391,
-0.1391131728887558,
-0.08621951192617416,
-0.010012563318014145,
-0.02140934392809868,
0.07911461591720581,
0.10354360193014145,
-0.14331835508346558,
-0.021201975643634796,
0.11583880335092545,
-0.06515093147754669,
-0.055396951735019684,
0.057056769728660583,
-0.07642509043216705,
0.08664362132549286,
0.05459078401327133,
0.1917695701122284,
0.08791225403547287,
-0.10669034719467163,
0.014498849399387836,
0.013826405629515648,
0.03423884138464928,
0.0026436017360538244,
0.04953857138752937,
0.006112989503890276,
0.028781911358237267,
0.01660168170928955,
-0.07136116921901703,
0.00751535315066576,
-0.0911114290356636,
-0.059759099036455154,
-0.04830368980765343,
-0.0839623436331749,
-0.005958911497145891,
0.012475975789129734,
0.03588854521512985,
-0.08061909675598145,
-0.08358396589756012,
0.0754300132393837,
0.13668952882289886,
-0.047388527542352676,
0.017741495743393898,
-0.0761328637599945,
-0.0025349839124828577,
-0.03168065845966339,
-0.02399914525449276,
-0.20344842970371246,
-0.05779974162578583,
0.032273292541503906,
-0.006463557947427034,
0.045526452362537384,
0.003363260067999363,
0.0848788246512413,
0.026516180485486984,
-0.054677996784448624,
-0.0007702043512836099,
-0.08523213863372803,
-0.007011776324361563,
-0.09458725899457932,
-0.22114254534244537,
-0.052836041897535324,
-0.03856166824698448,
0.1364533007144928,
-0.16630153357982635,
-0.003964981995522976,
-0.019105875864624977,
0.11535287648439407,
0.04241421818733215,
-0.05079831928014755,
-0.004884908441454172,
0.02931123599410057,
0.013369218446314335,
-0.09698433429002762,
0.03890153020620346,
0.01610281504690647,
-0.09284335374832153,
-0.028824767097830772,
-0.103282131254673,
-0.00937715545296669,
0.07034848630428314,
0.06901010125875473,
-0.10258006304502487,
-0.014276417903602123,
-0.06237630918622017,
-0.02832442708313465,
-0.052836671471595764,
0.03878645598888397,
0.18200936913490295,
0.017226051539182663,
0.10927629470825195,
-0.07439560443162918,
-0.08290643990039825,
0.01886809803545475,
0.008798220194876194,
0.06234278157353401,
0.101760633289814,
0.07478498667478561,
-0.10937851667404175,
0.057119693607091904,
0.09247495979070663,
-0.05269403010606766,
0.13660486042499542,
-0.04689858481287956,
-0.07261213660240173,
-0.0321909636259079,
-0.009013760834932327,
-0.007303756661713123,
0.15072381496429443,
-0.03726588562130928,
0.01697961986064911,
0.033386219292879105,
0.03738706558942795,
0.004513708874583244,
-0.16171446442604065,
-0.014830953441560268,
0.013823427259922028,
-0.047725412994623184,
-0.022679023444652557,
0.015386506915092468,
0.015973735600709915,
0.09522820264101028,
0.04175780713558197,
-0.005371488630771637,
0.0075477370992302895,
-0.010064555332064629,
-0.04266760125756264,
0.2003190666437149,
-0.09120769053697586,
-0.04267709702253342,
-0.07550633698701859,
-0.0022425467614084482,
-0.023925133049488068,
-0.040688056498765945,
0.015286457724869251,
-0.09503529965877533,
-0.02545601688325405,
-0.07181552797555923,
-0.0003946351644117385,
-0.04134773835539818,
0.015568859875202179,
0.004312665667384863,
0.014018295332789421,
0.05952432006597519,
-0.1317920982837677,
0.010109925642609596,
-0.06791841983795166,
-0.11139735579490662,
0.029228411614894867,
0.061523910611867905,
0.08035831153392792,
0.05790194496512413,
-0.03209543600678444,
0.01897677220404148,
-0.04112733528017998,
0.23258155584335327,
-0.07764837145805359,
0.012959669344127178,
0.12317093461751938,
0.026670370250940323,
0.03923948481678963,
0.10532024502754211,
0.03370233625173569,
-0.10003239661455154,
0.03861341252923012,
0.07706243544816971,
-0.04075096920132637,
-0.24461650848388672,
0.007754290010780096,
-0.038257449865341187,
-0.0959169790148735,
0.08822917193174362,
0.05061180889606476,
-0.044775817543268204,
0.06462357938289642,
0.01181792188435793,
0.010984697379171848,
-0.024291759356856346,
0.08779095113277435,
0.09169742465019226,
0.06585859507322311,
0.10802441090345383,
-0.03860686346888542,
-0.018099548295140266,
0.06389816105365753,
0.028989113867282867,
0.3037879168987274,
-0.04856019839644432,
0.08492833375930786,
0.0496085025370121,
0.13957637548446655,
-0.02144990861415863,
0.04580650478601456,
0.007707667071372271,
-0.006868524942547083,
-0.030115116387605667,
-0.05393160879611969,
-0.018689896911382675,
0.002118200995028019,
-0.07869482785463333,
0.04526856169104576,
-0.05241204798221588,
0.050404615700244904,
0.019564202055335045,
0.2905523478984833,
0.0026233261451125145,
-0.2636806070804596,
-0.09412795305252075,
-0.015537366271018982,
-0.0374077744781971,
-0.051359787583351135,
0.01197919249534607,
0.12128196656703949,
-0.12355411797761917,
0.03743667155504227,
-0.07092952728271484,
0.079932801425457,
-0.02959439717233181,
-0.001253359718248248,
0.04296485707163811,
0.17270709574222565,
-0.02166086621582508,
0.05532433092594147,
-0.22297479212284088,
0.227240189909935,
0.014716505073010921,
0.1286940574645996,
-0.05838339775800705,
0.00859815627336502,
0.0266877468675375,
0.0054541584104299545,
0.08864607661962509,
-0.00419956399127841,
-0.06595290452241898,
-0.1359563171863556,
-0.052333518862724304,
0.07436154782772064,
0.14278411865234375,
-0.040777575224637985,
0.09739616513252258,
-0.057418566197156906,
0.01433262787759304,
0.03678428754210472,
-0.08878904581069946,
-0.1341479867696762,
-0.09770035743713379,
-0.02326130121946335,
0.015085420571267605,
-0.07141545414924622,
-0.056525819003582,
-0.06936462968587875,
0.03075699508190155,
0.1045764833688736,
0.0164283849298954,
-0.031558673828840256,
-0.14724160730838776,
0.07658811658620834,
0.1560024619102478,
-0.06688520312309265,
0.025726119056344032,
-0.0032884052488952875,
0.07480129599571228,
0.03925963491201401,
-0.08269517868757248,
0.06078910827636719,
-0.0650663748383522,
-0.17838071286678314,
-0.04839693009853363,
0.09455815702676773,
0.06992500275373459,
0.04143763333559036,
-0.0028751373756676912,
0.05214333161711693,
-0.02591158263385296,
-0.09366750717163086,
0.024683430790901184,
0.020184140652418137,
0.036482393741607666,
0.03600388392806053,
-0.08634994179010391,
0.07718843966722488,
-0.037067096680402756,
-0.013187670148909092,
0.1135990247130394,
0.2305755466222763,
-0.10119825601577759,
0.09779983758926392,
0.06124631687998772,
-0.06161611154675484,
-0.16042041778564453,
0.07163187861442566,
0.1031341552734375,
0.007293387316167355,
0.07063285261392593,
-0.2103283703327179,
0.1297101229429245,
0.10249175131320953,
-0.01621920056641102,
0.04116560146212578,
-0.2715425193309784,
-0.11961178481578827,
0.04446132108569145,
0.12943817675113678,
0.09789153933525085,
-0.12493832409381866,
-0.013716467656195164,
-0.01592971570789814,
-0.11563689261674881,
0.09370089322328568,
-0.11508060991764069,
0.13550783693790436,
-0.029002739116549492,
0.1117204949259758,
0.010354535654187202,
-0.02572636678814888,
0.102805495262146,
0.0475754588842392,
0.10238800942897797,
-0.04262065142393112,
0.004512168932706118,
0.06269882619380951,
-0.04816659912467003,
0.0025449469685554504,
-0.07604275643825531,
0.08731585741043091,
-0.13142342865467072,
-0.003478654893115163,
-0.09161162376403809,
0.045686524361371994,
-0.03898380696773529,
-0.06863013654947281,
-0.04136718809604645,
0.056688714772462845,
0.0475911945104599,
-0.035981129854917526,
0.04565620422363281,
-0.01846625842154026,
0.1026783436536789,
0.036873091012239456,
0.08673848956823349,
0.013203158043324947,
-0.04598976671695709,
0.023981977254152298,
-0.010522757656872272,
0.06402812898159027,
-0.1643865555524826,
0.01118785422295332,
0.0998411700129509,
0.05936385318636894,
0.09713968634605408,
0.04450207203626633,
-0.04520886018872261,
0.01716289483010769,
0.03050833009183407,
-0.10172759741544724,
-0.10640738159418106,
0.04723039269447327,
-0.03156979754567146,
-0.13918228447437286,
0.050137970596551895,
0.12194596976041794,
-0.04348365217447281,
-0.027142653241753578,
-0.017313893884420395,
0.0042525529861450195,
-0.023830898106098175,
0.1824081391096115,
0.04942740872502327,
0.05516374111175537,
-0.10211458057165146,
0.12772446870803833,
0.028773389756679535,
-0.020238829776644707,
0.05143234133720398,
0.08388080447912216,
-0.10372179746627808,
-0.003364975331351161,
0.08311746269464493,
0.13962005078792572,
-0.0580935999751091,
-0.006197922397404909,
-0.10532067716121674,
-0.08432944118976593,
0.04990527033805847,
0.14322270452976227,
0.049340590834617615,
-0.017346030101180077,
-0.05388399586081505,
0.039634399116039276,
-0.14076536893844604,
0.07120853662490845,
0.024500444531440735,
0.06397223472595215,
-0.0770777016878128,
0.060356076806783676,
0.007149709854274988,
0.013911382295191288,
-0.016964256763458252,
0.008382569998502731,
-0.09381502866744995,
-0.01646796055138111,
-0.0815432220697403,
-0.0026612861547619104,
0.00009322731784777716,
0.017208049073815346,
-0.02054382674396038,
-0.07075860351324081,
-0.04900025576353073,
0.03710418567061424,
-0.08772293478250504,
-0.049622759222984314,
0.010070313699543476,
0.0409441739320755,
-0.12281031161546707,
-0.005750461481511593,
0.021027645096182823,
-0.09287437051534653,
0.09511683136224747,
0.07303418964147568,
0.01706923358142376,
0.03092561475932598,
-0.1228371188044548,
-0.0338086374104023,
-0.010279879905283451,
-0.008977432735264301,
0.06268946826457977,
-0.0947350487112999,
-0.009122407995164394,
-0.03766557574272156,
0.07117591053247452,
0.013252802193164825,
0.06896607577800751,
-0.1333470493555069,
0.02033974416553974,
-0.07536985725164413,
-0.046905744820833206,
-0.07565050572156906,
0.03513574227690697,
0.09732209146022797,
0.059952814131975174,
0.1517815887928009,
-0.07704798132181168,
0.023512039333581924,
-0.20623783767223358,
-0.035021647810935974,
-0.006531955674290657,
-0.06096290796995163,
-0.15113787353038788,
-0.047714393585920334,
0.08102891594171524,
-0.03838128224015236,
0.09317617118358612,
-0.01813902147114277,
0.07580281794071198,
0.03752506896853447,
-0.046617548912763596,
-0.051740363240242004,
-0.015439636074006557,
0.19745762646198273,
0.07239901274442673,
-0.01676134020090103,
0.11092983186244965,
0.00192981434520334,
0.029605844989418983,
0.05183299630880356,
0.1811966747045517,
0.21291591227054596,
0.033013127744197845,
0.055527735501527786,
0.0648098737001419,
-0.07442964613437653,
-0.07020580768585205,
0.17952843010425568,
-0.015544122084975243,
0.07026417553424835,
-0.048999588936567307,
0.19653275609016418,
0.10795184969902039,
-0.16733220219612122,
0.04520426318049431,
-0.0438663475215435,
-0.08138258010149002,
-0.12551440298557281,
-0.010936065576970577,
-0.08628612756729126,
-0.12631016969680786,
0.0372982881963253,
-0.11688628047704697,
0.052698057144880295,
0.10935389250516891,
0.013264786452054977,
0.036646079272031784,
0.12680353224277496,
-0.019568726420402527,
0.0036746824625879526,
0.06416486203670502,
0.005230995360761881,
-0.012383067049086094,
-0.035684239119291306,
-0.0787276104092598,
0.04892924055457115,
0.001393522834405303,
0.07734599709510803,
-0.047219350934028625,
-0.015532568097114563,
0.026026686653494835,
-0.027569212019443512,
-0.08013952523469925,
0.027383944019675255,
0.04118905961513519,
0.05636363849043846,
0.04799366369843483,
0.04649130254983902,
-0.009518933482468128,
-0.03309193626046181,
0.3187480568885803,
-0.06645750999450684,
-0.10029507428407669,
-0.12204000353813171,
0.22118043899536133,
0.028708137571811676,
-0.030537685379385948,
0.03353431820869446,
-0.08208640664815903,
-0.009773560799658298,
0.15938952565193176,
0.16835176944732666,
-0.07324180752038956,
-0.022761650383472443,
-0.005674745887517929,
-0.017991285771131516,
-0.03682376816868782,
0.12833799421787262,
0.08885599672794342,
-0.024286756291985512,
-0.06300857663154602,
-0.012828541919589043,
-0.019285133108496666,
-0.03110790252685547,
-0.041067130863666534,
0.04140234738588333,
0.015689609572291374,
-0.024983583018183708,
-0.04148021712899208,
0.07382968813180923,
0.005016830284148455,
-0.25676971673965454,
0.06706243008375168,
-0.15678425133228302,
-0.17133860290050507,
-0.044157326221466064,
0.03730707988142967,
-0.000689864216838032,
0.057447027415037155,
-0.01774260587990284,
0.008178041316568851,
0.07949379086494446,
-0.017701406031847,
-0.030857058241963387,
-0.12511664628982544,
0.12474773079156876,
-0.06587155163288116,
0.1710255742073059,
-0.028189165517687798,
0.04975955933332443,
0.11571578681468964,
0.028363676741719246,
-0.13566257059574127,
0.04167572408914566,
0.05343937501311302,
-0.10601303726434708,
0.015863539651036263,
0.151230588555336,
-0.04617714136838913,
0.09245361387729645,
0.04262680932879448,
-0.10727467387914658,
0.004805786069482565,
-0.055989399552345276,
-0.03422364592552185,
-0.08066266775131226,
-0.014481631107628345,
-0.06030212342739105,
0.16868959367275238,
0.21958155930042267,
-0.030050814151763916,
0.012103961780667305,
-0.10168343037366867,
0.016515936702489853,
0.07044481486082077,
0.02830660343170166,
-0.05754556506872177,
-0.18886534869670868,
0.011601944454014301,
0.06892302632331848,
-0.00716285640373826,
-0.2416258454322815,
-0.07586292177438736,
0.039153747260570526,
-0.03594289720058441,
-0.040373485535383224,
0.10385346412658691,
0.040770433843135834,
0.05249466747045517,
-0.03025597520172596,
-0.1583404392004013,
-0.03172263875603676,
0.15396440029144287,
-0.17495623230934143,
-0.036655623465776443
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-128-finetuned-squad-seed-42
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
{'exact_match': 39.04446546830653, 'f1': 49.90230650794353}
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
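A minimal usage sketch, assuming the checkpoint is available on the Hub under the id shown in this card; the question and context strings are illustrative only. Given the reported SQuAD results (exact_match ≈ 39.0, F1 ≈ 49.9), answers from this few-shot checkpoint should be treated as a baseline rather than a production QA system.

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-42",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This checkpoint is a fine-tuned version of roberta-base "
            "trained on a small sample of the SQuAD dataset.",
)
print(result["answer"], result["score"])
```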
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-128-finetuned-squad-seed-42", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-42
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-128-finetuned-squad-seed-42
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
{'exact_match': 39.04446546830653, 'f1': 49.90230650794353}
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
[
"# roberta-base-few-shot-k-128-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 39.04446546830653, 'f1': 49.90230650794353}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-128-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 39.04446546830653, 'f1': 49.90230650794353}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
52,
46,
6,
12,
8,
3,
104,
33,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-128-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results\n\n{'exact_match': 39.04446546830653, 'f1': 49.90230650794353}### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
-0.09683026373386383,
0.07495033740997314,
-0.0027644417714327574,
0.0821521133184433,
0.14331397414207458,
0.04113554209470749,
0.10319000482559204,
0.12183528393507004,
-0.10625623166561127,
0.049156997352838516,
0.06629222631454468,
0.0790088027715683,
0.047030895948410034,
0.12207284569740295,
-0.043230149894952774,
-0.2594362199306488,
-0.01061440259218216,
-0.004642900079488754,
-0.11639925092458725,
0.11530039459466934,
0.1061239093542099,
-0.09240435063838959,
0.07421748340129852,
0.0058064647018909454,
-0.17529666423797607,
0.03627760335803032,
0.001035825232975185,
-0.041979625821113586,
0.11209416389465332,
-0.0024179783649742603,
0.10164962708950043,
0.012457893230021,
0.13042671978473663,
-0.1707635223865509,
0.015504387207329273,
0.08962677419185638,
0.03731533885002136,
0.10551467537879944,
0.05577554926276207,
-0.03247649222612381,
0.13315530121326447,
-0.1219235435128212,
0.08097949624061584,
0.049164049327373505,
-0.10032746940851212,
-0.2195560783147812,
-0.1113223135471344,
0.03318513557314873,
0.05285964533686638,
0.08529046177864075,
0.01410412136465311,
0.14304909110069275,
-0.0675027146935463,
0.0846589207649231,
0.2528340518474579,
-0.30065563321113586,
-0.08721216022968292,
0.06911230832338333,
0.05360430106520653,
0.0570196658372879,
-0.11087542772293091,
-0.0016684859292581677,
0.01026533916592598,
0.03971705213189125,
0.10085030645132065,
-0.03958170488476753,
-0.11348936706781387,
-0.008093264885246754,
-0.11813132464885712,
0.01338738389313221,
0.08259782940149307,
0.03709530457854271,
-0.06084325909614563,
-0.034771960228681564,
-0.06413744390010834,
-0.07831040769815445,
-0.026477869600057602,
-0.04605768248438835,
0.05576581135392189,
-0.05812658742070198,
-0.10179208219051361,
-0.066741444170475,
-0.05687885358929634,
-0.08794384449720383,
-0.035602569580078125,
0.22104153037071228,
0.030537815764546394,
0.04522472992539406,
-0.048232048749923706,
0.10891380161046982,
-0.0206924881786108,
-0.13012954592704773,
-0.017311228439211845,
-0.006711774971336126,
-0.09564047306776047,
-0.030903691425919533,
-0.06237173080444336,
-0.009213188663125038,
0.008523880504071712,
0.20627263188362122,
-0.09406294673681259,
0.08198410272598267,
0.0395105816423893,
-0.010212847962975502,
-0.05040643736720085,
0.14605319499969482,
-0.05381214618682861,
-0.08255670219659805,
0.007201717235147953,
0.062273766845464706,
0.014660668559372425,
0.005532819777727127,
-0.055598169565200806,
-0.05506889522075653,
0.05132686346769333,
0.041657183319330215,
-0.05149583891034126,
0.033156801015138626,
-0.004877518862485886,
-0.033484138548374176,
0.019789820536971092,
-0.11264557391405106,
0.014054836705327034,
-0.014283332042396069,
-0.10620735585689545,
-0.039968155324459076,
0.03058534860610962,
0.0010713479714468122,
-0.005161052569746971,
0.1084897369146347,
-0.08201730996370316,
0.0017620634753257036,
-0.0908709466457367,
-0.08968950062990189,
-0.016012970358133316,
-0.15641865134239197,
-0.004216338973492384,
-0.05373499169945717,
-0.17082473635673523,
-0.04170358553528786,
0.05076432228088379,
-0.07711844891309738,
-0.017284510657191277,
-0.0044250525534152985,
-0.08083374798297882,
0.015601346269249916,
-0.000891400792170316,
0.22091197967529297,
-0.045204948633909225,
0.07673588395118713,
0.018151091411709785,
0.05560844764113426,
-0.028121596202254295,
0.03338248282670975,
-0.07781149446964264,
0.012366808019578457,
-0.1869659423828125,
0.06265268474817276,
-0.08394725620746613,
0.0026360347401350737,
-0.1369277685880661,
-0.10088492929935455,
-0.010674478486180305,
-0.02037445828318596,
0.09567468613386154,
0.1008254811167717,
-0.15902884304523468,
-0.009495865553617477,
0.10991063714027405,
-0.07643298804759979,
-0.0688413679599762,
0.051064085215330124,
-0.04683392122387886,
0.06079194322228432,
0.03714919835329056,
0.16829411685466766,
0.08530988544225693,
-0.11993720382452011,
0.0026531266048550606,
0.007598105352371931,
0.017411591485142708,
-0.006722957827150822,
0.04838940501213074,
-0.008012650534510612,
0.03759891167283058,
0.01262324396520853,
-0.0678820088505745,
-0.019434576854109764,
-0.09702783823013306,
-0.05842691287398338,
-0.05518316477537155,
-0.09415844827890396,
0.003739100880920887,
0.021468689665198326,
0.035065460950136185,
-0.07731376588344574,
-0.09232427924871445,
0.08547218888998032,
0.1284228265285492,
-0.030774708837270737,
0.021450763568282127,
-0.08138330280780792,
-0.0168266911059618,
-0.019404735416173935,
-0.03471506014466286,
-0.2275509387254715,
-0.08808152377605438,
0.011502603068947792,
-0.03154193237423897,
0.05004153773188591,
-0.00790676660835743,
0.09486962109804153,
0.03310941532254219,
-0.05861695110797882,
0.003119649598374963,
-0.07713747769594193,
-0.01694880612194538,
-0.09937265515327454,
-0.20448750257492065,
-0.08093255758285522,
-0.014195980504155159,
0.1296488493680954,
-0.16840101778507233,
0.004072227980941534,
-0.025178954005241394,
0.13414974510669708,
0.02824467420578003,
-0.05146075412631035,
-0.006512963678687811,
0.032773107290267944,
0.020847972482442856,
-0.09468584507703781,
0.04355674609541893,
0.00906346458941698,
-0.07531481981277466,
-0.02441391348838806,
-0.13178230822086334,
0.032642628997564316,
0.06691056489944458,
0.047592371702194214,
-0.1038631722331047,
0.003585502039641142,
-0.06326841562986374,
-0.04061703756451607,
-0.04917779192328453,
0.03990504890680313,
0.17039690911769867,
0.02334289811551571,
0.11155915260314941,
-0.07123107463121414,
-0.07108954340219498,
0.011646651662886143,
0.00720469793304801,
0.05123605579137802,
0.09285333007574081,
0.0642695501446724,
-0.11097466200590134,
0.061627306044101715,
0.08250582963228226,
-0.0778122991323471,
0.12666016817092896,
-0.03683197498321533,
-0.06722100079059601,
-0.03806965798139572,
-0.02539500594139099,
-0.014012644067406654,
0.14475424587726593,
-0.015725497156381607,
0.033203087747097015,
0.03806529939174652,
0.03273129463195801,
0.02726767584681511,
-0.1740749031305313,
-0.0037858604919165373,
0.00728100910782814,
-0.0496676079928875,
-0.030044380575418472,
0.011455771513283253,
0.03845833241939545,
0.10182694345712662,
0.02802445739507675,
-0.004968919325619936,
-0.0028627358842641115,
-0.007196164224296808,
-0.05242122709751129,
0.20963841676712036,
-0.09653924405574799,
-0.058812204748392105,
-0.09947607666254044,
0.027519606053829193,
-0.04820016399025917,
-0.04631604999303818,
0.007398214656859636,
-0.08230498433113098,
-0.04069798067212105,
-0.06345370411872864,
-0.013553333468735218,
-0.021887417882680893,
0.0015389543259516358,
0.009723486378788948,
0.0030552081298083067,
0.08801258355379105,
-0.13933664560317993,
0.010000421665608883,
-0.0708703026175499,
-0.12173443287611008,
0.011466491967439651,
0.08427219837903976,
0.08260651677846909,
0.07649944722652435,
-0.042404938489198685,
0.021104643121361732,
-0.028910228982567787,
0.2394266128540039,
-0.06758540123701096,
0.018969064578413963,
0.1610306352376938,
0.030972223728895187,
0.053034037351608276,
0.08860429376363754,
0.04448852315545082,
-0.087288998067379,
0.02538706175982952,
0.09573131799697876,
-0.033939678221940994,
-0.27172380685806274,
-0.005411392543464899,
-0.017259448766708374,
-0.10086365789175034,
0.07538699358701706,
0.036038003861904144,
-0.021008776500821114,
0.07093539088964462,
0.0035244461614638567,
-0.004151005297899246,
-0.031962089240550995,
0.07268954068422318,
0.09747131913900375,
0.07000897079706192,
0.12496212869882584,
-0.043284762650728226,
-0.00389271043241024,
0.06846774369478226,
0.015082299709320068,
0.2643793821334839,
-0.04904412105679512,
0.06414848566055298,
0.050093721598386765,
0.13911238312721252,
-0.030092891305685043,
0.0566580556333065,
0.005117675755172968,
-0.01934117265045643,
-0.007439779583364725,
-0.060000088065862656,
-0.016080647706985474,
-0.0008095814846456051,
-0.07352638244628906,
0.05186903476715088,
-0.058346979320049286,
0.05362837016582489,
0.013649821281433105,
0.2929653823375702,
-0.004904154688119888,
-0.27428412437438965,
-0.09174051135778427,
-0.024411197751760483,
-0.01720503717660904,
-0.06431131064891815,
-0.004721763078123331,
0.11773144453763962,
-0.1249227374792099,
0.0684124082326889,
-0.0732313022017479,
0.0907386988401413,
-0.0003236572374589741,
0.0001800468162400648,
0.07664944231510162,
0.18584215641021729,
-0.01970015838742256,
0.0438307523727417,
-0.20631450414657593,
0.23329070210456848,
0.023217979818582535,
0.12206396460533142,
-0.03504890576004982,
0.01054622232913971,
0.03666339069604874,
0.02321206033229828,
0.06604479253292084,
-0.0018669767305254936,
-0.06561153382062912,
-0.13572397828102112,
-0.043026238679885864,
0.058012474328279495,
0.15450994670391083,
-0.02802181802690029,
0.09032079577445984,
-0.04788618162274361,
0.007680526468902826,
0.03209623694419861,
-0.08549033105373383,
-0.14796237647533417,
-0.06910400092601776,
-0.005477454047650099,
0.015307052992284298,
-0.049789752811193466,
-0.06033655256032944,
-0.08141858875751495,
0.008226362057030201,
0.12583187222480774,
-0.01605406031012535,
-0.04068336263298988,
-0.15506938099861145,
0.08017786592245102,
0.16420167684555054,
-0.07033218443393707,
0.029585687443614006,
-0.0006097013829275966,
0.07510533183813095,
0.03705368563532829,
-0.10239510238170624,
0.062277864664793015,
-0.06772974878549576,
-0.1744893491268158,
-0.03402687981724739,
0.10567674040794373,
0.059786852449178696,
0.03900895640254021,
-0.014972629956901073,
0.04890485107898712,
-0.01481675449758768,
-0.10001334547996521,
0.024201957508921623,
-0.0028630790766328573,
0.05500440672039986,
0.05117417126893997,
-0.06556697934865952,
0.04420708492398262,
-0.03139243647456169,
0.004831769969314337,
0.09475262463092804,
0.22585424780845642,
-0.10636699944734573,
0.06224292144179344,
0.03788688778877258,
-0.061912454664707184,
-0.1663103997707367,
0.08576840907335281,
0.1321427971124649,
-0.005432726349681616,
0.07636293768882751,
-0.20252852141857147,
0.14120393991470337,
0.12964923679828644,
-0.02961411513388157,
0.06243027001619339,
-0.25957798957824707,
-0.13643261790275574,
0.05402657762169838,
0.1286718249320984,
0.08185843378305435,
-0.14484627544879913,
-0.026007255539298058,
-0.01437024399638176,
-0.17140266299247742,
0.12566159665584564,
-0.11418573558330536,
0.11859527230262756,
-0.030153000727295876,
0.10856332629919052,
0.017739104107022285,
-0.027161890640854836,
0.11757738888263702,
0.037205636501312256,
0.10978561639785767,
-0.03704783320426941,
0.027138348668813705,
0.060200728476047516,
-0.05187585949897766,
0.0209273099899292,
-0.03555229306221008,
0.0868472158908844,
-0.1240517720580101,
0.012179955840110779,
-0.09808164089918137,
0.05597195029258728,
-0.04442790150642395,
-0.05430617928504944,
-0.027612851932644844,
0.04924270510673523,
0.015999022871255875,
-0.0424584224820137,
0.06776157021522522,
-0.015710661187767982,
0.14214842021465302,
0.09287245571613312,
0.0899890884757042,
-0.012577377259731293,
-0.06586158275604248,
0.01919264905154705,
-0.005768475588411093,
0.059298671782016754,
-0.1391768902540207,
0.014413024298846722,
0.11955448240041733,
0.0657716765999794,
0.0836837887763977,
0.045595817267894745,
-0.0435231551527977,
0.0022011667024344206,
0.042609259486198425,
-0.11437791585922241,
-0.10413751006126404,
0.027332277968525887,
-0.052345383912324905,
-0.12543219327926636,
0.05993613600730896,
0.12470551580190659,
-0.03234226629137993,
-0.02362358197569847,
-0.009553385898470879,
0.00865627359598875,
-0.023136060684919357,
0.20063304901123047,
0.05947958678007126,
0.063052237033844,
-0.10356658697128296,
0.13318303227424622,
0.03310493007302284,
-0.041005585342645645,
0.03315829113125801,
0.11214353889226913,
-0.08802848309278488,
-0.0003966978401876986,
0.06701388210058212,
0.1072394996881485,
-0.09731823951005936,
-0.027803201228380203,
-0.10594809800386429,
-0.09461610019207001,
0.04365646094083786,
0.19592493772506714,
0.05505480244755745,
-0.022519484162330627,
-0.02813779003918171,
0.03507983312010765,
-0.1300373375415802,
0.06079528108239174,
0.0345042422413826,
0.08165702223777771,
-0.09293508529663086,
0.10800404101610184,
0.014283445663750172,
0.03059033676981926,
-0.015680823475122452,
0.017662206664681435,
-0.09295118600130081,
-0.029620597139000893,
-0.11851475387811661,
-0.015239492990076542,
0.0017887785797938704,
0.00833513680845499,
-0.02323967032134533,
-0.07501894235610962,
-0.06934207677841187,
0.05019231140613556,
-0.08696673810482025,
-0.05019199848175049,
0.01411941647529602,
0.015403148718178272,
-0.14332658052444458,
0.004438014235347509,
0.02019612491130829,
-0.07543585449457169,
0.07699210941791534,
0.08228541910648346,
0.033268243074417114,
0.03082393668591976,
-0.12568044662475586,
-0.036820292472839355,
-0.011085583828389645,
-0.005101577378809452,
0.06695917248725891,
-0.0880371555685997,
-0.005622981581836939,
-0.03928080573678017,
0.07928343862295151,
0.007193460129201412,
0.09516437351703644,
-0.13542789220809937,
0.02291698195040226,
-0.06046437472105026,
-0.020928381010890007,
-0.07548784464597702,
0.02918914519250393,
0.1048964336514473,
0.06652901321649551,
0.14531309902668,
-0.07392605394124985,
0.020500250160694122,
-0.2237108200788498,
-0.029541367664933205,
-0.017700709402561188,
-0.06164846941828728,
-0.11248509585857391,
-0.02741655893623829,
0.08561646193265915,
-0.045325540006160736,
0.08078762143850327,
0.0013833320699632168,
0.09073175489902496,
0.045922812074422836,
-0.02864227257668972,
-0.06420929729938507,
-0.010727254673838615,
0.1529632806777954,
0.0557214729487896,
-0.00990376342087984,
0.11987192928791046,
0.023809701204299927,
0.022584890946745872,
0.03411667421460152,
0.2163928896188736,
0.17772066593170166,
-0.015112301334738731,
0.06366536766290665,
0.06542804092168808,
-0.09228815883398056,
-0.08158186823129654,
0.13642075657844543,
-0.02265312895178795,
0.06845154613256454,
-0.054005201905965805,
0.16877107322216034,
0.10433714091777802,
-0.16707640886306763,
0.05651181563735008,
-0.0452946312725544,
-0.07953796535730362,
-0.13896597921848297,
0.011879993602633476,
-0.0749829038977623,
-0.12466804683208466,
0.029060043394565582,
-0.12540102005004883,
0.07594176381826401,
0.13827107846736908,
0.00788513757288456,
0.03417176380753517,
0.14041544497013092,
-0.05718225613236427,
-0.0016817788127809763,
0.04714909568428993,
0.01626015082001686,
0.0023435745388269424,
-0.01646300218999386,
-0.07227329164743423,
0.05586317181587219,
0.014235056936740875,
0.07599407434463501,
-0.055273570120334625,
-0.026715680956840515,
0.016879025846719742,
-0.019407929852604866,
-0.07808591425418854,
0.02110598050057888,
0.04713693633675575,
0.059619076550006866,
0.04897885397076607,
0.04005344957113266,
0.011189418844878674,
-0.04396137595176697,
0.3418039381504059,
-0.07327049225568771,
-0.10381869226694107,
-0.1219564750790596,
0.2699993848800659,
0.03615836054086685,
-0.03186441957950592,
0.045690447092056274,
-0.08866865932941437,
-0.013747052289545536,
0.14451371133327484,
0.15730923414230347,
-0.07933473587036133,
-0.026651382446289062,
-0.007018826436251402,
-0.017344223335385323,
-0.023418935015797615,
0.12657800316810608,
0.09563569724559784,
0.023335613310337067,
-0.07098198682069778,
-0.0019384011393412948,
-0.015073585323989391,
-0.02720450796186924,
-0.05060889199376106,
0.06731603294610977,
0.006525497883558273,
-0.013067391701042652,
-0.020320987328886986,
0.07319450378417969,
0.00692108366638422,
-0.2321661114692688,
0.06673683971166611,
-0.16264884173870087,
-0.18366190791130066,
-0.03229901194572449,
0.05168384313583374,
-0.01303055603057146,
0.05847581475973129,
-0.007739055901765823,
-0.010507943108677864,
0.0862555280327797,
-0.01859297789633274,
-0.03598783537745476,
-0.1439666599035263,
0.10877866297960281,
-0.1158771738409996,
0.19479955732822418,
-0.025640493258833885,
0.05688592046499252,
0.10849116742610931,
0.025895483791828156,
-0.14512860774993896,
0.03262989595532417,
0.043595653027296066,
-0.13710099458694458,
0.01632053405046463,
0.15623965859413147,
-0.03989109396934509,
0.06696870177984238,
0.028993070125579834,
-0.1450299471616745,
-0.01044373121112585,
-0.028632864356040955,
-0.03867968171834946,
-0.05900173261761665,
-0.015529485419392586,
-0.06347420811653137,
0.15729522705078125,
0.22823964059352875,
-0.022619707509875298,
0.02487138658761978,
-0.10561589151620865,
0.01178103405982256,
0.06729321926832199,
0.06704822182655334,
-0.04824874922633171,
-0.19665467739105225,
0.04053124785423279,
0.05135400965809822,
-0.01478610374033451,
-0.22816963493824005,
-0.06289772689342499,
0.04107281193137169,
-0.04936880245804787,
-0.02830391190946102,
0.09340143203735352,
0.04630260542035103,
0.05813615024089813,
-0.024730583652853966,
-0.13792704045772552,
-0.033184148371219635,
0.16266082227230072,
-0.1855178326368332,
-0.047069430351257324
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-128-finetuned-squad-seed-6
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
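As a quick illustration (not part of the auto-generated card), the checkpoint should be loadable through the standard Transformers `question-answering` pipeline; the model id below is taken from this record's metadata, and the question/context strings are placeholders.
```python
# Minimal usage sketch: load this checkpoint with the QA pipeline.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-6",
)

# Placeholder inputs; any (question, context) pair works.
result = qa(
    question="What was roberta-base fine-tuned on?",
    context="This model is a fine-tuned version of roberta-base on the squad dataset.",
)
print(result["answer"], result["score"])
```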
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
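As a hedged sketch, the hyperparameters above map directly onto Transformers `TrainingArguments`; the `output_dir` value is an assumption (the launch script is not included in the card), and the card's "Adam" corresponds to the library's default AdamW implementation.
```python
# Hypothetical reconstruction of the listed hyperparameters; output_dir is
# assumed, not taken from the card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-128-finetuned-squad-seed-6",  # assumption
    learning_rate=3e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```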
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-128-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-128-finetuned-squad-seed-6
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-128-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-128-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-128-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08073229342699051,
0.08282189816236496,
-0.0024415303487330675,
0.08443990349769592,
0.13730266690254211,
0.032723840326070786,
0.10084681212902069,
0.13862013816833496,
-0.11403364688158035,
0.04270759969949722,
0.08809870481491089,
0.0823967307806015,
0.031139317899942398,
0.13587012887001038,
-0.03310273215174675,
-0.25478434562683105,
-0.005657122004777193,
-0.01927383430302143,
-0.09570574015378952,
0.109385184943676,
0.09829937666654587,
-0.10054352879524231,
0.07203478366136551,
-0.014452210627496243,
-0.17978446185588837,
0.017574183642864227,
-0.012424025684595108,
-0.05033361539244652,
0.11862523853778839,
-0.010104126296937466,
0.07654465734958649,
0.00976764690130949,
0.12331180274486542,
-0.18958766758441925,
0.01637813076376915,
0.07552402466535568,
0.046387143433094025,
0.09702137112617493,
0.009250737726688385,
-0.010607706382870674,
0.13214614987373352,
-0.12746481597423553,
0.09937620162963867,
0.031839195638895035,
-0.09524855017662048,
-0.2046143263578415,
-0.09574383497238159,
0.006590524688363075,
0.04455399140715599,
0.08497745543718338,
0.00983912218362093,
0.15947142243385315,
-0.099094457924366,
0.08275600522756577,
0.23068207502365112,
-0.2753770053386688,
-0.07998471707105637,
0.054602157324552536,
0.0601053312420845,
0.08201809227466583,
-0.12687425315380096,
-0.009907092899084091,
0.006969383917748928,
0.02599397487938404,
0.10323719680309296,
-0.03208085894584656,
-0.08873690664768219,
0.0021810410544276237,
-0.10596481710672379,
0.007002983707934618,
0.10891303420066833,
0.034224770963191986,
-0.05250737443566322,
-0.07084819674491882,
-0.043620575219392776,
-0.052902791649103165,
-0.0345546193420887,
-0.02011146768927574,
0.03909013047814369,
-0.05962595343589783,
-0.13817648589611053,
-0.04788706824183464,
-0.0495062917470932,
-0.09063784033060074,
-0.0063610379584133625,
0.21814939379692078,
0.036370035260915756,
0.031627412885427475,
-0.05215178802609444,
0.1026860848069191,
0.012622336857020855,
-0.12735195457935333,
-0.02919606864452362,
-0.0046793632209300995,
-0.0908513069152832,
-0.03640199080109596,
-0.05977465584874153,
0.020362813025712967,
0.0345788300037384,
0.21287748217582703,
-0.04317149147391319,
0.07843250036239624,
0.03296919912099838,
-0.017355145886540413,
-0.02755145914852619,
0.13762010633945465,
-0.020839331671595573,
-0.07265667617321014,
0.011235217563807964,
0.06347998231649399,
0.009507467970252037,
-0.00449464563280344,
-0.06289827078580856,
-0.04000364989042282,
0.06503752619028091,
0.047342244535684586,
-0.06156539171934128,
0.03080659732222557,
-0.00728691928088665,
-0.023413993418216705,
0.001626469544135034,
-0.11521898210048676,
0.01574806310236454,
-0.009339995682239532,
-0.08192568272352219,
-0.04616537690162659,
0.007180133368819952,
-0.011382125318050385,
0.01165984757244587,
0.09652121365070343,
-0.07384316623210907,
-0.024032479152083397,
-0.0805448517203331,
-0.07506664842367172,
-0.016647592186927795,
-0.15739747881889343,
0.019703660160303116,
-0.06219290941953659,
-0.16011656820774078,
-0.030734580010175705,
0.055450040847063065,
-0.08164654672145844,
-0.02836637571454048,
-0.03298836946487427,
-0.07912269979715347,
0.02124280296266079,
0.0022799011785537004,
0.21768876910209656,
-0.04944614693522453,
0.08854977041482925,
0.00928155705332756,
0.057656124234199524,
-0.008902397006750107,
0.03616637736558914,
-0.08671142905950546,
0.009000289253890514,
-0.1776394248008728,
0.07484441250562668,
-0.08045972883701324,
0.017764372751116753,
-0.13986369967460632,
-0.0856785923242569,
-0.01118921022862196,
-0.02201593667268753,
0.07985847443342209,
0.10424529016017914,
-0.14334028959274292,
-0.020965326577425003,
0.11619787663221359,
-0.0659007802605629,
-0.055286988615989685,
0.0561230406165123,
-0.07613406330347061,
0.08634517341852188,
0.0540909506380558,
0.1917358636856079,
0.08742082118988037,
-0.10616259276866913,
0.013397319242358208,
0.013344957493245602,
0.034851450473070145,
0.002055071759968996,
0.04954526573419571,
0.005906227510422468,
0.029594020918011665,
0.01655755564570427,
-0.07150426506996155,
0.0073909396305680275,
-0.09119345992803574,
-0.05913829803466797,
-0.04887382313609123,
-0.08407314121723175,
-0.006192153785377741,
0.013525509275496006,
0.035688988864421844,
-0.08025341480970383,
-0.08313746005296707,
0.07546346634626389,
0.1366281807422638,
-0.04685106873512268,
0.017903609201312065,
-0.07557071000337601,
-0.0036785320844501257,
-0.03244958817958832,
-0.024040963500738144,
-0.20435704290866852,
-0.058274541050195694,
0.032855287194252014,
-0.006670587696135044,
0.0454210489988327,
0.003194990335032344,
0.08474534004926682,
0.02579333260655403,
-0.05471263453364372,
-0.0009098047739826143,
-0.08611422032117844,
-0.007588138338178396,
-0.09489987790584564,
-0.22091127932071686,
-0.053102586418390274,
-0.039052512496709824,
0.13472403585910797,
-0.1658788025379181,
-0.004163431469351053,
-0.01907968334853649,
0.11567448824644089,
0.04238784313201904,
-0.051320720463991165,
-0.00486726826056838,
0.02873154543340206,
0.013053247705101967,
-0.09728791564702988,
0.03879842534661293,
0.015493452548980713,
-0.09251955896615982,
-0.028692469000816345,
-0.10349363088607788,
-0.010904159396886826,
0.07003406435251236,
0.07034566253423691,
-0.10264474153518677,
-0.014201757498085499,
-0.0625266581773758,
-0.028671005740761757,
-0.0538807213306427,
0.039445891976356506,
0.18131904304027557,
0.017532343044877052,
0.10882547497749329,
-0.0747586190700531,
-0.08314426243305206,
0.019453879445791245,
0.008921543136239052,
0.06177708879113197,
0.10247469693422318,
0.07624589651823044,
-0.11068636924028397,
0.05775344371795654,
0.09300586581230164,
-0.05200890079140663,
0.1369590312242508,
-0.047064151614904404,
-0.07317018508911133,
-0.031127529218792915,
-0.00844374019652605,
-0.007131799124181271,
0.15038172900676727,
-0.036975979804992676,
0.017311181873083115,
0.03333597630262375,
0.037751927971839905,
0.004500760696828365,
-0.16202937066555023,
-0.014969034120440483,
0.013671491295099258,
-0.047719940543174744,
-0.023223746567964554,
0.015232997946441174,
0.015946464613080025,
0.09538450092077255,
0.04144958034157753,
-0.0050614881329238415,
0.0071215396746993065,
-0.010126756504178047,
-0.042743980884552,
0.20102450251579285,
-0.0909012034535408,
-0.041698161512613297,
-0.07470357418060303,
-0.001469254377298057,
-0.023645762354135513,
-0.04070484638214111,
0.015363607555627823,
-0.09610413759946823,
-0.0254057627171278,
-0.07177563011646271,
-0.00034804490860551596,
-0.04138655960559845,
0.01480616070330143,
0.003770515788346529,
0.014322346076369286,
0.058975186198949814,
-0.1324211061000824,
0.010184766724705696,
-0.06834106892347336,
-0.11182327568531036,
0.028972379863262177,
0.06096704304218292,
0.08001236617565155,
0.058115117251873016,
-0.0322478711605072,
0.018843475729227066,
-0.04129090905189514,
0.2315138429403305,
-0.07782933115959167,
0.012495960108935833,
0.12337258458137512,
0.027575848624110222,
0.03979838639497757,
0.1054743081331253,
0.032999102026224136,
-0.10007850080728531,
0.03917371854186058,
0.0774763822555542,
-0.04104842245578766,
-0.24565570056438446,
0.00777687830850482,
-0.03845309093594551,
-0.09616419672966003,
0.08855560421943665,
0.050845809280872345,
-0.04502655938267708,
0.06469695270061493,
0.010863439179956913,
0.00997143518179655,
-0.0243564173579216,
0.08817005902528763,
0.09001114219427109,
0.06647384166717529,
0.10796932876110077,
-0.03858331963419914,
-0.017580214887857437,
0.06323809921741486,
0.0291727464646101,
0.3049585819244385,
-0.04870744049549103,
0.0851709321141243,
0.04925566166639328,
0.13972100615501404,
-0.02191426046192646,
0.04643171653151512,
0.007583585102111101,
-0.006929642055183649,
-0.0300157330930233,
-0.053787875920534134,
-0.01896171271800995,
0.0020423477981239557,
-0.07920004427433014,
0.04558601975440979,
-0.05248653143644333,
0.05068083480000496,
0.018715063109993935,
0.29124167561531067,
0.0028523923829197884,
-0.2628921866416931,
-0.09363379329442978,
-0.01574154756963253,
-0.037671539932489395,
-0.051993176341056824,
0.011760023422539234,
0.12103404104709625,
-0.12321042269468307,
0.036863502115011215,
-0.0712369903922081,
0.08059628307819366,
-0.028476789593696594,
-0.0018964930204674602,
0.04207388684153557,
0.17277362942695618,
-0.02149340882897377,
0.055900588631629944,
-0.22348713874816895,
0.22906705737113953,
0.014574223197996616,
0.12856799364089966,
-0.058685220777988434,
0.008590915240347385,
0.025992166250944138,
0.003563478123396635,
0.08897340297698975,
-0.004122155252844095,
-0.06637568771839142,
-0.13563665747642517,
-0.05177034065127373,
0.07409005612134933,
0.14343684911727905,
-0.041190437972545624,
0.09691140800714493,
-0.05754305422306061,
0.014531799592077732,
0.03750166296958923,
-0.08849143236875534,
-0.13430148363113403,
-0.09783046692609787,
-0.022862309589982033,
0.014388060197234154,
-0.07218361645936966,
-0.05650560185313225,
-0.06924153864383698,
0.03185891732573509,
0.10445856302976608,
0.0164363794028759,
-0.031234633177518845,
-0.14713861048221588,
0.07722922414541245,
0.15624438226222992,
-0.06719274818897247,
0.025647228583693504,
-0.003613509237766266,
0.0747087374329567,
0.038440559059381485,
-0.08287703990936279,
0.06148506700992584,
-0.06505095213651657,
-0.1789296716451645,
-0.04836604371666908,
0.09442305564880371,
0.07029199600219727,
0.041743963956832886,
-0.0030148623045533895,
0.05248650908470154,
-0.02585335075855255,
-0.0934484675526619,
0.024981413036584854,
0.020357750356197357,
0.03647218272089958,
0.0363583043217659,
-0.0859321728348732,
0.07628215849399567,
-0.03773124888539314,
-0.013737511821091175,
0.11331755667924881,
0.23158566653728485,
-0.10109370946884155,
0.09800857305526733,
0.06156129762530327,
-0.06182054430246353,
-0.16078391671180725,
0.07203347980976105,
0.10373176634311676,
0.006923883222043514,
0.07128243893384933,
-0.2103153020143509,
0.12971794605255127,
0.10250301659107208,
-0.016521774232387543,
0.04043104872107506,
-0.2720085382461548,
-0.11961539089679718,
0.04437718540430069,
0.12927034497261047,
0.0965636596083641,
-0.12462622672319412,
-0.013629334047436714,
-0.015627797693014145,
-0.11517715454101562,
0.09438654035329819,
-0.11346440017223358,
0.1357993483543396,
-0.029317017644643784,
0.11172646284103394,
0.010499415919184685,
-0.02612215094268322,
0.10179606080055237,
0.047766584903001785,
0.10244906693696976,
-0.04227902367711067,
0.0041983360424637794,
0.06294967979192734,
-0.04831613600254059,
0.0026700205635279417,
-0.07611008733510971,
0.0877053365111351,
-0.13062678277492523,
-0.0031409154180437326,
-0.09217759221792221,
0.04595837742090225,
-0.03893938660621643,
-0.06853377819061279,
-0.04139888286590576,
0.05693964660167694,
0.04806315526366234,
-0.036170877516269684,
0.04661446064710617,
-0.018205193802714348,
0.10404175519943237,
0.03718053549528122,
0.08706415444612503,
0.013775210827589035,
-0.04518347233533859,
0.023263486102223396,
-0.009843643754720688,
0.06411907076835632,
-0.16517230868339539,
0.010957900434732437,
0.0997346043586731,
0.060195617377758026,
0.09702549874782562,
0.044957105070352554,
-0.0458296574652195,
0.01718461513519287,
0.029729878529906273,
-0.10105209797620773,
-0.1071465015411377,
0.04742395877838135,
-0.028269266709685326,
-0.1398937851190567,
0.050316184759140015,
0.1206163838505745,
-0.0442715547978878,
-0.027241874486207962,
-0.017215417698025703,
0.004657536745071411,
-0.023303283378481865,
0.18306347727775574,
0.04944521188735962,
0.0557570643723011,
-0.10192904621362686,
0.12813478708267212,
0.028519509360194206,
-0.020808247849345207,
0.05148843303322792,
0.08362920582294464,
-0.1031617522239685,
-0.0031953901052474976,
0.0841686874628067,
0.13823556900024414,
-0.0578421913087368,
-0.004989957436919212,
-0.10451634228229523,
-0.08406060189008713,
0.0502169132232666,
0.14411868155002594,
0.049168895930051804,
-0.017115110531449318,
-0.05366640165448189,
0.04001390188932419,
-0.1412208527326584,
0.0713297426700592,
0.02416776306927204,
0.06409647315740585,
-0.07667587697505951,
0.05920160189270973,
0.007361515425145626,
0.014171329326927662,
-0.016738824546337128,
0.00871706660836935,
-0.09350232034921646,
-0.01650485396385193,
-0.07960931956768036,
-0.0028125422541052103,
0.00010747152555268258,
0.016670240089297295,
-0.02097717672586441,
-0.07103876769542694,
-0.04813902825117111,
0.03719152882695198,
-0.08796966075897217,
-0.05005980655550957,
0.009473428130149841,
0.040643706917762756,
-0.12314473092556,
-0.005598185583949089,
0.021377364173531532,
-0.09261573106050491,
0.09434836357831955,
0.07258196920156479,
0.017383117228746414,
0.031199883669614792,
-0.12459225952625275,
-0.033508531749248505,
-0.01007008831948042,
-0.008898003958165646,
0.0624430850148201,
-0.09424379467964172,
-0.009413322433829308,
-0.03778665512800217,
0.07121458649635315,
0.012883033603429794,
0.06820745021104813,
-0.13392913341522217,
0.019525442272424698,
-0.07634413987398148,
-0.04756360128521919,
-0.07506407797336578,
0.03550579771399498,
0.0974075123667717,
0.06009819731116295,
0.15157659351825714,
-0.07679159939289093,
0.024163737893104553,
-0.20644603669643402,
-0.034880440682172775,
-0.0064193750731647015,
-0.06086516007781029,
-0.15145868062973022,
-0.04707341641187668,
0.08124496787786484,
-0.038177162408828735,
0.09095039963722229,
-0.018024617806077003,
0.07632923871278763,
0.03765039145946503,
-0.04626261815428734,
-0.05185459926724434,
-0.015582817606627941,
0.19756343960762024,
0.0723080039024353,
-0.016343429684638977,
0.11238711327314377,
0.0019132968736812472,
0.029435276985168457,
0.05290643870830536,
0.18272502720355988,
0.21372006833553314,
0.03188936039805412,
0.055742230266332626,
0.06478888541460037,
-0.07489743083715439,
-0.07014954835176468,
0.17955443263053894,
-0.014988112263381481,
0.0700579509139061,
-0.04907330125570297,
0.19708316028118134,
0.10809314996004105,
-0.1672007292509079,
0.04563020542263985,
-0.04410691559314728,
-0.08112822473049164,
-0.1255199909210205,
-0.010898929089307785,
-0.08619049936532974,
-0.12625442445278168,
0.03773486614227295,
-0.11718299984931946,
0.0531674325466156,
0.10973428934812546,
0.013174057938158512,
0.03677988424897194,
0.12745000422000885,
-0.018933365121483803,
0.0035160137340426445,
0.06441017240285873,
0.005176191683858633,
-0.01248262356966734,
-0.03595054894685745,
-0.07856319844722748,
0.0501289963722229,
0.0014184159226715565,
0.07753732800483704,
-0.047152116894721985,
-0.016300901770591736,
0.02575237676501274,
-0.027332661673426628,
-0.08035251498222351,
0.027525627985596657,
0.04130861163139343,
0.05639851093292236,
0.04859277978539467,
0.04634465277194977,
-0.008832654915750027,
-0.03323245421051979,
0.3198454678058624,
-0.06675674021244049,
-0.09984426945447922,
-0.12115282565355301,
0.2217135727405548,
0.02962491102516651,
-0.030524110421538353,
0.03350114822387695,
-0.08276413381099701,
-0.0100486995652318,
0.1581011861562729,
0.1665382981300354,
-0.07214049994945526,
-0.022530794143676758,
-0.005907644517719746,
-0.017960714176297188,
-0.03658943623304367,
0.12818914651870728,
0.08914346247911453,
-0.023897001519799232,
-0.06330730766057968,
-0.012705602683126926,
-0.019229186698794365,
-0.03170512616634369,
-0.04021485894918442,
0.04134335368871689,
0.01598099246621132,
-0.02523595094680786,
-0.04160207137465477,
0.07387145608663559,
0.005128599237650633,
-0.2562701404094696,
0.06642013788223267,
-0.15757527947425842,
-0.17095205187797546,
-0.04409563168883324,
0.03699590265750885,
-0.0006808391190133989,
0.05751332640647888,
-0.01772194541990757,
0.008754495531320572,
0.07792546600103378,
-0.017567027360200882,
-0.031223680824041367,
-0.1258479356765747,
0.12421957403421402,
-0.06619187444448471,
0.17132435739040375,
-0.02829854190349579,
0.04902768135070801,
0.11558302491903305,
0.028585214167833328,
-0.1363046020269394,
0.041523244231939316,
0.0536215640604496,
-0.10619659721851349,
0.015650827437639236,
0.1520717740058899,
-0.04612116888165474,
0.0926055759191513,
0.04229074716567993,
-0.10895620286464691,
0.005091724917292595,
-0.05656895413994789,
-0.03360709175467491,
-0.08118242770433426,
-0.013055120594799519,
-0.06019951403141022,
0.1688179075717926,
0.2198750376701355,
-0.03002629242837429,
0.011521010659635067,
-0.10200022161006927,
0.01625567488372326,
0.06963881850242615,
0.029215266928076744,
-0.0572354681789875,
-0.1888507604598999,
0.011757811531424522,
0.06880400329828262,
-0.0073136924766004086,
-0.24157968163490295,
-0.07551239430904388,
0.039368126541376114,
-0.03692300617694855,
-0.04048380255699158,
0.10323195904493332,
0.04060003533959389,
0.05204012617468834,
-0.03028942085802555,
-0.16001006960868835,
-0.03152058273553848,
0.1543722152709961,
-0.17508509755134583,
-0.036296140402555466
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-128-finetuned-squad-seed-8
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
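The `few-shot-k-128 ... seed-8` naming suggests this run was trained on 128 sampled SQuAD examples, with the seed in the name presumably governing the subsample (distinct from the training seed 42 listed below); that reading is an assumption, and the sketch below only illustrates it.
```python
# Hedged sketch of the implied few-shot subsampling; the actual sampling
# code used for this run is not part of the card.
from datasets import load_dataset

k, sampling_seed = 128, 8  # from the model name; interpretation assumed
squad_train = load_dataset("squad", split="train")
few_shot_train = squad_train.shuffle(seed=sampling_seed).select(range(k))
print(len(few_shot_train))  # 128 examples
```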
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-128-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-128-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-128-finetuned-squad-seed-8
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-128-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-128-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-128-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08128967136144638,
0.08359725773334503,
-0.0024582992773503065,
0.08485375344753265,
0.13737520575523376,
0.0331815630197525,
0.100110724568367,
0.13857029378414154,
-0.11415614187717438,
0.04244435206055641,
0.08820167183876038,
0.08194875717163086,
0.031014369800686836,
0.1353217512369156,
-0.03347517549991608,
-0.2546077370643616,
-0.0063606384210288525,
-0.019146421924233437,
-0.0944574624300003,
0.10899489372968674,
0.09839246422052383,
-0.10068284720182419,
0.07207230478525162,
-0.014728403650224209,
-0.17932240664958954,
0.017404137179255486,
-0.01203838735818863,
-0.05044900253415108,
0.11891387403011322,
-0.009521746076643467,
0.07652470469474792,
0.008993725292384624,
0.12355496734380722,
-0.19006004929542542,
0.016232328489422798,
0.07574444264173508,
0.04639199376106262,
0.09705323725938797,
0.008332588709890842,
-0.009949645958840847,
0.13168928027153015,
-0.12787172198295593,
0.09942259639501572,
0.031606320291757584,
-0.09497137367725372,
-0.2049909085035324,
-0.09528597444295883,
0.007513225544244051,
0.04439370334148407,
0.08434940874576569,
0.010680735111236572,
0.1599687933921814,
-0.09805377572774887,
0.08328841626644135,
0.2312125712633133,
-0.27486392855644226,
-0.0793096125125885,
0.054004646837711334,
0.059926584362983704,
0.0831507220864296,
-0.12605831027030945,
-0.009741170331835747,
0.006837903521955013,
0.0257697906345129,
0.10374238342046738,
-0.03270593285560608,
-0.09029167890548706,
0.0021489288192242384,
-0.1056719422340393,
0.006782286334782839,
0.10927563160657883,
0.03449433296918869,
-0.052727457135915756,
-0.07043082267045975,
-0.044020600616931915,
-0.05362807959318161,
-0.03494948521256447,
-0.020180944353342056,
0.03880031779408455,
-0.05947800725698471,
-0.13714011013507843,
-0.04826834797859192,
-0.04906662926077843,
-0.08978573232889175,
-0.0059456913731992245,
0.2181154489517212,
0.036630697548389435,
0.03148128464818001,
-0.051402319222688675,
0.1025368869304657,
0.01109606958925724,
-0.12718448042869568,
-0.02884601801633835,
-0.0042526922188699245,
-0.09092282503843307,
-0.036570508033037186,
-0.059820786118507385,
0.01938161626458168,
0.034476298838853836,
0.21386249363422394,
-0.04289982467889786,
0.07823485136032104,
0.033152781426906586,
-0.016803525388240814,
-0.027465814724564552,
0.13840387761592865,
-0.0212686937302351,
-0.07375848293304443,
0.012248607352375984,
0.06322550028562546,
0.010097017511725426,
-0.005076112225651741,
-0.06384793668985367,
-0.04039343073964119,
0.0652080774307251,
0.04713521525263786,
-0.06154746189713478,
0.030359044671058655,
-0.007302920334041119,
-0.023530617356300354,
0.0019111180445179343,
-0.11527163535356522,
0.01594988815486431,
-0.009338456206023693,
-0.08185187727212906,
-0.0458153672516346,
0.007816643454134464,
-0.010647074319422245,
0.011615239083766937,
0.09600386768579483,
-0.07306469976902008,
-0.023540474474430084,
-0.0804675742983818,
-0.07518786191940308,
-0.016484707593917847,
-0.15692494809627533,
0.020169196650385857,
-0.06322217732667923,
-0.16038092970848083,
-0.03056364692747593,
0.055569231510162354,
-0.08110414445400238,
-0.02821592427790165,
-0.0323856882750988,
-0.07825178653001785,
0.020825505256652832,
0.0020584187004715204,
0.21660777926445007,
-0.04963644593954086,
0.08818384259939194,
0.009080047719180584,
0.057216826826334,
-0.009939892217516899,
0.03636988624930382,
-0.08604926615953445,
0.008946949616074562,
-0.17787553369998932,
0.074872687458992,
-0.07999255508184433,
0.018288254737854004,
-0.13883425295352936,
-0.0855223536491394,
-0.010057277046144009,
-0.021550046280026436,
0.08021209388971329,
0.10334347933530807,
-0.14349322021007538,
-0.0207439623773098,
0.11559391021728516,
-0.06520724296569824,
-0.05545201897621155,
0.05750003084540367,
-0.07624748349189758,
0.08601631224155426,
0.05456145480275154,
0.191519632935524,
0.0884065330028534,
-0.10605736076831818,
0.01460630539804697,
0.014505266211926937,
0.035415299236774445,
0.001118829706683755,
0.04937010258436203,
0.006016705185174942,
0.029725022614002228,
0.01674235798418522,
-0.07079122215509415,
0.007922121323645115,
-0.09120369702577591,
-0.059493135660886765,
-0.04913411661982536,
-0.08411984890699387,
-0.006935108918696642,
0.013786436058580875,
0.03570264205336571,
-0.0800437331199646,
-0.08342195302248001,
0.0764516144990921,
0.1366519331932068,
-0.04710637032985687,
0.018129337579011917,
-0.0750972330570221,
-0.0030896218959242105,
-0.03130365535616875,
-0.024107813835144043,
-0.20403090119361877,
-0.057061657309532166,
0.032770924270153046,
-0.006795873399823904,
0.045108016580343246,
0.0033149663358926773,
0.08386719226837158,
0.026264062151312828,
-0.05425536632537842,
1.6383120282625896e-7,
-0.08556237816810608,
-0.0072560301050543785,
-0.09566410630941391,
-0.22035561501979828,
-0.05308375135064125,
-0.03862305358052254,
0.13524681329727173,
-0.1668984591960907,
-0.003832095768302679,
-0.01996796578168869,
0.11518833041191101,
0.04197513312101364,
-0.05126403272151947,
-0.005066609941422939,
0.029356444254517555,
0.01284276694059372,
-0.09747027605772018,
0.03880513831973076,
0.016042539849877357,
-0.09325452893972397,
-0.029726136475801468,
-0.10343585163354874,
-0.01036168448626995,
0.06956624239683151,
0.06911826133728027,
-0.10267907381057739,
-0.014774397015571594,
-0.062082257121801376,
-0.02859107404947281,
-0.05332037806510925,
0.038645241409540176,
0.18199722468852997,
0.017729923129081726,
0.11011932790279388,
-0.07458572089672089,
-0.08276649564504623,
0.01923074759542942,
0.008875187486410141,
0.06248939782381058,
0.10218621045351028,
0.0760665237903595,
-0.1089041531085968,
0.057154007256031036,
0.09225304424762726,
-0.053133610635995865,
0.13607437908649445,
-0.046820271760225296,
-0.0728486031293869,
-0.031280517578125,
-0.008610423654317856,
-0.007089048624038696,
0.1501912623643875,
-0.03817034140229225,
0.016655344516038895,
0.033459827303886414,
0.03734787181019783,
0.0047146049328148365,
-0.16207255423069,
-0.014942223206162453,
0.014304662123322487,
-0.047323618084192276,
-0.02211005613207817,
0.014583409763872623,
0.01577644981443882,
0.09505997598171234,
0.04123039171099663,
-0.005446098279207945,
0.0076025002636015415,
-0.010075210593640804,
-0.04290972650051117,
0.20058587193489075,
-0.09104293584823608,
-0.04268449544906616,
-0.07577753067016602,
-0.0013272261712700129,
-0.023721732199192047,
-0.04091835767030716,
0.015634896233677864,
-0.09461688250303268,
-0.025434769690036774,
-0.07218845188617706,
-0.0015559043968096375,
-0.041237011551856995,
0.014571844600141048,
0.00385991670191288,
0.013996554538607597,
0.05937696620821953,
-0.13232754170894623,
0.01007819827646017,
-0.06774292886257172,
-0.11124475300312042,
0.02915153093636036,
0.06114970147609711,
0.08035416156053543,
0.05880436673760414,
-0.03277727589011192,
0.018512580543756485,
-0.04087728261947632,
0.23121806979179382,
-0.07738129794597626,
0.012744567357003689,
0.12343834340572357,
0.02659354731440544,
0.04000183194875717,
0.10482034087181091,
0.03343817964196205,
-0.10008691251277924,
0.038929879665374756,
0.07689563930034637,
-0.041392069309949875,
-0.24511198699474335,
0.007708874065428972,
-0.03864168003201485,
-0.09601522237062454,
0.08828505873680115,
0.05083800479769707,
-0.0444876104593277,
0.06479355692863464,
0.01124754548072815,
0.011250853538513184,
-0.02514900080859661,
0.08792468905448914,
0.0905207172036171,
0.06601183116436005,
0.10763909667730331,
-0.03829342871904373,
-0.01735144481062889,
0.06360824406147003,
0.028350830078125,
0.30342501401901245,
-0.04889874532818794,
0.08571915328502655,
0.04870371147990227,
0.14021140336990356,
-0.022074338048696518,
0.04591662436723709,
0.007226202171295881,
-0.007117546629160643,
-0.03015696071088314,
-0.05380445718765259,
-0.0199327003210783,
0.0025234350468963385,
-0.07979944348335266,
0.046203624457120895,
-0.052497491240501404,
0.051203809678554535,
0.01845409721136093,
0.29109424352645874,
0.002637606579810381,
-0.2626953721046448,
-0.0937444269657135,
-0.015866117551922798,
-0.03782772272825241,
-0.05233890563249588,
0.011828897520899773,
0.12171074002981186,
-0.12313251197338104,
0.036184974014759064,
-0.07063547521829605,
0.08080989122390747,
-0.0295421089977026,
-0.0016314660897478461,
0.041658900678157806,
0.17319847643375397,
-0.021418171003460884,
0.05601280555129051,
-0.22446346282958984,
0.22799591720104218,
0.014789595268666744,
0.12865473330020905,
-0.05867135897278786,
0.009041893295943737,
0.026016460731625557,
0.005133417434990406,
0.08824120461940765,
-0.003805422456935048,
-0.06541753560304642,
-0.13682246208190918,
-0.052099812775850296,
0.07399440556764603,
0.14246919751167297,
-0.0398205928504467,
0.09649576991796494,
-0.05784020945429802,
0.014625485055148602,
0.037465259432792664,
-0.08788664638996124,
-0.13427706062793732,
-0.0980578288435936,
-0.023271845653653145,
0.015205278992652893,
-0.07188135385513306,
-0.05637663975358009,
-0.0688961073756218,
0.0302828811109066,
0.1047566756606102,
0.01690082997083664,
-0.031310517340898514,
-0.14707423746585846,
0.07789915055036545,
0.15553857386112213,
-0.06739865243434906,
0.02583380602300167,
-0.00367863685823977,
0.07462801784276962,
0.0385269932448864,
-0.08216391503810883,
0.06134210154414177,
-0.06494738161563873,
-0.17849700152873993,
-0.048600759357213974,
0.09384465962648392,
0.07022815942764282,
0.041830163449048996,
-0.002756374189630151,
0.052204594016075134,
-0.026163993403315544,
-0.0934029370546341,
0.02387627772986889,
0.021305643022060394,
0.036087516695261,
0.03667560592293739,
-0.08612184226512909,
0.07711058109998703,
-0.03750529885292053,
-0.013993971049785614,
0.11433296650648117,
0.23120583593845367,
-0.10144384950399399,
0.09725628793239594,
0.06184433028101921,
-0.06200971454381943,
-0.16022662818431854,
0.07204761356115341,
0.10331054776906967,
0.007053372450172901,
0.07151934504508972,
-0.20979060232639313,
0.12994857132434845,
0.1035279780626297,
-0.016266513615846634,
0.040495507419109344,
-0.2721276581287384,
-0.11982512474060059,
0.04515429958701134,
0.12931711971759796,
0.09790465235710144,
-0.12523211538791656,
-0.013605804182589054,
-0.016238270327448845,
-0.11620611697435379,
0.09331651777029037,
-0.11287322640419006,
0.13558706641197205,
-0.029175443574786186,
0.11071385443210602,
0.01060877088457346,
-0.02615487203001976,
0.1022498607635498,
0.048241183161735535,
0.10245294123888016,
-0.04270024597644806,
0.004448867402970791,
0.06232969090342522,
-0.04848715662956238,
0.0031088278628885746,
-0.0759044662117958,
0.08777705579996109,
-0.13226816058158875,
-0.0032180913258343935,
-0.09105737507343292,
0.04615001007914543,
-0.038796115666627884,
-0.06860359758138657,
-0.0414358526468277,
0.056201644241809845,
0.0478515625,
-0.03599739447236061,
0.04759759455919266,
-0.01812633126974106,
0.10352932661771774,
0.03791295364499092,
0.08595115691423416,
0.011340372264385223,
-0.045283399522304535,
0.02317856065928936,
-0.009855040349066257,
0.06405142694711685,
-0.165140300989151,
0.011539972387254238,
0.10009744018316269,
0.0603756457567215,
0.09763384610414505,
0.043972235172986984,
-0.04528898000717163,
0.017452208325266838,
0.029358699917793274,
-0.10068237036466599,
-0.10627312958240509,
0.04703608155250549,
-0.029004091396927834,
-0.1395203322172165,
0.049515146762132645,
0.12090004980564117,
-0.0449637696146965,
-0.02679520845413208,
-0.017470311373472214,
0.003853950649499893,
-0.023415766656398773,
0.1824398636817932,
0.05026554688811302,
0.05546226352453232,
-0.10180269926786423,
0.12784382700920105,
0.028894372284412384,
-0.019803928211331367,
0.05158865079283714,
0.08357825130224228,
-0.10321535170078278,
-0.0034168134443461895,
0.08431532979011536,
0.1383044421672821,
-0.05858122557401657,
-0.0058610341511666775,
-0.1049441322684288,
-0.0832103043794632,
0.049937304109334946,
0.14341579377651215,
0.049502983689308167,
-0.017381422221660614,
-0.05385298654437065,
0.03959984332323074,
-0.14121723175048828,
0.0712113305926323,
0.02416878752410412,
0.06438292562961578,
-0.07687097787857056,
0.06080766022205353,
0.007858914323151112,
0.01430421881377697,
-0.01662829890847206,
0.00846023578196764,
-0.09342166781425476,
-0.016249172389507294,
-0.08135771751403809,
-0.0025559680070728064,
0.0004806222568731755,
0.016971206292510033,
-0.020967785269021988,
-0.07094001770019531,
-0.04801660031080246,
0.03725504130125046,
-0.08754171431064606,
-0.04997008293867111,
0.009804143570363522,
0.04056931659579277,
-0.12263448536396027,
-0.00594030087813735,
0.021035533398389816,
-0.09226342290639877,
0.09462518990039825,
0.07193277776241302,
0.01710989885032177,
0.031041566282510757,
-0.1251053661108017,
-0.03323066607117653,
-0.010083126835525036,
-0.008200501091778278,
0.06260376423597336,
-0.09296772629022598,
-0.008920307271182537,
-0.03758835047483444,
0.07101664692163467,
0.012643113732337952,
0.06830380111932755,
-0.1341119110584259,
0.019641390070319176,
-0.07622026652097702,
-0.04752585291862488,
-0.07543980330228806,
0.03533313050866127,
0.0968935564160347,
0.05965500324964523,
0.1513255536556244,
-0.07663944363594055,
0.02400342747569084,
-0.20639029145240784,
-0.035073794424533844,
-0.006313784047961235,
-0.06068209558725357,
-0.15149402618408203,
-0.04801106080412865,
0.08109497278928757,
-0.03785211965441704,
0.09248364716768265,
-0.01774914562702179,
0.07658669352531433,
0.037408459931612015,
-0.046877723187208176,
-0.05162845551967621,
-0.016006452962756157,
0.1977606564760208,
0.07283982634544373,
-0.01616096682846546,
0.11163633316755295,
0.002153022913262248,
0.030331604182720184,
0.05235985293984413,
0.1810900717973709,
0.2138766199350357,
0.03142125532031059,
0.05553237348794937,
0.06456201523542404,
-0.07462836802005768,
-0.07034756243228912,
0.179053395986557,
-0.015124223195016384,
0.07075832039117813,
-0.0492112971842289,
0.19743488729000092,
0.10779597610235214,
-0.16711562871932983,
0.04552865028381348,
-0.0435698926448822,
-0.08129672706127167,
-0.12565451860427856,
-0.010230143554508686,
-0.08626147359609604,
-0.12617096304893494,
0.037203967571258545,
-0.11670857667922974,
0.053328946232795715,
0.11035821586847305,
0.012922507710754871,
0.03658745065331459,
0.126522958278656,
-0.018950264900922775,
0.0036222038324922323,
0.06440772116184235,
0.005181069951504469,
-0.01218409277498722,
-0.037084367126226425,
-0.07918856292963028,
0.04984838142991066,
0.0015877321129664779,
0.0779549777507782,
-0.04728524759411812,
-0.014996654354035854,
0.02612902782857418,
-0.027219559997320175,
-0.08060433715581894,
0.027592694386839867,
0.04067596420645714,
0.056311991065740585,
0.04747188836336136,
0.04663584753870964,
-0.008772328495979309,
-0.03335937485098839,
0.3192375898361206,
-0.06629769504070282,
-0.10019999742507935,
-0.12094305455684662,
0.2209051102399826,
0.029837308451533318,
-0.03052767924964428,
0.03346749395132065,
-0.0828702375292778,
-0.009248404763638973,
0.15857815742492676,
0.16589879989624023,
-0.072990283370018,
-0.022485798224806786,
-0.005639892071485519,
-0.018077697604894638,
-0.037448689341545105,
0.12862277030944824,
0.08940702676773071,
-0.025006869807839394,
-0.06238722428679466,
-0.01287445891648531,
-0.019343659281730652,
-0.03158276900649071,
-0.04139260575175285,
0.04067352041602135,
0.015896977856755257,
-0.024677179753780365,
-0.04115280508995056,
0.07348956912755966,
0.005700216628611088,
-0.2573004364967346,
0.06634379923343658,
-0.15753182768821716,
-0.17105737328529358,
-0.044188737869262695,
0.03723384439945221,
-0.00008182684541679919,
0.05680076405405998,
-0.017603576183319092,
0.00912314560264349,
0.07883697003126144,
-0.017951659858226776,
-0.03135204687714577,
-0.125352680683136,
0.12389285862445831,
-0.06562238186597824,
0.1712745577096939,
-0.028239339590072632,
0.0493302196264267,
0.11528794467449188,
0.028752008453011513,
-0.13546699285507202,
0.04170512035489082,
0.05335265025496483,
-0.10561179369688034,
0.0156415868550539,
0.1509256213903427,
-0.04602878540754318,
0.09246711432933807,
0.042681530117988586,
-0.10845036059617996,
0.004606799688190222,
-0.0574222207069397,
-0.03407584875822067,
-0.08088939636945724,
-0.01410111878067255,
-0.05993427336215973,
0.16899573802947998,
0.2195274382829666,
-0.0298863984644413,
0.011244122870266438,
-0.10218600928783417,
0.01587303914129734,
0.0699932724237442,
0.029394790530204773,
-0.0574813075363636,
-0.18874892592430115,
0.01229208055883646,
0.06848310679197311,
-0.0072021777741611,
-0.24098728597164154,
-0.0759255588054657,
0.03920120745897293,
-0.037018779665231705,
-0.04016116261482239,
0.10334458947181702,
0.04092060402035713,
0.05219609662890434,
-0.030341938138008118,
-0.1601894199848175,
-0.032015491276979446,
0.15444765985012054,
-0.17496028542518616,
-0.03648695349693298
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-16-finetuned-squad-seed-0
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
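Besides the pipeline, the checkpoint can presumably also be driven through the Auto classes; the model id comes from this record's metadata, and the question/context pair is a placeholder.
```python
# Sketch: manual question answering with the Auto classes.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

name = "anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-0"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForQuestionAnswering.from_pretrained(name)

inputs = tokenizer(
    "How many training steps were used?",
    "Training ran for 200 steps with a linear schedule.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```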
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
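For intuition (an illustrative assumption, not code from the card): with `lr_scheduler_warmup_ratio: 0.1` over 200 total steps, a linear schedule warms the learning rate up for the first 20 steps and then decays it linearly to zero; the stand-in model and optimizer below are placeholders.
```python
# Sketch of the linear schedule implied by warmup_ratio=0.1 and 200 steps.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(768, 2)  # stand-in module, for illustration only
optimizer = torch.optim.AdamW(
    model.parameters(), lr=3e-05, betas=(0.9, 0.999), eps=1e-08
)
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * 200),  # 20 warmup steps
    num_training_steps=200,
)
for _ in range(200):
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())  # ~[0.0] once all 200 steps have elapsed
```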
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-16-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-16-finetuned-squad-seed-0
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-16-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-16-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
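The `tokens_length` list above appears to pair one count with each `processed_texts` entry. A hedged sketch of how such counts could be recomputed; the dump does not name the tokenizer or say whether special tokens are counted, so `roberta-base` without special tokens is an assumption and the exact numbers may differ:

```python
from transformers import AutoTokenizer

# Assumption: roberta-base tokenizer, special tokens excluded. Illustrative
# only -- the dump never states how tokens_length was produced.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

processed_texts = [
    "## Model description\n\nMore information needed",
    "### Training results",
]
counts = [
    len(tokenizer(text, add_special_tokens=False)["input_ids"])
    for text in processed_texts
]
print(counts)  # one integer per processed_texts entry
```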
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-16-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08850991725921631,
0.09592525660991669,
-0.002588411560282111,
0.07781397551298141,
0.14014296233654022,
0.029093438759446144,
0.09593384712934494,
0.13814058899879456,
-0.11403363943099976,
0.04612983018159866,
0.09688562154769897,
0.07723534107208252,
0.0300371702760458,
0.14543655514717102,
-0.030259491875767708,
-0.24381820857524872,
-0.007211707998067141,
-0.022151969373226166,
-0.10591453313827515,
0.11311041563749313,
0.10015224665403366,
-0.09869678318500519,
0.06751739233732224,
-0.018808484077453613,
-0.17451830208301544,
0.013607329688966274,
-0.0187673419713974,
-0.05385793745517731,
0.11669987440109253,
-0.0011436400236561894,
0.07847148180007935,
0.009916692040860653,
0.11478772014379501,
-0.1952952891588211,
0.01834406517446041,
0.07266465574502945,
0.0443740114569664,
0.09472742676734924,
0.005671121180057526,
-0.022588269785046577,
0.11930078268051147,
-0.129579097032547,
0.09935250133275986,
0.0329376757144928,
-0.09885028749704361,
-0.2060803472995758,
-0.09693314880132675,
0.013025077991187572,
0.04240919277071953,
0.08975350856781006,
0.006672861985862255,
0.14845681190490723,
-0.10702993720769882,
0.08003448694944382,
0.23094631731510162,
-0.26451170444488525,
-0.08228425681591034,
0.046400874853134155,
0.06122535467147827,
0.08414468914270401,
-0.12352227419614792,
-0.012838130816817284,
0.010047685354948044,
0.01998135633766651,
0.10154759138822556,
-0.026690179482102394,
-0.0763450562953949,
0.013195635750889778,
-0.11081978678703308,
-0.0009480025037191808,
0.11554738879203796,
0.0371609628200531,
-0.05448922514915466,
-0.08110641688108444,
-0.03738364577293396,
-0.062283288687467575,
-0.03446212410926819,
-0.014361554756760597,
0.035921916365623474,
-0.06193838641047478,
-0.14272034168243408,
-0.04702044650912285,
-0.04971378296613693,
-0.09174902737140656,
0.0004982319078408182,
0.20940864086151123,
0.03783810883760452,
0.021008813753724098,
-0.04941033944487572,
0.10564873367547989,
0.01607503369450569,
-0.12417036294937134,
-0.03489962965250015,
-0.006208226550370455,
-0.09479629248380661,
-0.0344897136092186,
-0.055795978754758835,
0.03019738383591175,
0.039891064167022705,
0.2297898679971695,
-0.02732313796877861,
0.07322711497545242,
0.03575568273663521,
-0.013269190676510334,
-0.02761046029627323,
0.14854063093662262,
-0.02836727164685726,
-0.07626985758543015,
0.009404991753399372,
0.06392460316419601,
0.00489626731723547,
-0.006644113454967737,
-0.06186749413609505,
-0.04380141198635101,
0.061942119151353836,
0.05774151533842087,
-0.04832213744521141,
0.027877550572156906,
-0.009484892711043358,
-0.024111105129122734,
0.0014235678827390075,
-0.11896925419569016,
0.008874248713254929,
-0.008429407142102718,
-0.07919818162918091,
-0.053447987884283066,
0.014295756816864014,
-0.012314341962337494,
0.00919497199356556,
0.09186894446611404,
-0.07387660443782806,
-0.03740599751472473,
-0.07932469993829727,
-0.07472727447748184,
-0.014727877452969551,
-0.16053473949432373,
0.02131941169500351,
-0.07013298571109772,
-0.1523655503988266,
-0.03216217830777168,
0.051002539694309235,
-0.07973355799913406,
-0.03423362597823143,
-0.03258400782942772,
-0.08135058730840683,
0.01813950203359127,
0.0007250155322253704,
0.21436183154582977,
-0.049011364579200745,
0.09596073627471924,
0.013404199853539467,
0.0533035472035408,
0.007215498015284538,
0.03824195638298988,
-0.08000761270523071,
0.013724729418754578,
-0.1762344092130661,
0.07650014758110046,
-0.08956431597471237,
0.029251856729388237,
-0.14591270685195923,
-0.08764830231666565,
-0.0033846902661025524,
-0.019801480695605278,
0.08248409628868103,
0.10589029639959335,
-0.1255379617214203,
-0.024380631744861603,
0.12517905235290527,
-0.05634058639407158,
-0.05778186395764351,
0.06513570249080658,
-0.07357004284858704,
0.08167769759893417,
0.04877883940935135,
0.18934999406337738,
0.09859295934438705,
-0.1092209666967392,
0.0246573593467474,
0.017228489741683006,
0.03744751960039139,
0.003915551118552685,
0.05702229589223862,
-0.0005045156576670706,
0.02283683978021145,
0.015669668093323708,
-0.08347366005182266,
0.016026565805077553,
-0.09229987114667892,
-0.06252257525920868,
-0.04151551425457001,
-0.08969443291425705,
0.0075975204817950726,
0.009261069819331169,
0.028616849333047867,
-0.07950210571289062,
-0.08860821276903152,
0.06231151521205902,
0.14014603197574615,
-0.04579055681824684,
0.011236418038606644,
-0.07997318357229233,
0.004309577401727438,
-0.028572237119078636,
-0.02172655425965786,
-0.19520483911037445,
-0.0606086440384388,
0.024861345067620277,
0.01057251263409853,
0.04425446316599846,
-0.008389851078391075,
0.0805058628320694,
0.021760523319244385,
-0.05022476240992546,
-0.004385763313621283,
-0.09074045717716217,
-0.010206171311438084,
-0.08964373171329498,
-0.21737974882125854,
-0.05509652942419052,
-0.03907175734639168,
0.15447857975959778,
-0.17277204990386963,
0.001019212300889194,
-0.018000265583395958,
0.10974160581827164,
0.043003182858228683,
-0.051037270575761795,
-0.00015285010158549994,
0.029098080471158028,
0.014890243299305439,
-0.09978427737951279,
0.033912256360054016,
0.016273964196443558,
-0.09937010705471039,
-0.023941107094287872,
-0.10280788689851761,
0.0030173449777066708,
0.07584146410226822,
0.07354340702295303,
-0.1074858084321022,
-0.018493125215172768,
-0.06309487670660019,
-0.02839227207005024,
-0.047285255044698715,
0.03425775095820427,
0.17285801470279694,
0.01797599531710148,
0.1128951832652092,
-0.07527477294206619,
-0.08620180934667587,
0.018630163744091988,
0.009113048203289509,
0.058137446641922,
0.11380665749311447,
0.0756601020693779,
-0.09747549146413803,
0.05914841219782829,
0.08757482469081879,
-0.04803953319787979,
0.1405019462108612,
-0.048817966133356094,
-0.07933975756168365,
-0.029822077602148056,
0.001507653039880097,
-0.00032929438748396933,
0.1510060727596283,
-0.04754428192973137,
0.010493728332221508,
0.03552675247192383,
0.030236568301916122,
0.008537841960787773,
-0.16121454536914825,
-0.024216217920184135,
0.020934084430336952,
-0.048384714871644974,
-0.028683731332421303,
0.012151733040809631,
0.013612557202577591,
0.09129782766103745,
0.05005418509244919,
0.0024175874423235655,
0.005030160769820213,
-0.013630555011332035,
-0.049306776374578476,
0.20142267644405365,
-0.09218969196081161,
-0.046942561864852905,
-0.0846376121044159,
-0.0008986970060504973,
-0.010909069329500198,
-0.03526534140110016,
0.01732555404305458,
-0.10153178870677948,
-0.023581627756357193,
-0.0655178502202034,
0.005440044682472944,
-0.04795670509338379,
0.010938233695924282,
0.00019289088959340006,
0.019529299810528755,
0.05704568326473236,
-0.13430027663707733,
0.01443130150437355,
-0.06194081902503967,
-0.11227954924106598,
0.03035140037536621,
0.05160922557115555,
0.0879863053560257,
0.06260601431131363,
-0.025914542376995087,
0.017587829381227493,
-0.046727124601602554,
0.2328469455242157,
-0.08747783303260803,
0.004539259243756533,
0.1240365132689476,
0.024101803079247475,
0.03678961843252182,
0.10105134546756744,
0.02777235209941864,
-0.09944631904363632,
0.04387553781270981,
0.07502458244562149,
-0.04431185871362686,
-0.25447678565979004,
0.008757372386753559,
-0.04436095803976059,
-0.08211394399404526,
0.08952078223228455,
0.04886554554104805,
-0.03702676296234131,
0.06460753083229065,
0.0013151022139936686,
0.007708823774009943,
-0.02074701525270939,
0.08869727700948715,
0.0849519819021225,
0.05548249930143356,
0.10558527708053589,
-0.04062140733003616,
-0.017357347533106804,
0.06531835347414017,
0.02786029502749443,
0.3082790672779083,
-0.04829832911491394,
0.1017623320221901,
0.05318981036543846,
0.1465628743171692,
-0.02106841839849949,
0.03682412952184677,
0.011628272943198681,
-0.00502023147419095,
-0.028551964089274406,
-0.05399854853749275,
-0.02583238296210766,
0.005510326474905014,
-0.06800556182861328,
0.04158627614378929,
-0.05654400587081909,
0.04414566978812218,
0.017706383019685745,
0.2909254729747772,
0.000561184308025986,
-0.2647410035133362,
-0.10125022381544113,
-0.01237967424094677,
-0.03753010928630829,
-0.05041371285915375,
0.013161401264369488,
0.118198923766613,
-0.13136820495128632,
0.02539251744747162,
-0.06604573130607605,
0.08500971645116806,
-0.02787863090634346,
-0.005616997834295034,
0.039214469492435455,
0.16209116578102112,
-0.02214888483285904,
0.06102178990840912,
-0.22307385504245758,
0.2313433140516281,
0.008208787068724632,
0.12241668254137039,
-0.054267752915620804,
0.006446379702538252,
0.024090060964226723,
-0.00039690459379926324,
0.09574205428361893,
-0.0031515269074589014,
-0.04625702649354935,
-0.13984069228172302,
-0.052405599504709244,
0.07039632648229599,
0.13893139362335205,
-0.052862364798784256,
0.10164953023195267,
-0.05969380959868431,
0.010616401210427284,
0.03659664839506149,
-0.08128591626882553,
-0.11951705068349838,
-0.10375764966011047,
-0.018712539225816727,
-0.0011688562808558345,
-0.05978092551231384,
-0.06455718725919724,
-0.06632262468338013,
0.02426721528172493,
0.11566309630870819,
-0.002973778871819377,
-0.03440900519490242,
-0.14964598417282104,
0.0730895921587944,
0.15502354502677917,
-0.06776546686887741,
0.033647168427705765,
0.0025164762046188116,
0.0806235820055008,
0.03487002104520798,
-0.07751329988241196,
0.06285088509321213,
-0.06719823181629181,
-0.1804850846529007,
-0.04838024452328682,
0.10251648724079132,
0.07088834792375565,
0.04171374440193176,
-0.006431076675653458,
0.04802862927317619,
-0.027881355956196785,
-0.09112979471683502,
0.029146065935492516,
0.03152218833565712,
0.03565400838851929,
0.04224245250225067,
-0.07600883394479752,
0.08393488079309464,
-0.044710125774145126,
-0.02022961527109146,
0.12079346179962158,
0.23170483112335205,
-0.10469632595777512,
0.09672403335571289,
0.057872477918863297,
-0.06020079180598259,
-0.16642865538597107,
0.07237720489501953,
0.10511795431375504,
0.013157933950424194,
0.0585329644382,
-0.21661128103733063,
0.12086433917284012,
0.10248883068561554,
-0.013962398283183575,
0.040167562663555145,
-0.27810555696487427,
-0.12061326950788498,
0.05014453083276749,
0.12407885491847992,
0.08320126682519913,
-0.12618738412857056,
-0.018786011263728142,
-0.015112701803445816,
-0.12813597917556763,
0.07908517122268677,
-0.11327054351568222,
0.1312694400548935,
-0.023753572255373,
0.10954534262418747,
0.012869873084127903,
-0.02724733017385006,
0.1081605851650238,
0.04956973344087601,
0.09634766727685928,
-0.042610280215740204,
0.0004233588115312159,
0.06001058965921402,
-0.04890365153551102,
-0.00039382491377182305,
-0.06700300425291061,
0.08860893547534943,
-0.13712234795093536,
-0.007712442893534899,
-0.0887966975569725,
0.042587801814079285,
-0.0410446971654892,
-0.06666000187397003,
-0.04212349280714989,
0.05759578198194504,
0.04550183564424515,
-0.032797615975141525,
0.03981657698750496,
-0.026620887219905853,
0.10414326936006546,
0.025376128032803535,
0.08661969006061554,
0.018749719485640526,
-0.05557645484805107,
0.022651394829154015,
-0.012353727594017982,
0.06426116824150085,
-0.16914410889148712,
0.009595382958650589,
0.09861908853054047,
0.06861484795808792,
0.1019015684723854,
0.041604503989219666,
-0.047773584723472595,
0.016914555802941322,
0.028276974335312843,
-0.09683757275342941,
-0.11393698304891586,
0.039510082453489304,
-0.036523979157209396,
-0.14730414748191833,
0.042055219411849976,
0.11863593757152557,
-0.039533581584692,
-0.03114093840122223,
-0.019331198185682297,
0.0034610929433256388,
-0.0207868330180645,
0.1804952174425125,
0.060547295957803726,
0.06004924699664116,
-0.10297952592372894,
0.12044733017683029,
0.03507380560040474,
-0.02879875712096691,
0.05170908570289612,
0.081732839345932,
-0.10054277628660202,
-0.006197901908308268,
0.07625044882297516,
0.1254441738128662,
-0.05437513068318367,
0.0006292662001214921,
-0.10122150927782059,
-0.08718529343605042,
0.059191882610321045,
0.1422145515680313,
0.04905310273170471,
-0.015537681058049202,
-0.045612260699272156,
0.04609726741909981,
-0.1403421312570572,
0.07440387457609177,
0.03313461318612099,
0.06336881965398788,
-0.07708218693733215,
0.06484654545783997,
0.003676291322335601,
0.017205432057380676,
-0.014502947218716145,
0.0022952700965106487,
-0.0969112291932106,
-0.006947945803403854,
-0.08263345807790756,
0.0017447032732889056,
-0.00031273465720005333,
0.01844358630478382,
-0.024341316893696785,
-0.07113654166460037,
-0.0447208397090435,
0.03626512736082077,
-0.08623364567756653,
-0.050783514976501465,
0.0069953384809195995,
0.04324384406208992,
-0.12462758272886276,
-0.004215283319354057,
0.029076170176267624,
-0.09864626824855804,
0.09883235394954681,
0.07350311428308487,
0.02036958746612072,
0.02815386652946472,
-0.12119071185588837,
-0.03444874659180641,
-0.014953555539250374,
-0.010210945270955563,
0.059836775064468384,
-0.0998619943857193,
-0.004878828767687082,
-0.046180032193660736,
0.06120631471276283,
0.013541383668780327,
0.061403583735227585,
-0.14072446525096893,
0.014584504999220371,
-0.06839441508054733,
-0.04450589045882225,
-0.07868147641420364,
0.040720656514167786,
0.09359587728977203,
0.05892816558480263,
0.14040257036685944,
-0.07489526271820068,
0.025779392570257187,
-0.2041417509317398,
-0.03658390790224075,
-0.01305236667394638,
-0.05642455443739891,
-0.1443793773651123,
-0.04617200419306755,
0.08267071843147278,
-0.03979330137372017,
0.08996334671974182,
-0.026744088158011436,
0.0736493468284607,
0.03797418251633644,
-0.051058776676654816,
-0.03299935534596443,
-0.00845472700893879,
0.20292668044567108,
0.07175440341234207,
-0.014685788191854954,
0.10210898518562317,
0.00011990576604148373,
0.03042728640139103,
0.04704828932881355,
0.17066657543182373,
0.22205933928489685,
0.04222540929913521,
0.04946558550000191,
0.062485020607709885,
-0.07744301855564117,
-0.07014515995979309,
0.1739172488451004,
-0.011105811223387718,
0.06824930757284164,
-0.04612618312239647,
0.19249092042446136,
0.12070480734109879,
-0.167790949344635,
0.04874574393033981,
-0.04607568681240082,
-0.08121580630540848,
-0.11852922290563583,
-0.010074258781969547,
-0.08267281949520111,
-0.123606838285923,
0.03677390143275261,
-0.11905384063720703,
0.046428337693214417,
0.10863767564296722,
0.015594171360135078,
0.03543157875537872,
0.12678202986717224,
-0.010268954560160637,
-0.006768518593162298,
0.06824326515197754,
0.004708088468760252,
-0.010554404929280281,
-0.04181757941842079,
-0.07500124722719193,
0.05930020287632942,
0.0012399429688230157,
0.08202080428600311,
-0.045498792082071304,
-0.015155799686908722,
0.0311999823898077,
-0.03074297308921814,
-0.0798128992319107,
0.02577117271721363,
0.044954799115657806,
0.053495436906814575,
0.051682621240615845,
0.0414775125682354,
-0.01070205308496952,
-0.03304709866642952,
0.3272559642791748,
-0.06667064875364304,
-0.10304644703865051,
-0.1257704496383667,
0.214475616812706,
0.03319982439279556,
-0.02786954492330551,
0.03158556669950485,
-0.08659689873456955,
-0.001113763079047203,
0.1700095236301422,
0.17754711210727692,
-0.06495033204555511,
-0.02039284259080887,
0.0001234860683325678,
-0.015870630741119385,
-0.03130597993731499,
0.12411560863256454,
0.09714636206626892,
-0.012029646895825863,
-0.061253685504198074,
-0.015936417505145073,
-0.01527772843837738,
-0.03364350274205208,
-0.04149196669459343,
0.050040118396282196,
0.02025149017572403,
-0.027342762798070908,
-0.043464239686727524,
0.07667684555053711,
0.002947293920442462,
-0.25556251406669617,
0.06377829611301422,
-0.15678654611110687,
-0.17606566846370697,
-0.05133282020688057,
0.030019117519259453,
0.0047641475684940815,
0.05698412284255028,
-0.013701936230063438,
0.002934875898063183,
0.0879664272069931,
-0.01033307146281004,
-0.03358525037765503,
-0.12111914157867432,
0.12307662516832352,
-0.04587695747613907,
0.17090174555778503,
-0.03188690170645714,
0.0412963405251503,
0.11675063520669937,
0.03071623295545578,
-0.13516680896282196,
0.035995908081531525,
0.06286466121673584,
-0.09702028334140778,
0.020449623465538025,
0.14946088194847107,
-0.04775761440396309,
0.09933620691299438,
0.04613492637872696,
-0.10950657725334167,
0.001924662385135889,
-0.06796343624591827,
-0.03401311859488487,
-0.08627274632453918,
-0.010346700437366962,
-0.06411435455083847,
0.16623006761074066,
0.221623033285141,
-0.037680886685848236,
0.007967598736286163,
-0.09826860576868057,
0.01206981111317873,
0.06967660784721375,
0.03567114099860191,
-0.049654316157102585,
-0.18157878518104553,
0.0054659792222082615,
0.05984645336866379,
-0.001686923555098474,
-0.2527238726615906,
-0.07216276973485947,
0.03742041066288948,
-0.029043326154351234,
-0.032944247126579285,
0.11009949445724487,
0.04711711406707764,
0.05041249841451645,
-0.031313564628362656,
-0.14968256652355194,
-0.033044926822185516,
0.15644219517707825,
-0.1730484813451767,
-0.03590825945138931
] |
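Each row closes with a 768-dimensional `embeddings` vector for the row's `input_texts` passage. The `passage: ` prefix in `input_texts` hints at an E5-style retrieval encoder, but the dump never names the embedding model; the sketch below uses `roberta-base` (hidden size 768) with mean pooling purely as a stand-in:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumption: the actual embedding model is unknown; roberta-base plus mean
# pooling is a generic 768-dim stand-in, not the pipeline that built this dump.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")

def embed(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state      # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1).float()  # (1, seq_len, 1)
    # Mean-pool over non-padding positions to get one vector per passage.
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

vec = embed("passage: TAGS #transformers #pytorch #roberta #question-answering")
print(vec.shape)  # torch.Size([1, 768])
```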
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-16-finetuned-squad-seed-10

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
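
The hyperparameter list above maps directly onto `transformers.TrainingArguments`; a sketch of the equivalent configuration follows. The `output_dir` name is illustrative, and the Adam betas and epsilon shown in the card are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in the card; adam_beta1/beta2/epsilon
# already default to 0.9 / 0.999 / 1e-08 in transformers.
args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-16-finetuned-squad-seed-10",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```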
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-16-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-16-finetuned-squad-seed-10
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-16-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-16-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-16-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08896119892597198,
0.09363608062267303,
-0.0025207488797605038,
0.077954962849617,
0.14026294648647308,
0.02931234985589981,
0.09642365574836731,
0.13668292760849,
-0.112643763422966,
0.045270103961229324,
0.0957537367939949,
0.0785556510090828,
0.030203048139810562,
0.14601145684719086,
-0.029646286740899086,
-0.24451008439064026,
-0.007303339429199696,
-0.021803010255098343,
-0.10444213449954987,
0.1130623146891594,
0.10027880221605301,
-0.09918568283319473,
0.06729517132043839,
-0.018750181421637535,
-0.17469987273216248,
0.013936720788478851,
-0.018516935408115387,
-0.052764978259801865,
0.11705074459314346,
-0.002718597184866667,
0.0780564397573471,
0.00942118652164936,
0.11669529974460602,
-0.19382889568805695,
0.01837502047419548,
0.07209312170743942,
0.04429565742611885,
0.09434331953525543,
0.00539025291800499,
-0.021495722234249115,
0.11896906048059464,
-0.13032056391239166,
0.09965751320123672,
0.032286714762449265,
-0.0989128053188324,
-0.20487141609191895,
-0.09716739505529404,
0.01253506075590849,
0.04375699535012245,
0.08864624053239822,
0.006424002815037966,
0.14875492453575134,
-0.10732950270175934,
0.08035474270582199,
0.2321067452430725,
-0.26591596007347107,
-0.08196520060300827,
0.048103492707014084,
0.06299091875553131,
0.08495861291885376,
-0.12325441837310791,
-0.013591405004262924,
0.010652446188032627,
0.019655099138617516,
0.10247600823640823,
-0.02730497159063816,
-0.07634314894676208,
0.012863531708717346,
-0.11218065768480301,
0.0007330378866754472,
0.11464697122573853,
0.0376911424100399,
-0.054260049015283585,
-0.08113913238048553,
-0.038176167756319046,
-0.06271906197071075,
-0.03563231602311134,
-0.015170326456427574,
0.036473147571086884,
-0.06231282651424408,
-0.14163967967033386,
-0.046467557549476624,
-0.04868347570300102,
-0.09204305708408356,
0.0006475611007772386,
0.20868012309074402,
0.03836394473910332,
0.02071264199912548,
-0.049288347363471985,
0.10478855669498444,
0.013387062586843967,
-0.12415144592523575,
-0.03370034322142601,
-0.005777149926871061,
-0.09584338217973709,
-0.035465944558382034,
-0.0564434789121151,
0.030134228989481926,
0.03915240243077278,
0.2298850566148758,
-0.025140130892395973,
0.07354556769132614,
0.037319958209991455,
-0.013614647090435028,
-0.027416057884693146,
0.14892806112766266,
-0.02891870029270649,
-0.07794057577848434,
0.00931836012750864,
0.06421075761318207,
0.005358952097594738,
-0.005616362672299147,
-0.06281057000160217,
-0.0440601110458374,
0.06244878098368645,
0.05734220892190933,
-0.05046940594911575,
0.028324011713266373,
-0.009207114577293396,
-0.023750262334942818,
0.002719262382015586,
-0.11954953521490097,
0.00957330223172903,
-0.008915943093597889,
-0.0801343247294426,
-0.053403183817863464,
0.012822465971112251,
-0.01145110186189413,
0.009654698893427849,
0.09093879908323288,
-0.07418055087327957,
-0.03673642873764038,
-0.0798860415816307,
-0.0758143961429596,
-0.014665549620985985,
-0.16292807459831238,
0.02193046733736992,
-0.06896478682756424,
-0.1546320617198944,
-0.03271638974547386,
0.05012490227818489,
-0.07913833856582642,
-0.03413628414273262,
-0.03375491872429848,
-0.08137116581201553,
0.01689682900905609,
0.0016684259753674269,
0.2155941128730774,
-0.0485498309135437,
0.0946606919169426,
0.013400336727499962,
0.05487570911645889,
0.0063476599752902985,
0.03833528980612755,
-0.07954718172550201,
0.0136622479185462,
-0.17682936787605286,
0.07603403180837631,
-0.08950231224298477,
0.031141692772507668,
-0.14504322409629822,
-0.08736873418092728,
-0.0038989062886685133,
-0.01947912573814392,
0.082339346408844,
0.10547088831663132,
-0.12818892300128937,
-0.02337959036231041,
0.1258825659751892,
-0.056123264133930206,
-0.05707038193941116,
0.06382545828819275,
-0.07381612807512283,
0.08223326504230499,
0.05028235912322998,
0.18947981297969818,
0.09865706413984299,
-0.1087777391076088,
0.02477126568555832,
0.017271431162953377,
0.037032876163721085,
0.002126130275428295,
0.056362900882959366,
0.00039804467814974487,
0.02424602024257183,
0.015711171552538872,
-0.08173540234565735,
0.015429620631039143,
-0.09191102534532547,
-0.06249970570206642,
-0.04193820804357529,
-0.0905032679438591,
0.007187223993241787,
0.00928869005292654,
0.02889086864888668,
-0.08007246255874634,
-0.08782333880662918,
0.06360597163438797,
0.1399877667427063,
-0.04564313217997551,
0.010674458928406239,
-0.07965879142284393,
0.0037667297292500734,
-0.028945310041308403,
-0.021911855787038803,
-0.19537685811519623,
-0.05949276685714722,
0.025430776178836823,
0.008874213322997093,
0.044355735182762146,
-0.006093383301049471,
0.08104780316352844,
0.021578921005129814,
-0.049551114439964294,
-0.004101974423974752,
-0.09004103392362595,
-0.010126802138984203,
-0.09133259952068329,
-0.21707768738269806,
-0.0554826520383358,
-0.039172977209091187,
0.15343284606933594,
-0.17303620278835297,
0.0013296095421537757,
-0.02011839859187603,
0.10943575203418732,
0.04214612767100334,
-0.05079703405499458,
-0.00014110482879914343,
0.028727492317557335,
0.01551912073045969,
-0.0999346598982811,
0.03412167727947235,
0.015050629153847694,
-0.09860347956418991,
-0.025363631546497345,
-0.10417067259550095,
0.001317672897130251,
0.07521337270736694,
0.07444633543491364,
-0.10721609741449356,
-0.017863508313894272,
-0.06278127431869507,
-0.027877693995833397,
-0.04617929458618164,
0.0340442955493927,
0.17173656821250916,
0.01717822253704071,
0.11251705884933472,
-0.07559669017791748,
-0.08712930232286453,
0.01845697872340679,
0.009850943461060524,
0.05938352644443512,
0.1136302724480629,
0.07626160234212875,
-0.09720147401094437,
0.05849844217300415,
0.08935178071260452,
-0.04795563220977783,
0.13914552330970764,
-0.048948053270578384,
-0.07957135885953903,
-0.029793977737426758,
0.000489121419377625,
-0.0013826574431732297,
0.15054580569267273,
-0.04805842414498329,
0.00906570628285408,
0.03482562676072121,
0.029751431196928024,
0.008225487545132637,
-0.16195206344127655,
-0.023777535185217857,
0.020496675744652748,
-0.047773830592632294,
-0.02953401766717434,
0.012043285183608532,
0.012836473993957043,
0.09129378199577332,
0.049101900309324265,
0.0012216606410220265,
0.00436367467045784,
-0.013677084818482399,
-0.048684053122997284,
0.20163626968860626,
-0.09180764108896255,
-0.04494138062000275,
-0.08289652317762375,
-0.0013667859602719545,
-0.010052008554339409,
-0.035570815205574036,
0.016522321850061417,
-0.1016841009259224,
-0.023660236969590187,
-0.06544952094554901,
0.004785343538969755,
-0.047226741909980774,
0.011014256626367569,
0.0015282925451174378,
0.019364068284630775,
0.05606947839260101,
-0.1346871703863144,
0.014298489317297935,
-0.06305323541164398,
-0.11249681562185287,
0.029697541147470474,
0.051728349179029465,
0.08753377944231033,
0.0638347864151001,
-0.025993233546614647,
0.017549602314829826,
-0.046295542269945145,
0.23340435326099396,
-0.08702484518289566,
0.006200749892741442,
0.12377025932073593,
0.02427944913506508,
0.03695045784115791,
0.10217439383268356,
0.027762198820710182,
-0.09983663260936737,
0.04355580732226372,
0.07457640767097473,
-0.04375981539487839,
-0.25477364659309387,
0.008776670321822166,
-0.04379652068018913,
-0.08420008420944214,
0.08931205421686172,
0.04859695956110954,
-0.03650930896401405,
0.0655420646071434,
0.0020960993133485317,
0.008843684569001198,
-0.020897705107927322,
0.08802857249975204,
0.08372361958026886,
0.05512385815382004,
0.10591036826372147,
-0.04105222225189209,
-0.018288297578692436,
0.06407243013381958,
0.02775445766746998,
0.30884432792663574,
-0.04674471169710159,
0.10101491957902908,
0.05272746458649635,
0.1458660364151001,
-0.021424194797873497,
0.03874208778142929,
0.011882760562002659,
-0.005932833068072796,
-0.028804121538996696,
-0.05384916812181473,
-0.02378239668905735,
0.005594349000602961,
-0.06860331445932388,
0.041686899960041046,
-0.05578075721859932,
0.043612800538539886,
0.017431506887078285,
0.28904610872268677,
0.0021289722062647343,
-0.2651471197605133,
-0.09981615841388702,
-0.012195334769785404,
-0.03869399428367615,
-0.04976317659020424,
0.012962568551301956,
0.11669215559959412,
-0.13085117936134338,
0.026416443288326263,
-0.0658039078116417,
0.08521713316440582,
-0.027203481644392014,
-0.0050586205907166,
0.038354918360710144,
0.1635170876979828,
-0.021931210532784462,
0.06111978366971016,
-0.22377321124076843,
0.23058883845806122,
0.00833803415298462,
0.12301638722419739,
-0.054937418550252914,
0.005974969360977411,
0.024199996143579483,
-0.0007205987349152565,
0.09575364738702774,
-0.0030497193802148104,
-0.04676111042499542,
-0.14000582695007324,
-0.052326057106256485,
0.07034847140312195,
0.13953635096549988,
-0.051113538444042206,
0.10188967734575272,
-0.058938197791576385,
0.009741760790348053,
0.03696485236287117,
-0.08213333785533905,
-0.12074276804924011,
-0.10273034870624542,
-0.019482459872961044,
-0.00175497867166996,
-0.06123762205243111,
-0.06379354745149612,
-0.06647653877735138,
0.02166934311389923,
0.11383384466171265,
-0.0006605132366530597,
-0.034520503133535385,
-0.14892727136611938,
0.07337512075901031,
0.15544119477272034,
-0.06766029447317123,
0.03446783870458603,
0.0030698964837938547,
0.08077444136142731,
0.034907326102256775,
-0.07753557711839676,
0.06308607012033463,
-0.06695237755775452,
-0.17998425662517548,
-0.047923147678375244,
0.10343056172132492,
0.07158853858709335,
0.04176715388894081,
-0.005288620013743639,
0.048085980117321014,
-0.028275417163968086,
-0.09122341126203537,
0.028401024639606476,
0.031474769115448,
0.03474299609661102,
0.04231664910912514,
-0.07692135125398636,
0.08297613263130188,
-0.044647663831710815,
-0.01860332116484642,
0.12047677487134933,
0.2294992059469223,
-0.10463407635688782,
0.0961325392127037,
0.05702925845980644,
-0.06050132215023041,
-0.16603052616119385,
0.0732986181974411,
0.10522719472646713,
0.01323288306593895,
0.059199973940849304,
-0.2150542438030243,
0.1222420185804367,
0.10242149233818054,
-0.013461233116686344,
0.0400700718164444,
-0.278601735830307,
-0.12063183635473251,
0.0506134107708931,
0.1246161237359047,
0.08359009027481079,
-0.1258520931005478,
-0.018377842381596565,
-0.016225066035985947,
-0.12802310287952423,
0.07927967607975006,
-0.11398320645093918,
0.13091668486595154,
-0.02381480485200882,
0.10930304229259491,
0.01266076322644949,
-0.027289196848869324,
0.10671505331993103,
0.05170908570289612,
0.09707698225975037,
-0.042921051383018494,
-0.0005926643498241901,
0.06149458885192871,
-0.04838396981358528,
0.0005298068863339722,
-0.06666624546051025,
0.0881626307964325,
-0.13572958111763,
-0.007532564457505941,
-0.08835769444704056,
0.04205762594938278,
-0.04080735146999359,
-0.06621040403842926,
-0.04202670231461525,
0.0570732057094574,
0.0450093112885952,
-0.032898299396038055,
0.038559675216674805,
-0.025277039036154747,
0.10355543345212936,
0.0220941212028265,
0.08737164735794067,
0.018105026334524155,
-0.05443925783038139,
0.023018285632133484,
-0.012000064365565777,
0.06327956914901733,
-0.16968446969985962,
0.009106743149459362,
0.09880238771438599,
0.06873589754104614,
0.10163716971874237,
0.041581470519304276,
-0.047803591936826706,
0.016208749264478683,
0.028037790209054947,
-0.09536398202180862,
-0.11521953344345093,
0.040459249168634415,
-0.03782447800040245,
-0.14717501401901245,
0.043905727565288544,
0.11688891798257828,
-0.039718471467494965,
-0.03189578652381897,
-0.020188841968774796,
0.0031632136087864637,
-0.020704563707113266,
0.1817055195569992,
0.061483368277549744,
0.059831131249666214,
-0.10367985814809799,
0.11992768198251724,
0.03465482220053673,
-0.027406498789787292,
0.05102724954485893,
0.08259375393390656,
-0.10133281350135803,
-0.006106511224061251,
0.07794221490621567,
0.12711119651794434,
-0.05280034616589546,
0.0008528917678631842,
-0.101495161652565,
-0.0874769389629364,
0.05929653346538544,
0.14358112215995789,
0.04981492459774017,
-0.016674453392624855,
-0.0453154481947422,
0.046281151473522186,
-0.13999417424201965,
0.07391291856765747,
0.03316757455468178,
0.06369082629680634,
-0.07689593732357025,
0.06542711704969406,
0.0037304472643882036,
0.01761235110461712,
-0.014551601372659206,
0.003492261515930295,
-0.09683296084403992,
-0.007839172147214413,
-0.08250850439071655,
0.0002524630108382553,
-0.0011078543029725552,
0.01837875321507454,
-0.024669520556926727,
-0.07117721438407898,
-0.04464573413133621,
0.0358111672103405,
-0.08639559894800186,
-0.05097600445151329,
0.0068153077736496925,
0.042110465466976166,
-0.12414591014385223,
-0.004382525570690632,
0.02822520211338997,
-0.09777972847223282,
0.09807424247264862,
0.0727190226316452,
0.02097637578845024,
0.029400857165455818,
-0.1220167800784111,
-0.0341043546795845,
-0.014112082310020924,
-0.00993102602660656,
0.060338910669088364,
-0.09904111176729202,
-0.005073036998510361,
-0.045800261199474335,
0.06312969326972961,
0.012809053994715214,
0.05935249105095863,
-0.13978977501392365,
0.014776972122490406,
-0.0695488303899765,
-0.04452535882592201,
-0.07891193777322769,
0.04040098190307617,
0.09285678714513779,
0.058112673461437225,
0.14111603796482086,
-0.07382199913263321,
0.02565949410200119,
-0.2048773169517517,
-0.036956340074539185,
-0.013505605980753899,
-0.057022251188755035,
-0.1442723274230957,
-0.04680360481142998,
0.08295177668333054,
-0.039880167692899704,
0.09177859872579575,
-0.02606208436191082,
0.0740729421377182,
0.037232790142297745,
-0.046977899968624115,
-0.03348281979560852,
-0.00811292789876461,
0.20308341085910797,
0.07153096795082092,
-0.014803050085902214,
0.10147559642791748,
0.0010808644583448768,
0.03080538846552372,
0.04499157890677452,
0.1700935661792755,
0.22152584791183472,
0.040812645107507706,
0.049536895006895065,
0.06333807855844498,
-0.07733791321516037,
-0.06872683018445969,
0.1754837930202484,
-0.012152327224612236,
0.06684265285730362,
-0.046104222536087036,
0.19521008431911469,
0.1194043830037117,
-0.16756786406040192,
0.049108777195215225,
-0.04539920762181282,
-0.08194170892238617,
-0.11816642433404922,
-0.010885944589972496,
-0.08327297121286392,
-0.12326062470674515,
0.03677443414926529,
-0.11893139034509659,
0.0462477020919323,
0.10924772918224335,
0.015348738059401512,
0.0349796824157238,
0.1274981051683426,
-0.010670498013496399,
-0.0061475918628275394,
0.06797324866056442,
0.00459248898550868,
-0.010267853736877441,
-0.041257623583078384,
-0.07409639656543732,
0.05934502184391022,
0.000664964783936739,
0.08179786056280136,
-0.04683484137058258,
-0.016154156997799873,
0.03070160560309887,
-0.030246522277593613,
-0.07950688153505325,
0.026021050289273262,
0.04458143562078476,
0.05303703993558884,
0.05009813234210014,
0.0420977920293808,
-0.011020942591130733,
-0.033443983644247055,
0.32558107376098633,
-0.06663180887699127,
-0.10427127033472061,
-0.12485188245773315,
0.213270366191864,
0.03377579525113106,
-0.02774164266884327,
0.031917162239551544,
-0.08639572560787201,
0.00011782233195845038,
0.1708536148071289,
0.17761629819869995,
-0.06464128941297531,
-0.020590774714946747,
-0.00021089227811899036,
-0.01632482185959816,
-0.03171127289533615,
0.12437503039836884,
0.09754367917776108,
-0.013988266699016094,
-0.06090907007455826,
-0.01566496677696705,
-0.01477386336773634,
-0.03409179672598839,
-0.04220312088727951,
0.0489262193441391,
0.021317269653081894,
-0.027416858822107315,
-0.041939105838537216,
0.07737798988819122,
0.0045424820855259895,
-0.25523439049720764,
0.061911266297101974,
-0.15600289404392242,
-0.1760374903678894,
-0.0515945702791214,
0.030259180814027786,
0.006361193023622036,
0.056887779384851456,
-0.013793818652629852,
0.003334395121783018,
0.08747126907110214,
-0.010335641913115978,
-0.03422703966498375,
-0.12152411043643951,
0.1232801154255867,
-0.04863015562295914,
0.16963714361190796,
-0.03173375502228737,
0.042390964925289154,
0.1165323331952095,
0.02975192479789257,
-0.1344204545021057,
0.03681659325957298,
0.062307070940732956,
-0.09630126506090164,
0.021388797089457512,
0.14898793399333954,
-0.047141436487436295,
0.09585653990507126,
0.045540858060121536,
-0.10945965349674225,
0.002526325173676014,
-0.06705864518880844,
-0.034140147268772125,
-0.08671854436397552,
-0.008959409780800343,
-0.06391624361276627,
0.16669948399066925,
0.22167745232582092,
-0.03772199526429176,
0.008670441806316376,
-0.09872022271156311,
0.011109610088169575,
0.06972788274288177,
0.03535236045718193,
-0.05014149844646454,
-0.18203379213809967,
0.00516925286501646,
0.0586317703127861,
-0.0015637401957064867,
-0.25052884221076965,
-0.07131898403167725,
0.03579742833971977,
-0.030224086716771126,
-0.03331661969423294,
0.10906938463449478,
0.04819870740175247,
0.05060264840722084,
-0.03105449676513672,
-0.15013131499290466,
-0.03308022767305374,
0.15707963705062866,
-0.17381681501865387,
-0.03557932749390602
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-16-finetuned-squad-seed-2

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
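
The "Training results" section is empty across these cards; if one wanted to fill it, SQuAD-style exact-match and F1 can be computed with the `evaluate` package. A hedged sketch with made-up ids and answers, since no actual predictions are recorded in this dump:

```python
import evaluate

# Standard SQuAD metric; ids below only need to match between the two lists.
squad_metric = evaluate.load("squad")

predictions = [{"id": "id-0", "prediction_text": "200 steps"}]
references = [{
    "id": "id-0",
    "answers": {"text": ["200 steps"], "answer_start": [34]},
}]
print(squad_metric.compute(predictions=predictions, references=references))
# e.g. {'exact_match': 100.0, 'f1': 100.0}
```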
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-16-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-16-finetuned-squad-seed-2
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-16-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-16-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-16-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08786017447710037,
0.0952954813838005,
-0.002628184389322996,
0.0779949501156807,
0.1394018977880478,
0.029689522460103035,
0.09614623337984085,
0.13754698634147644,
-0.11348170042037964,
0.04549141973257065,
0.09615317732095718,
0.07721036672592163,
0.029819918796420097,
0.14525194466114044,
-0.029653947800397873,
-0.24480952322483063,
-0.007414630148559809,
-0.02323228120803833,
-0.10631149262189865,
0.11301741749048233,
0.10084141045808792,
-0.09805667400360107,
0.06724324822425842,
-0.01845160685479641,
-0.17521049082279205,
0.014026522636413574,
-0.017707135528326035,
-0.05240299180150032,
0.11673913896083832,
-0.0018426079768687487,
0.07836445420980453,
0.010454242117702961,
0.11651650816202164,
-0.19486157596111298,
0.01806141436100006,
0.0714450404047966,
0.044461432844400406,
0.09458745270967484,
0.005957134999334812,
-0.021550629287958145,
0.11918627470731735,
-0.1293475478887558,
0.0990762934088707,
0.0324757881462574,
-0.09852030873298645,
-0.2053312510251999,
-0.09723027795553207,
0.013516229577362537,
0.04394666850566864,
0.08778636902570724,
0.007215395104140043,
0.14762462675571442,
-0.10713176429271698,
0.07990187406539917,
0.2310466319322586,
-0.2664209008216858,
-0.08233988285064697,
0.04654531180858612,
0.06185125559568405,
0.08545396476984024,
-0.12217824161052704,
-0.01308087445795536,
0.010323829017579556,
0.019816173240542412,
0.10145792365074158,
-0.02734207734465599,
-0.0767187774181366,
0.012751458212733269,
-0.11197718232870102,
-0.0009969767415896058,
0.1148386225104332,
0.03697315976023674,
-0.05520903691649437,
-0.0795925036072731,
-0.03863368183374405,
-0.060072146356105804,
-0.03444242849946022,
-0.015755191445350647,
0.036442194133996964,
-0.061288073658943176,
-0.14130954444408417,
-0.0487893708050251,
-0.04998618736863136,
-0.09266844391822815,
0.00029440512298606336,
0.21026645600795746,
0.038296349346637726,
0.022110160440206528,
-0.04896923899650574,
0.10599374771118164,
0.01521973218768835,
-0.12365274131298065,
-0.034304432570934296,
-0.005638814065605402,
-0.09539037942886353,
-0.0348215252161026,
-0.05598582699894905,
0.03095136024057865,
0.03952451050281525,
0.22912609577178955,
-0.0273788720369339,
0.0728708952665329,
0.03615458309650421,
-0.012770946137607098,
-0.027244525030255318,
0.14833877980709076,
-0.027310535311698914,
-0.07526058703660965,
0.009534083306789398,
0.06370332092046738,
0.005837190896272659,
-0.00586630217730999,
-0.06274301558732986,
-0.04444978013634682,
0.06300103664398193,
0.05845852196216583,
-0.04965326935052872,
0.026297030970454216,
-0.010440858080983162,
-0.0240461565554142,
0.0013450593687593937,
-0.11949314177036285,
0.009544513188302517,
-0.008394090458750725,
-0.07872235029935837,
-0.05452785640954971,
0.013649464584887028,
-0.011276311241090298,
0.010132321156561375,
0.09007201343774796,
-0.07281050831079483,
-0.03603207319974899,
-0.07855583727359772,
-0.07393000274896622,
-0.01474830787628889,
-0.160072922706604,
0.021272702142596245,
-0.06981314718723297,
-0.1526150405406952,
-0.031212499365210533,
0.050922803580760956,
-0.08052455633878708,
-0.0364488884806633,
-0.032622311264276505,
-0.08040269464254379,
0.017898255959153175,
0.0008852778119035065,
0.21514441072940826,
-0.048571690917015076,
0.09550086408853531,
0.013258122839033604,
0.05441589280962944,
0.006725577637553215,
0.03744039312005043,
-0.0787179097533226,
0.014609633944928646,
-0.17712056636810303,
0.07661626487970352,
-0.08873797953128815,
0.02786160260438919,
-0.1461806744337082,
-0.0868929997086525,
-0.0027237730100750923,
-0.02036081813275814,
0.08158929646015167,
0.10554356127977371,
-0.126431405544281,
-0.023264123126864433,
0.12495609372854233,
-0.05775190889835358,
-0.05672029033303261,
0.0650380402803421,
-0.07358499616384506,
0.08365759998559952,
0.04837517440319061,
0.1894228160381317,
0.09909754246473312,
-0.1090705543756485,
0.0266096293926239,
0.01864704303443432,
0.03685897961258888,
0.003818983444944024,
0.05792751908302307,
-0.0004576154169626534,
0.02179398573935032,
0.015433329157531261,
-0.08418008685112,
0.015353057533502579,
-0.09247785806655884,
-0.06313817203044891,
-0.04187474772334099,
-0.08998754620552063,
0.008348424918949604,
0.007879512384533882,
0.02908424101769924,
-0.07924885302782059,
-0.08735772967338562,
0.06304590404033661,
0.1404876559972763,
-0.04509872943162918,
0.011859171092510223,
-0.08000448346138,
0.0037967669777572155,
-0.02954067289829254,
-0.022312484681606293,
-0.1950756460428238,
-0.06041545048356056,
0.02593120001256466,
0.009311888366937637,
0.044456832110881805,
-0.006343585439026356,
0.08027330785989761,
0.02184179797768593,
-0.04984460026025772,
-0.004432269837707281,
-0.09076230973005295,
-0.010631203651428223,
-0.09051092714071274,
-0.21765995025634766,
-0.055290523916482925,
-0.03914325311779976,
0.15433581173419952,
-0.17243221402168274,
0.00033806930878199637,
-0.019103122875094414,
0.10954450815916061,
0.04248911142349243,
-0.050510771572589874,
-0.0008684965432621539,
0.027766363695263863,
0.014995613135397434,
-0.09917503595352173,
0.03410344198346138,
0.016371916979551315,
-0.09927918016910553,
-0.023134492337703705,
-0.10222676396369934,
0.0024051221553236246,
0.07386848330497742,
0.07398131489753723,
-0.10719256103038788,
-0.01851004548370838,
-0.06311465054750443,
-0.028534870594739914,
-0.04751572757959366,
0.03428811579942703,
0.1724165678024292,
0.01703762076795101,
0.11266832798719406,
-0.07566982507705688,
-0.08646886050701141,
0.018321773037314415,
0.008053364232182503,
0.05824795737862587,
0.11331592500209808,
0.07616652548313141,
-0.09888668358325958,
0.05783771723508835,
0.08901538699865341,
-0.048096321523189545,
0.13890400528907776,
-0.04883594438433647,
-0.07922594249248505,
-0.031014207750558853,
0.0025458759628236294,
-0.0008697110461071134,
0.15017817914485931,
-0.04689539596438408,
0.010891390964388847,
0.03494320437312126,
0.03041261062026024,
0.008126693777740002,
-0.16250745952129364,
-0.023893892765045166,
0.020769720897078514,
-0.04936564341187477,
-0.027764195576310158,
0.011427766643464565,
0.013779370114207268,
0.09172296524047852,
0.04981672763824463,
0.0017342332284897566,
0.005515426862984896,
-0.013681122101843357,
-0.049818772822618484,
0.20086152851581573,
-0.09172750264406204,
-0.04663011059165001,
-0.08459588140249252,
-0.0004841713234782219,
-0.01014886237680912,
-0.03532549366354942,
0.017017100006341934,
-0.10054602473974228,
-0.02309897169470787,
-0.06566261500120163,
0.006003599613904953,
-0.048142917454242706,
0.011511672288179398,
0.0016011119587346911,
0.019593670964241028,
0.058133989572525024,
-0.13394543528556824,
0.014358011074364185,
-0.06251607090234756,
-0.11362956464290619,
0.030561523512005806,
0.05214248225092888,
0.08710182458162308,
0.06291277706623077,
-0.025354374200105667,
0.017361339181661606,
-0.04579942673444748,
0.23351989686489105,
-0.08635347336530685,
0.005432181525975466,
0.12409348785877228,
0.022801488637924194,
0.03747875615954399,
0.10194133967161179,
0.027094276621937752,
-0.0996115654706955,
0.043797336518764496,
0.07423606514930725,
-0.04415927454829216,
-0.25466111302375793,
0.008369090035557747,
-0.04322454705834389,
-0.08323460817337036,
0.08912646025419235,
0.04878497123718262,
-0.04006120190024376,
0.06445551663637161,
0.002359688049182296,
0.007627812214195728,
-0.02054162323474884,
0.0879879966378212,
0.08473807573318481,
0.05520924553275108,
0.10551223158836365,
-0.04045399650931358,
-0.018193090334534645,
0.06562023609876633,
0.028528297320008278,
0.30795541405677795,
-0.04737510904669762,
0.10012597590684891,
0.05308030918240547,
0.1470678299665451,
-0.021484380587935448,
0.03684761002659798,
0.012217415496706963,
-0.004969597794115543,
-0.029050981625914574,
-0.053791627287864685,
-0.025047848001122475,
0.006697852164506912,
-0.06731251627206802,
0.04143824055790901,
-0.05661080777645111,
0.046066831797361374,
0.01713971421122551,
0.29092565178871155,
0.002069284440949559,
-0.2631206214427948,
-0.09976524859666824,
-0.01138981431722641,
-0.03841936215758324,
-0.04956342652440071,
0.012987788766622543,
0.11889519542455673,
-0.13168957829475403,
0.025249799713492393,
-0.06555988639593124,
0.08447331935167313,
-0.027722777798771858,
-0.005924155004322529,
0.03768875449895859,
0.16191422939300537,
-0.02086888626217842,
0.06163337081670761,
-0.2215680629014969,
0.23086535930633545,
0.00825225654989481,
0.12157387286424637,
-0.053496889770030975,
0.006484241224825382,
0.023568477481603622,
-0.0004556526255328208,
0.09637042880058289,
-0.002176777459681034,
-0.047836702316999435,
-0.13995333015918732,
-0.05354355275630951,
0.07026895880699158,
0.13979926705360413,
-0.05258195847272873,
0.10146238654851913,
-0.059773121029138565,
0.01044840645045042,
0.036417558789253235,
-0.08095037192106247,
-0.12019876390695572,
-0.10230343043804169,
-0.01915774866938591,
-0.0029730438254773617,
-0.06168811395764351,
-0.06491481512784958,
-0.06646685302257538,
0.025853179395198822,
0.11618860065937042,
-0.001801953767426312,
-0.03473189100623131,
-0.14872871339321136,
0.0731562003493309,
0.15485072135925293,
-0.06844829022884369,
0.03335244953632355,
0.0028646374121308327,
0.08140089362859726,
0.03483527898788452,
-0.07777565717697144,
0.06299693137407303,
-0.06699930131435394,
-0.18124040961265564,
-0.0479501411318779,
0.10381223261356354,
0.07117839902639389,
0.042138706892728806,
-0.005274030379951,
0.04768288508057594,
-0.026974299922585487,
-0.09105108678340912,
0.029679426923394203,
0.03199926018714905,
0.03454853966832161,
0.04215939715504646,
-0.07673118263483047,
0.08537523448467255,
-0.0442475751042366,
-0.019912807270884514,
0.1219387948513031,
0.23260535299777985,
-0.10513335466384888,
0.09812293201684952,
0.057003963738679886,
-0.061131250113248825,
-0.1664896160364151,
0.0712215006351471,
0.10669053345918655,
0.012579078786075115,
0.060430437326431274,
-0.2157745361328125,
0.12133048474788666,
0.10214895009994507,
-0.014698716811835766,
0.03873297944664955,
-0.27927955985069275,
-0.12071608006954193,
0.0494951456785202,
0.1243346631526947,
0.08503267168998718,
-0.1251666247844696,
-0.01927279308438301,
-0.014690269716084003,
-0.12777365744113922,
0.07860065251588821,
-0.11266770958900452,
0.1308109313249588,
-0.023569071665406227,
0.10992997139692307,
0.012874331325292587,
-0.026556098833680153,
0.10853683203458786,
0.04987919703125954,
0.09541551768779755,
-0.04241808131337166,
0.00013285722525324672,
0.05950044468045235,
-0.04911913722753525,
-0.00019531459838617593,
-0.06624994426965714,
0.08885305374860764,
-0.13670285046100616,
-0.007874906063079834,
-0.08817613124847412,
0.041862353682518005,
-0.041492264717817307,
-0.06624837964773178,
-0.04183858633041382,
0.05696883052587509,
0.04527343437075615,
-0.03273998200893402,
0.03811733052134514,
-0.024907175451517105,
0.10240600258111954,
0.026737535372376442,
0.08660715818405151,
0.019771361723542213,
-0.055041152983903885,
0.021712182089686394,
-0.012234511785209179,
0.06368886679410934,
-0.16879907250404358,
0.010537082329392433,
0.09801791608333588,
0.0680718868970871,
0.10179891437292099,
0.04134155437350273,
-0.04844922199845314,
0.017322693020105362,
0.027874315157532692,
-0.09667588025331497,
-0.11617062985897064,
0.039704181253910065,
-0.036632239818573,
-0.14748640358448029,
0.0416521392762661,
0.11942573636770248,
-0.039233624935150146,
-0.031819190829992294,
-0.019865527749061584,
0.004155534785240889,
-0.02115645445883274,
0.18034480512142181,
0.06023193150758743,
0.06039504334330559,
-0.10245323181152344,
0.11999929696321487,
0.03474798798561096,
-0.026757534593343735,
0.051088809967041016,
0.08188214898109436,
-0.10040919482707977,
-0.005771077703684568,
0.07761740684509277,
0.12591947615146637,
-0.05447424203157425,
0.0010802140459418297,
-0.10125395655632019,
-0.08775530755519867,
0.05863740295171738,
0.14176975190639496,
0.050111573189496994,
-0.016015302389860153,
-0.04520795866847038,
0.04609755799174309,
-0.13895204663276672,
0.0745292380452156,
0.033867012709379196,
0.06351295113563538,
-0.07774297147989273,
0.06504681706428528,
0.0034535368904471397,
0.019167762249708176,
-0.014902218244969845,
0.002518960740417242,
-0.09664979577064514,
-0.007666787598282099,
-0.08408395200967789,
0.0023662059102207422,
0.00027157471049577,
0.018400009721517563,
-0.023939257487654686,
-0.07176205515861511,
-0.0446765273809433,
0.03660878166556358,
-0.08627639710903168,
-0.05093632638454437,
0.006061204709112644,
0.04267912358045578,
-0.12431629002094269,
-0.005052523221820593,
0.02946523204445839,
-0.09870673716068268,
0.09935443103313446,
0.07347770780324936,
0.02107950672507286,
0.028878023847937584,
-0.12005434930324554,
-0.03452937677502632,
-0.013860754668712616,
-0.009878485463559628,
0.059801507741212845,
-0.10059107840061188,
-0.0056801438331604,
-0.045606520026922226,
0.06222075968980789,
0.0129854129627347,
0.06245124712586403,
-0.14038363099098206,
0.015046409331262112,
-0.06991436332464218,
-0.04617712274193764,
-0.07856450229883194,
0.039792872965335846,
0.09217651188373566,
0.05975496768951416,
0.1410263627767563,
-0.07504139840602875,
0.025975961238145828,
-0.20452232658863068,
-0.03658704459667206,
-0.013411125168204308,
-0.054935164749622345,
-0.144736185669899,
-0.04669416695833206,
0.08183903992176056,
-0.0397329181432724,
0.0902058556675911,
-0.026569945737719536,
0.07295098155736923,
0.037650130689144135,
-0.04964417964220047,
-0.03189816325902939,
-0.008279473520815372,
0.20129944384098053,
0.07162187248468399,
-0.014425266534090042,
0.1017700806260109,
-0.0006287938449531794,
0.031312406063079834,
0.044922828674316406,
0.17165298759937286,
0.22154195606708527,
0.041665155440568924,
0.04944118857383728,
0.06345032155513763,
-0.07688994705677032,
-0.07079999148845673,
0.17401306331157684,
-0.011746407486498356,
0.06697718054056168,
-0.0452614352107048,
0.19327642023563385,
0.12048991024494171,
-0.1682981699705124,
0.04784160107374191,
-0.045474156737327576,
-0.08185908198356628,
-0.11950697749853134,
-0.00984023604542017,
-0.08397911489009857,
-0.1236082911491394,
0.03654945269227028,
-0.1188095435500145,
0.04655259847640991,
0.10836544632911682,
0.014642585068941116,
0.03590626269578934,
0.1250593513250351,
-0.0099891796708107,
-0.0062090554274618626,
0.0673106238245964,
0.004845684859901667,
-0.009486161172389984,
-0.03981393203139305,
-0.07479256391525269,
0.05920320749282837,
0.0025150475557893515,
0.08150777220726013,
-0.04561382159590721,
-0.014995931647717953,
0.03018469735980034,
-0.031086904928088188,
-0.07982758432626724,
0.025530701503157616,
0.044533662497997284,
0.053127825260162354,
0.04917260631918907,
0.04235472530126572,
-0.010744837112724781,
-0.03292487934231758,
0.326882541179657,
-0.06640700995922089,
-0.10356033593416214,
-0.1251021921634674,
0.21535587310791016,
0.031945858150720596,
-0.026849601417779922,
0.03296785056591034,
-0.08620791137218475,
-0.0016253353096544743,
0.16897743940353394,
0.1771453469991684,
-0.06652777642011642,
-0.020366782322525978,
-0.0003948130179196596,
-0.01607140339910984,
-0.030892690643668175,
0.12540994584560394,
0.09677106887102127,
-0.012077100574970245,
-0.06166650354862213,
-0.016346219927072525,
-0.01605166122317314,
-0.03349415585398674,
-0.042412348091602325,
0.04983877018094063,
0.020027978345751762,
-0.026051461696624756,
-0.04324014484882355,
0.07644621282815933,
0.004143407568335533,
-0.254589706659317,
0.0623449869453907,
-0.15546537935733795,
-0.17649787664413452,
-0.050697777420282364,
0.030975166708230972,
0.004428437910974026,
0.05701398476958275,
-0.014861796982586384,
0.002883953507989645,
0.08814997225999832,
-0.01054928544908762,
-0.03447337821125984,
-0.12039955705404282,
0.12407344579696655,
-0.04771975800395012,
0.17097218334674835,
-0.030895516276359558,
0.04286041110754013,
0.11594689637422562,
0.030235953629016876,
-0.13534551858901978,
0.035599175840616226,
0.0625014379620552,
-0.09655643999576569,
0.02052842080593109,
0.14994902908802032,
-0.04762071743607521,
0.097834013402462,
0.04706641286611557,
-0.10923779755830765,
0.0007903294754214585,
-0.06703943759202957,
-0.034285008907318115,
-0.08608700335025787,
-0.010806094855070114,
-0.06530987471342087,
0.16560696065425873,
0.220872700214386,
-0.037897445261478424,
0.008963117375969887,
-0.09824646264314651,
0.011987187899649143,
0.06971351057291031,
0.037384167313575745,
-0.04886792600154877,
-0.18195569515228271,
0.004848325625061989,
0.05968151241540909,
-0.0011631451779976487,
-0.25181853771209717,
-0.07316582649946213,
0.036512937396764755,
-0.029019223526120186,
-0.03307740017771721,
0.11051476001739502,
0.046922244131565094,
0.049997031688690186,
-0.030822833999991417,
-0.15151333808898926,
-0.033668529242277145,
0.15626749396324158,
-0.17311713099479675,
-0.0356961153447628
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-16-finetuned-squad-seed-4

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
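Below is a minimal, hypothetical usage sketch for this checkpoint via the `question-answering` pipeline; the question/context pair is illustrative only and is not drawn from the SQuAD evaluation data.

```python
# Hypothetical usage sketch for this checkpoint (not part of the original card).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-4",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of roberta-base on the squad dataset.",
)
print(result["answer"], result["score"])  # extractive span plus confidence score
```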
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-16-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-16-finetuned-squad-seed-4
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-16-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-16-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-16-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08829128742218018,
0.09424453973770142,
-0.0025848792865872383,
0.07893981039524078,
0.1406514048576355,
0.0300140343606472,
0.09653345495462418,
0.13670441508293152,
-0.11372396349906921,
0.04505102336406708,
0.09637835621833801,
0.07725822180509567,
0.028914062306284904,
0.14503149688243866,
-0.029656672850251198,
-0.24452170729637146,
-0.007900665514171124,
-0.022657571360468864,
-0.10568590462207794,
0.11298897117376328,
0.09995067864656448,
-0.09878174215555191,
0.0677877739071846,
-0.018744029104709625,
-0.17629863321781158,
0.01446867361664772,
-0.018357738852500916,
-0.052225902676582336,
0.11693944782018661,
-0.0018291237065568566,
0.07857466489076614,
0.009755467996001244,
0.1163235455751419,
-0.19313272833824158,
0.01832183636724949,
0.07148460298776627,
0.04406142607331276,
0.09423822164535522,
0.005478594917804003,
-0.02174585871398449,
0.11780648678541183,
-0.1297878623008728,
0.09888813644647598,
0.032326776534318924,
-0.09876459836959839,
-0.20605428516864777,
-0.09692326933145523,
0.012139586731791496,
0.043085649609565735,
0.08864012360572815,
0.006887967232614756,
0.14719082415103912,
-0.10720398277044296,
0.08035692572593689,
0.22948192059993744,
-0.26728516817092896,
-0.08273375034332275,
0.04697822779417038,
0.06187056005001068,
0.08577610552310944,
-0.12270735204219818,
-0.012264001183211803,
0.010782493278384209,
0.020619938150048256,
0.10175737738609314,
-0.0276939794421196,
-0.07668986916542053,
0.013155274093151093,
-0.11189282685518265,
0.0006016312981955707,
0.1156604066491127,
0.03737575560808182,
-0.05499960109591484,
-0.07978003472089767,
-0.037876904010772705,
-0.0613800585269928,
-0.03487889841198921,
-0.014807400293648243,
0.03667807579040527,
-0.06230362877249718,
-0.1418614685535431,
-0.04707975685596466,
-0.04954928904771805,
-0.09141801297664642,
0.0002939733967650682,
0.20934797823429108,
0.03828045725822449,
0.021762272343039513,
-0.04862724989652634,
0.10522700846195221,
0.01501147449016571,
-0.12348845601081848,
-0.03322165459394455,
-0.0059804185293614864,
-0.0947776734828949,
-0.03489871695637703,
-0.05665002763271332,
0.032492611557245255,
0.039900943636894226,
0.2295115441083908,
-0.02756955660879612,
0.07350149005651474,
0.036721233278512955,
-0.013480213470757008,
-0.027277614921331406,
0.14742296934127808,
-0.028704965487122536,
-0.07694938033819199,
0.009810306131839752,
0.06373690813779831,
0.00528031587600708,
-0.0058493283577263355,
-0.0629550889134407,
-0.043649546802043915,
0.06190216913819313,
0.05800453945994377,
-0.05091603472828865,
0.028118440881371498,
-0.009465585462749004,
-0.023534666746854782,
0.0013602502876892686,
-0.1193840354681015,
0.009133142419159412,
-0.008752839639782906,
-0.07886768132448196,
-0.05355604737997055,
0.01307595893740654,
-0.012105743400752544,
0.009874477982521057,
0.09037633240222931,
-0.07374143600463867,
-0.03668205067515373,
-0.07906293123960495,
-0.07487870752811432,
-0.015267695300281048,
-0.16001376509666443,
0.022371504455804825,
-0.06954806298017502,
-0.15286292135715485,
-0.03232128918170929,
0.05065528675913811,
-0.08033368736505508,
-0.03541965037584305,
-0.03257613629102707,
-0.08127246797084808,
0.017484180629253387,
0.0010267430916428566,
0.2159806489944458,
-0.04893165081739426,
0.09441213309764862,
0.014643717557191849,
0.054472632706165314,
0.00611577183008194,
0.03749283403158188,
-0.07838702201843262,
0.01390688307583332,
-0.17776428163051605,
0.07578878104686737,
-0.08939949423074722,
0.029893487691879272,
-0.145064115524292,
-0.08796240389347076,
-0.001576053793542087,
-0.019394250586628914,
0.08120350539684296,
0.10502995550632477,
-0.12595920264720917,
-0.023241441696882248,
0.12417630851268768,
-0.05632871761918068,
-0.05690794065594673,
0.06501763314008713,
-0.07390758395195007,
0.08261601626873016,
0.048876333981752396,
0.18963569402694702,
0.09883508086204529,
-0.10858586430549622,
0.025790659710764885,
0.017913971096277237,
0.03713492676615715,
0.0031283244024962187,
0.05633743107318878,
0.000544012407772243,
0.02177804335951805,
0.01599958725273609,
-0.08277911692857742,
0.015316641889512539,
-0.09217411279678345,
-0.06269164383411407,
-0.04129699617624283,
-0.08997532725334167,
0.007561652921140194,
0.008056815713644028,
0.029024334624409676,
-0.0803724154829979,
-0.08758775889873505,
0.06219393014907837,
0.1402582973241806,
-0.04556821286678314,
0.010715625248849392,
-0.0802333652973175,
0.004423291888087988,
-0.02971855364739895,
-0.021899130195379257,
-0.19476261734962463,
-0.06118500605225563,
0.024984341114759445,
0.011466111987829208,
0.044557128101587296,
-0.005968804005533457,
0.08077029883861542,
0.022180289030075073,
-0.05010940879583359,
-0.004590648226439953,
-0.08944043517112732,
-0.010224323719739914,
-0.09040689468383789,
-0.21815955638885498,
-0.05536225438117981,
-0.038979947566986084,
0.1543026864528656,
-0.17303414642810822,
0.0006372433854267001,
-0.019106131047010422,
0.1089320182800293,
0.0420074425637722,
-0.04991177096962929,
-0.00028493459103628993,
0.029119452461600304,
0.015798794105648994,
-0.09916768968105316,
0.03458608314394951,
0.01606675423681736,
-0.09806188195943832,
-0.024487076327204704,
-0.10249872505664825,
0.0039391410537064075,
0.07496614754199982,
0.07286914438009262,
-0.10770708322525024,
-0.018075332045555115,
-0.06264372915029526,
-0.028426019474864006,
-0.04573037475347519,
0.03428012505173683,
0.17350968718528748,
0.016337048262357712,
0.11265531927347183,
-0.07507890462875366,
-0.08609746396541595,
0.018465155735611916,
0.009164134040474892,
0.05948793888092041,
0.11330900341272354,
0.07524608820676804,
-0.09731380641460419,
0.05806463584303856,
0.08848592638969421,
-0.04847954213619232,
0.13933417201042175,
-0.049064505845308304,
-0.07860229164361954,
-0.030544567853212357,
0.0014505441067740321,
-0.0013536635087803006,
0.1509035974740982,
-0.04727110266685486,
0.009865508414804935,
0.03458046913146973,
0.02989550307393074,
0.00837920606136322,
-0.16162559390068054,
-0.023876242339611053,
0.020014988258481026,
-0.04839244484901428,
-0.02827814407646656,
0.012084569782018661,
0.013198849745094776,
0.0913330465555191,
0.04966972395777702,
0.00046298594679683447,
0.0056342193856835365,
-0.013466169126331806,
-0.04887022078037262,
0.20123030245304108,
-0.09206246584653854,
-0.04537307098507881,
-0.08427785336971283,
-0.002624526619911194,
-0.011191672645509243,
-0.0358722060918808,
0.01675633154809475,
-0.10182815045118332,
-0.023498283699154854,
-0.0652933269739151,
0.00510739628225565,
-0.04823349416255951,
0.011422974988818169,
0.0013408857630565763,
0.01897905394434929,
0.057799771428108215,
-0.13379359245300293,
0.014514757320284843,
-0.06284800916910172,
-0.1135907918214798,
0.031036464497447014,
0.05295873433351517,
0.08777792751789093,
0.06224749609827995,
-0.025459859520196915,
0.01738808862864971,
-0.045688558369874954,
0.23454728722572327,
-0.08638226985931396,
0.0055444128811359406,
0.12389862537384033,
0.023630475625395775,
0.03653666377067566,
0.10183537751436234,
0.028090182691812515,
-0.10011082887649536,
0.04345891997218132,
0.0747038871049881,
-0.04367872700095177,
-0.2546314001083374,
0.008908601477742195,
-0.04358711466193199,
-0.08356309682130814,
0.08876088261604309,
0.04869813472032547,
-0.039492253214120865,
0.06506279855966568,
0.003325592027977109,
0.008558765985071659,
-0.02150704897940159,
0.08776181191205978,
0.08661573380231857,
0.05470374971628189,
0.10628864914178848,
-0.04080251231789589,
-0.018558837473392487,
0.06490743905305862,
0.0287020206451416,
0.30881017446517944,
-0.04741932451725006,
0.09994331747293472,
0.053538814187049866,
0.146734818816185,
-0.021003015339374542,
0.03739865496754646,
0.01169654168188572,
-0.005685565993189812,
-0.029061932116746902,
-0.05356641858816147,
-0.023696117103099823,
0.005578805226832628,
-0.06893482059240341,
0.04123537614941597,
-0.05659792944788933,
0.044714778661727905,
0.017352547496557236,
0.29024821519851685,
0.0015063780592754483,
-0.2653180956840515,
-0.1002042144536972,
-0.01221553049981594,
-0.03814339637756348,
-0.0490216501057148,
0.013327013701200485,
0.11814043670892715,
-0.13110555708408356,
0.025855479761958122,
-0.06543460488319397,
0.08432641625404358,
-0.027824610471725464,
-0.00487125338986516,
0.039475005120038986,
0.16357317566871643,
-0.021763883531093597,
0.0611078180372715,
-0.22225339710712433,
0.2298143208026886,
0.008518774062395096,
0.12223297357559204,
-0.05350126326084137,
0.00627705967053771,
0.02438315935432911,
0.0008982216822914779,
0.09586499631404877,
-0.0028115613386034966,
-0.04753795266151428,
-0.139890655875206,
-0.052595485001802444,
0.07130004465579987,
0.13920648396015167,
-0.051322732120752335,
0.10214199125766754,
-0.05891034007072449,
0.009960649535059929,
0.03631485626101494,
-0.08214832097291946,
-0.12051378935575485,
-0.10284113883972168,
-0.020093444734811783,
-0.0014609784120693803,
-0.06165926530957222,
-0.06407550722360611,
-0.06689678132534027,
0.02383917197585106,
0.11531385034322739,
0.00006291324825724587,
-0.034764230251312256,
-0.14906907081604004,
0.072408527135849,
0.15490500628948212,
-0.06754690408706665,
0.033484719693660736,
0.0032323142513632774,
0.08019682765007019,
0.035834357142448425,
-0.07753726840019226,
0.06313146650791168,
-0.06757824122905731,
-0.18003013730049133,
-0.048005253076553345,
0.1029396578669548,
0.07081311196088791,
0.041319604963064194,
-0.005731005687266588,
0.04765864461660385,
-0.027904637157917023,
-0.09165828675031662,
0.02983364276587963,
0.03023446910083294,
0.035614974796772,
0.04199051856994629,
-0.07769577205181122,
0.08542532473802567,
-0.043703366070985794,
-0.019078504294157028,
0.12052955478429794,
0.2304730862379074,
-0.1046009212732315,
0.09595474600791931,
0.0570150725543499,
-0.060665640980005264,
-0.16588552296161652,
0.07258480787277222,
0.10529709607362747,
0.012903428636491299,
0.05894938483834267,
-0.21639296412467957,
0.12254509329795837,
0.10163731127977371,
-0.013638412579894066,
0.04070005193352699,
-0.2765659689903259,
-0.11997430771589279,
0.0497463159263134,
0.12502378225326538,
0.08637390285730362,
-0.1256362497806549,
-0.01850811205804348,
-0.014903657138347626,
-0.12746015191078186,
0.0776708796620369,
-0.11509428173303604,
0.13105103373527527,
-0.0240650437772274,
0.1107068806886673,
0.012100441381335258,
-0.026583628728985786,
0.10773162543773651,
0.05084993317723274,
0.09662209451198578,
-0.04271683841943741,
0.0011847694404423237,
0.05944046005606651,
-0.048365168273448944,
-0.0005207308568060398,
-0.06718260049819946,
0.08837637305259705,
-0.13686953485012054,
-0.007715118583291769,
-0.08866269141435623,
0.041801050305366516,
-0.04137028753757477,
-0.06588476151227951,
-0.04121747985482216,
0.05695115774869919,
0.04417751356959343,
-0.03295949473977089,
0.03696909919381142,
-0.025577956810593605,
0.10273971408605576,
0.02533356472849846,
0.08709793537855148,
0.01829688437283039,
-0.05549323931336403,
0.022891690954566002,
-0.01279356423765421,
0.06357599794864655,
-0.1685003936290741,
0.009158688597381115,
0.09855856746435165,
0.06731445342302322,
0.10135358572006226,
0.04201394319534302,
-0.04747772216796875,
0.016696926206350327,
0.028604431077837944,
-0.09701754152774811,
-0.11502417922019958,
0.03999779745936394,
-0.040070563554763794,
-0.14694130420684814,
0.043032076209783554,
0.11923639476299286,
-0.03907104954123497,
-0.03159172460436821,
-0.019941003993153572,
0.0034042231272906065,
-0.02121650241315365,
0.18104052543640137,
0.060846805572509766,
0.05988823249936104,
-0.10348469018936157,
0.11960726231336594,
0.034603919833898544,
-0.027019064873456955,
0.05103883892297745,
0.08290965855121613,
-0.10128585994243622,
-0.006599146872758865,
0.07687490433454514,
0.12789729237556458,
-0.05400210618972778,
0.00041392073035240173,
-0.10243918001651764,
-0.08798141032457352,
0.05861248821020126,
0.14134350419044495,
0.05001463368535042,
-0.017255106940865517,
-0.0454389713704586,
0.04562659189105034,
-0.13971294462680817,
0.07418964058160782,
0.03283616527915001,
0.06406152993440628,
-0.07775622606277466,
0.06430134922266006,
0.0033097316045314074,
0.01852579601109028,
-0.014818009920418262,
0.0031221946701407433,
-0.09714821726083755,
-0.007860132493078709,
-0.08419256657361984,
0.0015547614311799407,
0.0003062008472625166,
0.018979208543896675,
-0.024238629266619682,
-0.07076292484998703,
-0.045306168496608734,
0.036847006529569626,
-0.08640575408935547,
-0.05060402676463127,
0.007926300168037415,
0.043189410120248795,
-0.12347927689552307,
-0.004743657074868679,
0.028246283531188965,
-0.09834005683660507,
0.09911660850048065,
0.07323315739631653,
0.020748235285282135,
0.029010239988565445,
-0.11891670525074005,
-0.035030242055654526,
-0.014356519095599651,
-0.01054112333804369,
0.06055191159248352,
-0.09966754913330078,
-0.005231559742242098,
-0.04582265391945839,
0.06294693052768707,
0.013115622103214264,
0.06161922588944435,
-0.13980494439601898,
0.015446215867996216,
-0.06909127533435822,
-0.04482496902346611,
-0.07915095239877701,
0.04000406712293625,
0.09295377135276794,
0.059256020933389664,
0.14096634089946747,
-0.07500068843364716,
0.025219960138201714,
-0.20448972284793854,
-0.036965031176805496,
-0.013641797937452793,
-0.05654935911297798,
-0.14438104629516602,
-0.04670265316963196,
0.08245505392551422,
-0.04044071584939957,
0.09204501658678055,
-0.026568040251731873,
0.07299534976482391,
0.03717025741934776,
-0.04943946376442909,
-0.033021971583366394,
-0.007986187934875488,
0.20100292563438416,
0.07100570946931839,
-0.014941872097551823,
0.10157693177461624,
0.000630303518846631,
0.03075190633535385,
0.04601804539561272,
0.16992555558681488,
0.22102582454681396,
0.04235542565584183,
0.04952998831868172,
0.0638069435954094,
-0.0774231031537056,
-0.06934370845556259,
0.17506296932697296,
-0.011302887462079525,
0.06764449179172516,
-0.046063780784606934,
0.1919204294681549,
0.11958927661180496,
-0.16718822717666626,
0.04798618331551552,
-0.046322502195835114,
-0.08195261657238007,
-0.1183990091085434,
-0.00829053670167923,
-0.08350180834531784,
-0.12409617006778717,
0.03668184578418732,
-0.11933566629886627,
0.04576738178730011,
0.10950073599815369,
0.015183289535343647,
0.03526189178228378,
0.12648612260818481,
-0.008913286030292511,
-0.005538778379559517,
0.06754852831363678,
0.00438815588131547,
-0.00990733690559864,
-0.03972009941935539,
-0.07444223761558533,
0.05857076495885849,
0.001068409881554544,
0.0808522179722786,
-0.046273790299892426,
-0.015460718423128128,
0.03081594593822956,
-0.030659155920147896,
-0.07944949716329575,
0.025455867871642113,
0.04517759755253792,
0.05296430364251137,
0.050094690173864365,
0.04190322011709213,
-0.011544675566256046,
-0.03303314000368118,
0.32582029700279236,
-0.06633603572845459,
-0.10248540341854095,
-0.12544231116771698,
0.21495944261550903,
0.03170743212103844,
-0.02748723141849041,
0.03185740113258362,
-0.08521465957164764,
-0.0005547911860048771,
0.1702137291431427,
0.1789214164018631,
-0.06682812422513962,
-0.020736128091812134,
-0.00017963474965654314,
-0.016172656789422035,
-0.031061267480254173,
0.12580133974552155,
0.09720540791749954,
-0.013400107622146606,
-0.061581894755363464,
-0.016342194750905037,
-0.015604546293616295,
-0.03310627117753029,
-0.04265468940138817,
0.04916590452194214,
0.021083718165755272,
-0.026496626436710358,
-0.042455196380615234,
0.0763072669506073,
0.0038994215428829193,
-0.2545676529407501,
0.06296167522668839,
-0.15499736368656158,
-0.17688003182411194,
-0.05163931846618652,
0.030504733324050903,
0.005205760709941387,
0.05739719048142433,
-0.014898868277668953,
0.002746041864156723,
0.08879656344652176,
-0.011033517308533192,
-0.03353242203593254,
-0.12112035602331161,
0.12401002645492554,
-0.048206645995378494,
0.16981738805770874,
-0.031366731971502304,
0.04273315519094467,
0.1166052594780922,
0.02986903488636017,
-0.13513441383838654,
0.03611808642745018,
0.061788469552993774,
-0.09715165942907333,
0.021114062517881393,
0.14900116622447968,
-0.04738568142056465,
0.09702986478805542,
0.04588352516293526,
-0.10818907618522644,
0.0014726790832355618,
-0.06677239388227463,
-0.03411601483821869,
-0.08605284988880157,
-0.010834243148565292,
-0.06519370526075363,
0.16610953211784363,
0.22218585014343262,
-0.037816815078258514,
0.00893666222691536,
-0.09811265021562576,
0.011911273002624512,
0.0702797919511795,
0.035788461565971375,
-0.04967247694730759,
-0.18200351297855377,
0.005150472745299339,
0.06054938957095146,
-0.0016588501166552305,
-0.2527018189430237,
-0.07193884998559952,
0.03628889098763466,
-0.02882974222302437,
-0.033364128321409225,
0.11031137406826019,
0.0478380024433136,
0.050893791019916534,
-0.031182261183857918,
-0.1492566466331482,
-0.03376762941479683,
0.15606103837490082,
-0.17327676713466644,
-0.03627558425068855
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-16-finetuned-squad-seed-42

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

{'exact_match': 8.618732261116367, 'f1': 14.074017518582023}

### Framework versions

- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
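As a sketch only, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows; the few-shot sampling (k=16 examples) and the SQuAD preprocessing are omitted, so this is not the exact training script used to produce the results above.

```python
# Sketch of the documented optimizer/scheduler settings; assumes a standard
# Trainer setup around it. The output directory name is illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-16-finetuned-squad-seed-42",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,           # epsilon=1e-08 from the card
    lr_scheduler_type="linear",
    warmup_ratio=0.1,            # lr_scheduler_warmup_ratio: 0.1
    max_steps=200,               # training_steps: 200
)
```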
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-16-finetuned-squad-seed-42", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-42
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-16-finetuned-squad-seed-42
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
{'exact_match': 8.618732261116367, 'f1': 14.074017518582023}
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
[
"# roberta-base-few-shot-k-16-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 8.618732261116367, 'f1': 14.074017518582023}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-16-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 8.618732261116367, 'f1': 14.074017518582023}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
52,
45,
6,
12,
8,
3,
104,
35,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-16-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results\n\n{'exact_match': 8.618732261116367, 'f1': 14.074017518582023}### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
-0.10042857378721237,
0.0593213252723217,
-0.0027756583876907825,
0.0753999575972557,
0.15026803314685822,
0.04035438969731331,
0.11276142299175262,
0.11698906123638153,
-0.1115158349275589,
0.0468350350856781,
0.058662038296461105,
0.07495419681072235,
0.04551253840327263,
0.11679694056510925,
-0.036872874945402145,
-0.26056158542633057,
-0.009722687304019928,
-0.0014742835192009807,
-0.1384553760290146,
0.11120376735925674,
0.11417648941278458,
-0.09086572378873825,
0.06604188680648804,
0.011839660815894604,
-0.1865473836660385,
0.03105016052722931,
0.003564528888091445,
-0.04577411711215973,
0.12160971015691757,
0.012491496279835701,
0.09205509722232819,
0.011819788254797459,
0.12064646929502487,
-0.18055202066898346,
0.016580626368522644,
0.09086809307336807,
0.03768131881952286,
0.10214084386825562,
0.04996652901172638,
-0.029916154220700264,
0.12886759638786316,
-0.11446920782327652,
0.07957197725772858,
0.05324133113026619,
-0.10419979691505432,
-0.24431830644607544,
-0.10610619187355042,
0.04402570798993111,
0.050428491085767746,
0.08640773594379425,
0.0128564378246665,
0.14461423456668854,
-0.06763165444135666,
0.08058124035596848,
0.24171122908592224,
-0.28578221797943115,
-0.09149663895368576,
0.056917160749435425,
0.054379384964704514,
0.04589217156171799,
-0.11718802899122238,
-0.0030150311067700386,
0.014485570602118969,
0.037168845534324646,
0.09655005484819412,
-0.03406675159931183,
-0.11538272351026535,
-0.009858679957687855,
-0.10904218256473541,
0.01205469947308302,
0.10298971831798553,
0.03513345494866371,
-0.05900980904698372,
-0.045446816831827164,
-0.057913556694984436,
-0.06614314764738083,
-0.02434339001774788,
-0.0449201762676239,
0.04243210330605507,
-0.056181956082582474,
-0.10635128617286682,
-0.06413155049085617,
-0.05843259021639824,
-0.08730479329824448,
-0.014928054995834827,
0.2356489598751068,
0.02734183706343174,
0.04097563773393631,
-0.055964887142181396,
0.11097551137208939,
-0.014231430366635323,
-0.13384966552257538,
-0.01670646481215954,
-0.0033459514379501343,
-0.09601392596960068,
-0.0339849516749382,
-0.05839477851986885,
0.015243055298924446,
0.021010970696806908,
0.21934910118579865,
-0.09905412048101425,
0.07693550735712051,
0.03925038501620293,
-0.009839617647230625,
-0.047218240797519684,
0.14180167019367218,
-0.047811031341552734,
-0.07104790955781937,
0.009650898166000843,
0.06150004267692566,
0.008272819221019745,
0.000032630319765303284,
-0.056499868631362915,
-0.048393744975328445,
0.06459108740091324,
0.04428539052605629,
-0.03745542839169502,
0.027286291122436523,
0.0029520855750888586,
-0.03218667954206467,
0.019835276529192924,
-0.11465437710285187,
0.013784302398562431,
-0.0033675802405923605,
-0.10260576009750366,
-0.039373308420181274,
0.02208230271935463,
-0.0014967687893658876,
-0.0006564257782883942,
0.1007087305188179,
-0.08189006894826889,
-0.006204082630574703,
-0.08953408896923065,
-0.08613018691539764,
-0.014300740323960781,
-0.13465340435504913,
0.006140151526778936,
-0.053981974720954895,
-0.1641564965248108,
-0.03926364704966545,
0.05410544574260712,
-0.07076503336429596,
-0.033251795917749405,
0.001568472944200039,
-0.08786569535732269,
0.021658143028616905,
0.000289925723336637,
0.21617794036865234,
-0.0424814336001873,
0.0759986862540245,
0.024909278377890587,
0.04251359775662422,
-0.019388576969504356,
0.03137916326522827,
-0.07730704545974731,
0.015175783075392246,
-0.18269486725330353,
0.05897196754813194,
-0.08773161470890045,
0.01102286484092474,
-0.13198742270469666,
-0.09726475924253464,
-0.0048205978237092495,
-0.01912461407482624,
0.08648484945297241,
0.10201101005077362,
-0.14219412207603455,
-0.027316682040691376,
0.10030382126569748,
-0.07813000679016113,
-0.07630420476198196,
0.05244490131735802,
-0.04802742972970009,
0.06237535923719406,
0.034267764538526535,
0.15042740106582642,
0.11150778084993362,
-0.11357495933771133,
0.00024820311227813363,
0.008634323254227638,
0.03088337928056717,
0.010520592331886292,
0.05052868276834488,
-0.008589733392000198,
0.0264630988240242,
0.016243906691670418,
-0.07797212153673172,
-0.01688206195831299,
-0.09796860814094543,
-0.06060899794101715,
-0.055173110216856,
-0.08323115855455399,
0.003413132391870022,
0.026880204677581787,
0.030394719913601875,
-0.0749552771449089,
-0.10587695986032486,
0.09833719581365585,
0.13256455957889557,
-0.02944672293961048,
0.020612504333257675,
-0.07671105861663818,
-0.015575945377349854,
-0.003804681356996298,
-0.028805328533053398,
-0.22235740721225739,
-0.10830797255039215,
0.01307481899857521,
-0.03466112166643143,
0.04464923217892647,
-0.005969423335045576,
0.08835183829069138,
0.03701532259583473,
-0.055675093084573746,
0.0009915019618347287,
-0.09207330644130707,
-0.02063947729766369,
-0.08081416040658951,
-0.21946381032466888,
-0.08847790211439133,
-0.018074216321110725,
0.15806898474693298,
-0.17240960896015167,
0.01154622994363308,
-0.010621007531881332,
0.1278257966041565,
0.02212233655154705,
-0.04612596705555916,
-0.018793022260069847,
0.031282830983400345,
0.010895706713199615,
-0.09426594525575638,
0.0386611670255661,
0.017399806529283524,
-0.07270239293575287,
-0.031176958233118057,
-0.13550478219985962,
0.03211880102753639,
0.07161497324705124,
0.047719452530145645,
-0.09900557994842529,
-0.00847200769931078,
-0.07051736861467361,
-0.039267219603061676,
-0.04724277928471565,
0.03134695440530777,
0.16151055693626404,
0.020124338567256927,
0.10806572437286377,
-0.06737305223941803,
-0.07691206783056259,
0.014722170308232307,
0.00789918564260006,
0.05057472735643387,
0.08813601732254028,
0.06071358919143677,
-0.0920528993010521,
0.06982028484344482,
0.06221943348646164,
-0.06528954207897186,
0.13559557497501373,
-0.04031532630324364,
-0.05793169513344765,
-0.03938477486371994,
-0.018678124994039536,
-0.0074667432345449924,
0.1544055938720703,
-0.04223110154271126,
0.026075927540659904,
0.0357489213347435,
0.025440726429224014,
0.02971772663295269,
-0.17301036417484283,
-0.003607085905969143,
0.004844407085329294,
-0.047102782875299454,
-0.013854312710464,
0.005466552451252937,
0.03936908021569252,
0.094873808324337,
0.034602269530296326,
0.003368297591805458,
0.00042980059515684843,
-0.005487862508744001,
-0.058682795614004135,
0.21881626546382904,
-0.09540926665067673,
-0.05815686658024788,
-0.11150307208299637,
0.02864193171262741,
-0.05619096755981445,
-0.04643188789486885,
0.007576889358460903,
-0.08521523326635361,
-0.03661276772618294,
-0.047551531344652176,
0.006889597978442907,
-0.024236265569925308,
-0.0007967378478497267,
0.013655368238687515,
0.011264506727457047,
0.0869903415441513,
-0.14386314153671265,
0.013927784748375416,
-0.05862625688314438,
-0.1297009140253067,
0.009120432659983635,
0.08844519406557083,
0.07656271755695343,
0.0871472880244255,
-0.03207605332136154,
0.021913748234510422,
-0.028606349602341652,
0.2398955225944519,
-0.07783817499876022,
0.019271017983555794,
0.1569998860359192,
0.029506227001547813,
0.04776402935385704,
0.09329444169998169,
0.05127305909991264,
-0.08212246000766754,
0.02447252720594406,
0.09789788722991943,
-0.03187604248523712,
-0.2662908732891083,
-0.007860775105655193,
-0.019296547397971153,
-0.10077834874391556,
0.0730099231004715,
0.042509716004133224,
0.0006530751707032323,
0.07172553241252899,
-0.01698007807135582,
0.0005840971716679633,
-0.015864165499806404,
0.0791865810751915,
0.10041923820972443,
0.06766115874052048,
0.11914639174938202,
-0.03639962896704674,
-0.004482898395508528,
0.06127206236124039,
0.011782025918364525,
0.271716833114624,
-0.041325345635414124,
0.07192753255367279,
0.05581982806324959,
0.1343173384666443,
-0.0337284617125988,
0.052081089466810226,
0.009914040565490723,
-0.017253287136554718,
0.0028536017052829266,
-0.06337950378656387,
-0.001565778162330389,
0.00038369756657630205,
-0.08606016635894775,
0.06701891869306564,
-0.06440988183021545,
0.04980244114995003,
0.01972600631415844,
0.28995880484580994,
-0.006182532291859388,
-0.26457348465919495,
-0.10679665207862854,
-0.02214827388525009,
-0.013426559045910835,
-0.0688696950674057,
-0.007305472157895565,
0.10576067864894867,
-0.12825532257556915,
0.0634070336818695,
-0.07380460202693939,
0.08963143080472946,
0.011203676462173462,
-0.0015844147419556975,
0.08377550542354584,
0.18512003123760223,
-0.016917960718274117,
0.05252081900835037,
-0.20386868715286255,
0.23377835750579834,
0.02195931226015091,
0.12519894540309906,
-0.040823813527822495,
0.024084104225039482,
0.03623856231570244,
0.0271298885345459,
0.06059804931282997,
-0.004803477320820093,
-0.05972889065742493,
-0.13648712635040283,
-0.03353405371308327,
0.057898785918951035,
0.14961247146129608,
-0.042618077248334885,
0.09558220207691193,
-0.05180372670292854,
0.009396916255354881,
0.035436615347862244,
-0.07332787662744522,
-0.15173090994358063,
-0.07770707458257675,
-0.00039824677514843643,
0.00047409924445673823,
-0.04305476322770119,
-0.07179175317287445,
-0.08301837742328644,
0.008477441035211086,
0.13031348586082458,
-0.022184932604432106,
-0.041923269629478455,
-0.15249010920524597,
0.08346797525882721,
0.15996932983398438,
-0.07271494716405869,
0.03302542865276337,
-0.004375365562736988,
0.09730441868305206,
0.02475869469344616,
-0.09946615248918533,
0.05519919469952583,
-0.07281287014484406,
-0.1839265674352646,
-0.025390349328517914,
0.11410776525735855,
0.06662617623806,
0.03555653989315033,
-0.013129256665706635,
0.047733474522829056,
-0.01682986132800579,
-0.10225033015012741,
0.019534414634108543,
-0.01298101432621479,
0.03353598341345787,
0.038090355694293976,
-0.06334785372018814,
0.042756736278533936,
-0.041344206780195236,
0.004125949461013079,
0.08593782782554626,
0.22808443009853363,
-0.10857284814119339,
0.05121520906686783,
0.04455544427037239,
-0.06477631628513336,
-0.16167594492435455,
0.09437090903520584,
0.1272548884153366,
-0.007858623750507832,
0.058054935187101364,
-0.2151884287595749,
0.14628997445106506,
0.12608180940151215,
-0.03048509731888771,
0.07270493358373642,
-0.2514934837818146,
-0.13851965963840485,
0.06459229439496994,
0.11542092263698578,
0.06583668291568756,
-0.15214331448078156,
-0.02850441262125969,
-0.015382413752377033,
-0.16681599617004395,
0.128451868891716,
-0.1273554116487503,
0.11005735397338867,
-0.027091648429632187,
0.09570056945085526,
0.0164044052362442,
-0.023845864459872246,
0.12517231702804565,
0.03794506937265396,
0.108184814453125,
-0.032317016273736954,
0.011889259330928326,
0.07718779891729355,
-0.049668002873659134,
0.008895023725926876,
-0.030955061316490173,
0.08066242188215256,
-0.12534336745738983,
0.007767374161630869,
-0.10871883481740952,
0.07247720658779144,
-0.057154301553964615,
-0.058059804141521454,
-0.03252474218606949,
0.05449632555246353,
0.005050309002399445,
-0.03961554169654846,
0.0707974061369896,
-0.010253963991999626,
0.1632542610168457,
0.09361627697944641,
0.08656448125839233,
-0.01522573921829462,
-0.0711335763335228,
0.03023066557943821,
-0.006495099514722824,
0.0638962984085083,
-0.13995994627475739,
0.014728140085935593,
0.1149580180644989,
0.07602161169052124,
0.08759486675262451,
0.04998549073934555,
-0.05464008450508118,
0.0009675371693447232,
0.043221015483140945,
-0.11132108420133591,
-0.11589019000530243,
0.00870973989367485,
-0.040346235036849976,
-0.12793205678462982,
0.05813327804207802,
0.12112739682197571,
-0.03952312842011452,
-0.021862277761101723,
-0.009523884393274784,
0.001716217608191073,
-0.02683044783771038,
0.1954585611820221,
0.06128506734967232,
0.07538747042417526,
-0.10252974182367325,
0.12762196362018585,
0.037805113941431046,
-0.037624064832925797,
0.026080001145601273,
0.11010871082544327,
-0.08310141414403915,
-0.0023484930861741304,
0.06349891424179077,
0.10431811213493347,
-0.09179918467998505,
-0.030363239347934723,
-0.11087030172348022,
-0.10662314295768738,
0.038676775991916656,
0.1904592365026474,
0.05082910135388374,
-0.01954944245517254,
-0.023321136832237244,
0.04850750416517258,
-0.13696624338626862,
0.06452850252389908,
0.03352902829647064,
0.0760168731212616,
-0.09678491204977036,
0.11284568905830383,
0.021335110068321228,
0.032968129962682724,
-0.012602869421243668,
0.004899513442069292,
-0.0971641018986702,
-0.026340419426560402,
-0.1155434250831604,
-0.022734204307198524,
-0.0045372252352535725,
0.00898203905671835,
-0.024570925161242485,
-0.07421658933162689,
-0.06632228195667267,
0.05251895636320114,
-0.08300073444843292,
-0.05271239951252937,
0.011384932324290276,
0.01781492866575718,
-0.14233125746250153,
0.009888256900012493,
0.028102470561861992,
-0.09006306529045105,
0.08333905786275864,
0.08635903149843216,
0.03948343172669411,
0.038825057446956635,
-0.11462560296058655,
-0.036488525569438934,
-0.014024154283106327,
-0.00510079599916935,
0.06757476180791855,
-0.10961276292800903,
-0.005421763751655817,
-0.04643360525369644,
0.07650412619113922,
0.004624869674444199,
0.08087830990552902,
-0.13509531319141388,
0.022637536749243736,
-0.05336600914597511,
-0.03015834279358387,
-0.07092878222465515,
0.03208104893565178,
0.10533886402845383,
0.055346015840768814,
0.14448991417884827,
-0.07421348243951797,
0.02481171116232872,
-0.23290002346038818,
-0.027994755655527115,
-0.023621417582035065,
-0.05854255333542824,
-0.11486926674842834,
-0.023489177227020264,
0.07999451458454132,
-0.044596582651138306,
0.0766860842704773,
-0.009150001220405102,
0.09221670031547546,
0.05000774934887886,
-0.03835618868470192,
-0.0722515806555748,
-0.004511849023401737,
0.163150355219841,
0.06751059740781784,
-0.000964607170317322,
0.12021093815565109,
0.019864603877067566,
0.022329092025756836,
0.026982229202985764,
0.21230639517307281,
0.17519734799861908,
-0.009595682844519615,
0.058642495423555374,
0.08556535094976425,
-0.10463296622037888,
-0.07876074314117432,
0.13882601261138916,
-0.017362697049975395,
0.06711938977241516,
-0.05454041063785553,
0.15101464092731476,
0.13138967752456665,
-0.17458456754684448,
0.0605996809899807,
-0.048551809042692184,
-0.08069442957639694,
-0.14159637689590454,
0.021771496161818504,
-0.0720561072230339,
-0.13132569193840027,
0.03195580840110779,
-0.1299632340669632,
0.06126425042748451,
0.13480064272880554,
0.007499708794057369,
0.03334922716021538,
0.14598101377487183,
-0.035871606320142746,
-0.005097236949950457,
0.045962873846292496,
0.014089450240135193,
0.001374320941977203,
-0.031197065487504005,
-0.06707687675952911,
0.054070599377155304,
0.00728706456720829,
0.07146783918142319,
-0.050775498151779175,
-0.024121467024087906,
0.02575855329632759,
-0.01870131306350231,
-0.07879399508237839,
0.02814495749771595,
0.03909933194518089,
0.05445954203605652,
0.042517855763435364,
0.03243504837155342,
0.00860557146370411,
-0.04371199756860733,
0.33440521359443665,
-0.06939318031072617,
-0.11668580770492554,
-0.1234879270195961,
0.2605817914009094,
0.024866297841072083,
-0.03420699015259743,
0.04727790504693985,
-0.08489495515823364,
-0.016065841540694237,
0.13976027071475983,
0.16464905440807343,
-0.05580173060297966,
-0.019774941727519035,
-0.0002608227077871561,
-0.020873039960861206,
-0.031111663207411766,
0.12689554691314697,
0.09711319208145142,
0.027890799567103386,
-0.06738930195569992,
-0.00434762891381979,
-0.013557957485318184,
-0.02667955867946148,
-0.04960339516401291,
0.07594604045152664,
0.008326164446771145,
-0.011553824879229069,
-0.021497027948498726,
0.07451285421848297,
0.004171174019575119,
-0.23204286396503448,
0.07450315356254578,
-0.15989024937152863,
-0.18320977687835693,
-0.04279683157801628,
0.04956299811601639,
-0.018580801784992218,
0.0671631321310997,
-0.004442919045686722,
-0.015316152013838291,
0.0903770849108696,
-0.012572310864925385,
-0.032533612102270126,
-0.1504071056842804,
0.12146025151014328,
-0.10390547662973404,
0.19833789765834808,
-0.036355357617139816,
0.0561135895550251,
0.112879678606987,
0.019825365394353867,
-0.1492232084274292,
0.01991993747651577,
0.049166254699230194,
-0.13786514103412628,
0.010079591535031796,
0.1464078277349472,
-0.03677027300000191,
0.07746320962905884,
0.0311130303889513,
-0.15152926743030548,
-0.007022629491984844,
-0.0214468315243721,
-0.03644520416855812,
-0.06587975472211838,
-0.011719828471541405,
-0.06333520263433456,
0.15320414304733276,
0.22600504755973816,
-0.028527522459626198,
0.01688595861196518,
-0.10851216316223145,
0.01396643090993166,
0.07395574450492859,
0.08358696103096008,
-0.049336329102516174,
-0.21377162635326385,
0.043154966086149216,
0.04944361746311188,
-0.01720174215734005,
-0.23952674865722656,
-0.0648108646273613,
0.05891406536102295,
-0.05028630048036575,
-0.02222689986228943,
0.09493691474199295,
0.0502161979675293,
0.05899934470653534,
-0.029840003699064255,
-0.13464713096618652,
-0.04204757511615753,
0.1670490950345993,
-0.18601956963539124,
-0.042701587080955505
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-16-finetuned-squad-seed-6

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
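For reference, a small hedged sketch of how the SQuAD exact-match/F1 numbers reported for other checkpoints in this series can be computed with `datasets.load_metric`; the prediction/reference pair below is fabricated for illustration.

```python
# Illustrative metric computation only; the example prediction is made up.
from datasets import load_metric

squad_metric = load_metric("squad")
predictions = [{"id": "0", "prediction_text": "the squad dataset"}]
references = [
    {"id": "0", "answers": {"text": ["the squad dataset"], "answer_start": [55]}}
]
print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```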
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-16-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-16-finetuned-squad-seed-6
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-16-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-16-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-16-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08826947957277298,
0.09365250915288925,
-0.002546914154663682,
0.0787024050951004,
0.140910804271698,
0.02948620542883873,
0.09625988453626633,
0.13694807887077332,
-0.11293583363294601,
0.0453825369477272,
0.09575197845697403,
0.07800549268722534,
0.02943969890475273,
0.14466628432273865,
-0.029559897258877754,
-0.24421456456184387,
-0.007391039747744799,
-0.022406956180930138,
-0.10545090585947037,
0.11299506574869156,
0.10000143945217133,
-0.098972849547863,
0.06712470948696136,
-0.01916772499680519,
-0.17652343213558197,
0.01475241407752037,
-0.01846650242805481,
-0.05186961218714714,
0.11676783114671707,
-0.0022141430526971817,
0.07831869274377823,
0.010034997016191483,
0.1160968542098999,
-0.19353969395160675,
0.01832597330212593,
0.07195395976305008,
0.04426601901650429,
0.09431993961334229,
0.006349383853375912,
-0.021347615867853165,
0.11919672787189484,
-0.12919498980045319,
0.09860803186893463,
0.03276379778981209,
-0.09875518083572388,
-0.20445360243320465,
-0.09736859053373337,
0.01134977862238884,
0.043496571481227875,
0.08935772627592087,
0.006388442125171423,
0.14812035858631134,
-0.10784053802490234,
0.0803675428032875,
0.23095771670341492,
-0.2658348083496094,
-0.08266652375459671,
0.04782068729400635,
0.061720460653305054,
0.08493249118328094,
-0.12351914495229721,
-0.013365611433982849,
0.010751430876553059,
0.020306026563048363,
0.10081504285335541,
-0.02712332457304001,
-0.07794246077537537,
0.01275709830224514,
-0.11217719316482544,
0.00025090534472838044,
0.11445207893848419,
0.0374794639647007,
-0.05448012053966522,
-0.07942891120910645,
-0.038324497640132904,
-0.060762159526348114,
-0.03461766242980957,
-0.014678438194096088,
0.03675374761223793,
-0.06212163344025612,
-0.14141295850276947,
-0.046808190643787384,
-0.04952014237642288,
-0.09248918294906616,
0.0003906970960088074,
0.2090480923652649,
0.03826187923550606,
0.021654024720191956,
-0.04934133589267731,
0.10517360270023346,
0.015198751352727413,
-0.12376673519611359,
-0.03410891816020012,
-0.0053682648576796055,
-0.09492140263319016,
-0.03495156392455101,
-0.05649959295988083,
0.031109726056456566,
0.03931890428066254,
0.22750194370746613,
-0.027456974610686302,
0.07378087192773819,
0.03611264377832413,
-0.013331609778106213,
-0.027896447107195854,
0.14761123061180115,
-0.02777961455285549,
-0.07489082962274551,
0.00898546539247036,
0.06390690058469772,
0.004866942297667265,
-0.005489954724907875,
-0.06236666813492775,
-0.043847329914569855,
0.062424398958683014,
0.057541314512491226,
-0.0505530908703804,
0.027707574889063835,
-0.009965511038899422,
-0.023768188431859016,
0.001936818240210414,
-0.11921899765729904,
0.009029925800859928,
-0.009123341180384159,
-0.0792534127831459,
-0.05445050075650215,
0.013161377049982548,
-0.012394694611430168,
0.009664464741945267,
0.09061673283576965,
-0.07424779236316681,
-0.036616649478673935,
-0.07991581410169601,
-0.07465185970067978,
-0.014989761635661125,
-0.16154585778713226,
0.02207379974424839,
-0.06873509287834167,
-0.15303072333335876,
-0.032432496547698975,
0.05030152201652527,
-0.08034361153841019,
-0.035056691616773605,
-0.033101391047239304,
-0.08166978508234024,
0.017788251861929893,
0.001337332185357809,
0.216785728931427,
-0.04859566688537598,
0.09517612308263779,
0.01449776440858841,
0.05434282124042511,
0.006628979928791523,
0.038128335028886795,
-0.07943007349967957,
0.01371685042977333,
-0.17713454365730286,
0.0759912058711052,
-0.09007380157709122,
0.02958623692393303,
-0.14588947594165802,
-0.08732719719409943,
-0.0029415329918265343,
-0.019902031868696213,
0.08198302239179611,
0.10573034733533859,
-0.12614096701145172,
-0.022987959906458855,
0.12452857196331024,
-0.05721208453178406,
-0.05680413171648979,
0.06402279436588287,
-0.07365047186613083,
0.0824529156088829,
0.04838404059410095,
0.18981724977493286,
0.09845931828022003,
-0.10801216214895248,
0.02461457997560501,
0.017480704933404922,
0.0376613475382328,
0.002558885607868433,
0.056267209351062775,
0.0003834573144558817,
0.022539932280778885,
0.015965312719345093,
-0.0828980952501297,
0.015178862027823925,
-0.09226071834564209,
-0.06204140558838844,
-0.041935935616493225,
-0.09004057198762894,
0.007253758143633604,
0.009258282370865345,
0.028839875012636185,
-0.07988043874502182,
-0.08694251626729965,
0.062273263931274414,
0.1401752084493637,
-0.04496532678604126,
0.01087479293346405,
-0.079707071185112,
0.003194264369085431,
-0.030469851568341255,
-0.021898727864027023,
-0.1959550678730011,
-0.06164269149303436,
0.025590579956769943,
0.010933097451925278,
0.04454190284013748,
-0.006070353090763092,
0.08063165098428726,
0.02122892066836357,
-0.050118062645196915,
-0.004681035410612822,
-0.09025990962982178,
-0.010835240595042706,
-0.09075450897216797,
-0.21791517734527588,
-0.05551190674304962,
-0.03951995447278023,
0.15237204730510712,
-0.17241494357585907,
0.00036080952850170434,
-0.01901928521692753,
0.10923131555318832,
0.041978832334280014,
-0.05039943754673004,
-0.0002892963821068406,
0.028629295527935028,
0.015465744771063328,
-0.09944585710763931,
0.034493811428546906,
0.015411424450576305,
-0.09773179143667221,
-0.024270661175251007,
-0.10271041840314865,
0.0023404662497341633,
0.07466678321361542,
0.07428852468729019,
-0.10776714235544205,
-0.01806740276515484,
-0.06273258477449417,
-0.02873530052602291,
-0.046757493168115616,
0.03513139858841896,
0.17294640839099884,
0.016579745337367058,
0.11211173236370087,
-0.07549752295017242,
-0.08620486408472061,
0.019075114279985428,
0.009221859276294708,
0.05891595780849457,
0.11412512511014938,
0.07691799104213715,
-0.09895335137844086,
0.0586920864880085,
0.08904684334993362,
-0.047784239053726196,
0.13977757096290588,
-0.049244340509176254,
-0.07908356934785843,
-0.029471848160028458,
0.002016922691836953,
-0.001175810699351132,
0.15049053728580475,
-0.04695805907249451,
0.010319831781089306,
0.03449056297540665,
0.03038048930466175,
0.008415982127189636,
-0.16189409792423248,
-0.023976046591997147,
0.019867710769176483,
-0.04834834858775139,
-0.028809165582060814,
0.012056087143719196,
0.013150873593986034,
0.09153977036476135,
0.04927785322070122,
0.0007699506240896881,
0.005155875347554684,
-0.013536402024328709,
-0.04891175031661987,
0.20210717618465424,
-0.09174955636262894,
-0.04442981258034706,
-0.08338311314582825,
-0.0018166752997785807,
-0.010965420864522457,
-0.035845622420310974,
0.016885949298739433,
-0.10288166999816895,
-0.023428838700056076,
-0.06527766585350037,
0.005228633992373943,
-0.048138272017240524,
0.010666175745427608,
0.0006592601421289146,
0.019244983792304993,
0.05719306692481041,
-0.13447868824005127,
0.014528943225741386,
-0.06338212639093399,
-0.11406148225069046,
0.03083515539765358,
0.052302513271570206,
0.08747583627700806,
0.06253699958324432,
-0.025682199746370316,
0.017227595672011375,
-0.04577936977148056,
0.23351292312145233,
-0.08659670501947403,
0.005089182406663895,
0.12417130172252655,
0.024672435596585274,
0.03710009530186653,
0.10207223147153854,
0.027385441586375237,
-0.10012979805469513,
0.044065363705158234,
0.07510203868150711,
-0.04407074302434921,
-0.2556643486022949,
0.008794838562607765,
-0.04379875585436821,
-0.08396321535110474,
0.08900760114192963,
0.048986271023750305,
-0.03972325474023819,
0.06515911966562271,
0.0024626636877655983,
0.0074882954359054565,
-0.02168377861380577,
0.08806836605072021,
0.08486898988485336,
0.055454254150390625,
0.10620437562465668,
-0.04082321375608444,
-0.018014859408140182,
0.06422062963247299,
0.028889168053865433,
0.31001120805740356,
-0.0476064532995224,
0.1000155359506607,
0.05324980616569519,
0.1468239724636078,
-0.021300315856933594,
0.038050729781389236,
0.011485550552606583,
-0.0057947225868701935,
-0.029037311673164368,
-0.05339263007044792,
-0.024081245064735413,
0.005412245634943247,
-0.06954937428236008,
0.04154197499155998,
-0.0566389374434948,
0.04505528509616852,
0.01652509532868862,
0.2907753586769104,
0.0016368004726246,
-0.2645053565502167,
-0.09964778274297714,
-0.012472025118768215,
-0.03828461840748787,
-0.0497412271797657,
0.013081531040370464,
0.11792590469121933,
-0.13072316348552704,
0.025281623005867004,
-0.06573043018579483,
0.08502916991710663,
-0.02661256305873394,
-0.0054620010778307915,
0.03860027715563774,
0.16363093256950378,
-0.02162117138504982,
0.06171038746833801,
-0.2229626625776291,
0.23178383708000183,
0.008403271436691284,
0.12203315645456314,
-0.053802940994501114,
0.0061970921233296394,
0.023702092468738556,
-0.0009079509763978422,
0.09614884108304977,
-0.0026928146835416555,
-0.04813937470316887,
-0.13963350653648376,
-0.0519208200275898,
0.0710524171590805,
0.13990122079849243,
-0.05170737951993942,
0.10157418996095657,
-0.05901845172047615,
0.010196628049015999,
0.037018269300460815,
-0.08182074874639511,
-0.12056191265583038,
-0.10294700413942337,
-0.019766652956604958,
-0.0021042670123279095,
-0.06242762506008148,
-0.0640064924955368,
-0.06666705012321472,
0.025215718895196915,
0.11501684784889221,
0.00009890004730550572,
-0.034326571971178055,
-0.14895088970661163,
0.07304647564888,
0.15511299669742584,
-0.06800216436386108,
0.03330863267183304,
0.002821339527145028,
0.07999145984649658,
0.03497416898608208,
-0.07791344821453094,
0.0639711245894432,
-0.06748698651790619,
-0.1804516464471817,
-0.04799564182758331,
0.10259280353784561,
0.07122304290533066,
0.04163262993097305,
-0.005852058529853821,
0.048003338277339935,
-0.027816174551844597,
-0.09144426137208939,
0.030038965865969658,
0.030347025021910667,
0.03544141352176666,
0.04249662160873413,
-0.07723208516836166,
0.08428842574357986,
-0.04435203969478607,
-0.019612392410635948,
0.12012224644422531,
0.23134951293468475,
-0.10445297509431839,
0.09613119810819626,
0.05756187066435814,
-0.06093998998403549,
-0.16645582020282745,
0.07310107350349426,
0.10587991774082184,
0.012434879317879677,
0.05968965217471123,
-0.21634604036808014,
0.12254747003316879,
0.10163421183824539,
-0.013986978679895401,
0.039861973375082016,
-0.2772581875324249,
-0.11997090280056,
0.04959737882018089,
0.1249200627207756,
0.08516565710306168,
-0.1253783106803894,
-0.018411992117762566,
-0.01455470360815525,
-0.12692977488040924,
0.07839410752058029,
-0.11335595697164536,
0.13137294352054596,
-0.02441721223294735,
0.11071912199258804,
0.012263141572475433,
-0.026983320713043213,
0.10665245354175568,
0.051048360764980316,
0.09680265933275223,
-0.04242125153541565,
0.0008030617609620094,
0.059671372175216675,
-0.04851177707314491,
-0.0003831486974377185,
-0.06724375486373901,
0.08889355510473251,
-0.1359378546476364,
-0.0073730009607970715,
-0.08925200253725052,
0.04216904565691948,
-0.04129071906208992,
-0.06584298610687256,
-0.04130923002958298,
0.057237230241298676,
0.04462851211428642,
-0.03318339213728905,
0.038041748106479645,
-0.02538708783686161,
0.10419046878814697,
0.02572757378220558,
0.08739491552114487,
0.018781917169690132,
-0.05469613894820213,
0.0222307201474905,
-0.01203833892941475,
0.06375493854284286,
-0.16931360960006714,
0.008913093246519566,
0.09847177565097809,
0.0681692361831665,
0.10118985921144485,
0.042510759085416794,
-0.04813612252473831,
0.016770750284194946,
0.027815664187073708,
-0.0963275134563446,
-0.11573757231235504,
0.0402163602411747,
-0.036473166197538376,
-0.1475297510623932,
0.04327625036239624,
0.11785668134689331,
-0.0398615263402462,
-0.03159072995185852,
-0.019855935126543045,
0.0037732613272964954,
-0.02055766060948372,
0.18163976073265076,
0.06077154353260994,
0.06046322360634804,
-0.10335573554039001,
0.1200672909617424,
0.03437676280736923,
-0.0278752651065588,
0.05107831954956055,
0.08263475447893143,
-0.10074364393949509,
-0.006479913368821144,
0.07808353006839752,
0.1265607625246048,
-0.053905073553323746,
0.0016779415309429169,
-0.1015046089887619,
-0.08774729073047638,
0.058867115527391434,
0.14253194630146027,
0.0499124675989151,
-0.01698201708495617,
-0.0452391691505909,
0.045904356986284256,
-0.14019620418548584,
0.07424265891313553,
0.03238886222243309,
0.06421931087970734,
-0.07732030749320984,
0.0629189983010292,
0.003518822370097041,
0.018712127581238747,
-0.014624999836087227,
0.0033841636031866074,
-0.09688498824834824,
-0.007910031825304031,
-0.08206424117088318,
0.001419175649061799,
0.00042626840877346694,
0.018417099490761757,
-0.02467612363398075,
-0.07110178470611572,
-0.04448733478784561,
0.03690827637910843,
-0.08673158288002014,
-0.05098595470190048,
0.007276229094713926,
0.0428798645734787,
-0.12383051216602325,
-0.004626845940947533,
0.028589822351932526,
-0.09795232862234116,
0.09825449436903,
0.07283060997724533,
0.021028736606240273,
0.029385562986135483,
-0.12101446837186813,
-0.034732162952423096,
-0.014132540673017502,
-0.010377208702266216,
0.06033676490187645,
-0.09900601953268051,
-0.005498679354786873,
-0.0458555743098259,
0.06301075220108032,
0.012761988677084446,
0.06080865114927292,
-0.14040958881378174,
0.01479923352599144,
-0.0701168105006218,
-0.04546847566962242,
-0.07861220091581345,
0.04037880524992943,
0.09315455704927444,
0.059462208300828934,
0.14078958332538605,
-0.07476102560758591,
0.025828685611486435,
-0.20473355054855347,
-0.03684413060545921,
-0.013452713377773762,
-0.056516293436288834,
-0.14484578371047974,
-0.04611089453101158,
0.08267048746347427,
-0.040260735899209976,
0.0897064134478569,
-0.026208916679024696,
0.07348726689815521,
0.03729574382305145,
-0.04909645393490791,
-0.033153850585222244,
-0.008195783942937851,
0.2010374665260315,
0.07097500562667847,
-0.01443951204419136,
0.10319852828979492,
0.0007397759472951293,
0.030461663380265236,
0.0472513847053051,
0.17154298722743988,
0.22193783521652222,
0.041186150163412094,
0.049737680703401566,
0.06374591588973999,
-0.07770213484764099,
-0.06926584988832474,
0.17494048178195953,
-0.010896466672420502,
0.06735869497060776,
-0.046137869358062744,
0.19247157871723175,
0.11964566260576248,
-0.16703090071678162,
0.04853387549519539,
-0.04664462432265282,
-0.08169623464345932,
-0.11845622956752777,
-0.008397877216339111,
-0.08347418159246445,
-0.1240638792514801,
0.037057891488075256,
-0.1196179911494255,
0.046321433037519455,
0.1100226640701294,
0.015027343295514584,
0.03544915094971657,
0.12722350656986237,
-0.008450638502836227,
-0.0056700799614191055,
0.06776806712150574,
0.0042700315825641155,
-0.010133970528841019,
-0.039929550141096115,
-0.07424503564834595,
0.059784095734357834,
0.0010776215931400657,
0.08099131286144257,
-0.04608422890305519,
-0.016257451847195625,
0.030511489138007164,
-0.030504880473017693,
-0.07962893694639206,
0.025574585422873497,
0.045285411179065704,
0.052903901785612106,
0.05060030519962311,
0.04179464653134346,
-0.01091034710407257,
-0.03312939405441284,
0.3271574378013611,
-0.06659171730279922,
-0.1020965650677681,
-0.12437672168016434,
0.21582894027233124,
0.032728083431720734,
-0.0273496825248003,
0.031818270683288574,
-0.08586812764406204,
-0.0009119988535530865,
0.16877597570419312,
0.1770215928554535,
-0.06574545055627823,
-0.02047206647694111,
-0.00043754387297667563,
-0.01611470989882946,
-0.030869588255882263,
0.12578365206718445,
0.09751762449741364,
-0.012724791653454304,
-0.06191810593008995,
-0.016107797622680664,
-0.015598705969750881,
-0.03368614986538887,
-0.04172007367014885,
0.04907054826617241,
0.021407706663012505,
-0.026788363233208656,
-0.04264840856194496,
0.07636930793523788,
0.003902614815160632,
-0.2541002929210663,
0.06232401356101036,
-0.15591491758823395,
-0.1763627976179123,
-0.0515911765396595,
0.030254626646637917,
0.0052026426419615746,
0.05751308798789978,
-0.014957377687096596,
0.003364293137565255,
0.08678531646728516,
-0.010884638875722885,
-0.033961761742830276,
-0.12193848937749863,
0.1234760656952858,
-0.04865715280175209,
0.17013269662857056,
-0.03147700056433678,
0.04197582229971886,
0.11646153032779694,
0.03011048585176468,
-0.13573141396045685,
0.036099307239055634,
0.06192294508218765,
-0.09746985882520676,
0.020927794277668,
0.14979863166809082,
-0.04728787764906883,
0.09726450592279434,
0.04562488570809364,
-0.10988105833530426,
0.0018747284775599837,
-0.0674610584974289,
-0.033391643315553665,
-0.08656822890043259,
-0.009354772046208382,
-0.06508143246173859,
0.16622544825077057,
0.22247016429901123,
-0.037655431777238846,
0.008420797064900398,
-0.0985039547085762,
0.011664996854960918,
0.0694054663181305,
0.03668040782213211,
-0.04942822456359863,
-0.18190421164035797,
0.005366261582821608,
0.06045999750494957,
-0.0018884111195802689,
-0.25268444418907166,
-0.07157571613788605,
0.036469824612140656,
-0.02986500971019268,
-0.03336765989661217,
0.10966367274522781,
0.04766155406832695,
0.05043015629053116,
-0.031252458691596985,
-0.15104660391807556,
-0.03350096195936203,
0.1565546840429306,
-0.17338965833187103,
-0.036020420491695404
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-16-finetuned-squad-seed-8
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
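As a quick usage sketch (not part of the auto-generated card), the model can be loaded with the standard `transformers` question-answering pipeline; the question and context strings below are illustrative only:

```python
from transformers import pipeline

# Minimal inference sketch; the model id is this card's repository name.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-8",
)
result = qa(
    question="What dataset was the model fine-tuned on?",  # illustrative
    context="The model was fine-tuned on the SQuAD dataset.",  # illustrative
)
print(result["answer"], result["score"])
```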
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hypothetical `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
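The training script itself is not included in the card, but a `TrainingArguments` configuration matching the listed values would look roughly like this; `output_dir` is an assumption, not taken from the card:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-16-finetuned-squad-seed-8",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,  # "training_steps: 200" in the card
)
```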
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-16-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-16-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-16-finetuned-squad-seed-8
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-16-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-16-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-16-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08897849172353745,
0.09449239075183868,
-0.0025626488495618105,
0.07911036908626556,
0.14089491963386536,
0.029943155124783516,
0.09560944885015488,
0.13690441846847534,
-0.11306044459342957,
0.04514902085065842,
0.09593816846609116,
0.07744467258453369,
0.029222702607512474,
0.14419570565223694,
-0.029914425686001778,
-0.2440173476934433,
-0.008123977109789848,
-0.02215053141117096,
-0.10409248620271683,
0.11259618401527405,
0.10013575106859207,
-0.09914033859968185,
0.06715784966945648,
-0.019340569153428078,
-0.1758774220943451,
0.014524750411510468,
-0.018093788996338844,
-0.05202614143490791,
0.11705979704856873,
-0.0016616766806691885,
0.07822990417480469,
0.009193328209221363,
0.11635525524616241,
-0.19397512078285217,
0.018146678805351257,
0.07210232317447662,
0.044252172112464905,
0.0944104865193367,
0.005402953829616308,
-0.020609688013792038,
0.11861006170511246,
-0.12955346703529358,
0.09869275242090225,
0.03251520171761513,
-0.09847885370254517,
-0.20476388931274414,
-0.09692994505167007,
0.012378783896565437,
0.04341704398393631,
0.08870398253202438,
0.007240116596221924,
0.14860619604587555,
-0.10682833194732666,
0.0809374451637268,
0.2315046489238739,
-0.2653957009315491,
-0.08201059699058533,
0.047048404812812805,
0.061516743153333664,
0.08602852374315262,
-0.12272302806377411,
-0.013274594210088253,
0.010671841911971569,
0.020051471889019012,
0.10131631046533585,
-0.027728978544473648,
-0.07930238544940948,
0.012730238027870655,
-0.11192115396261215,
0.00009330282773589715,
0.11473702639341354,
0.03776658698916435,
-0.05472202226519585,
-0.07914955914020538,
-0.03873975947499275,
-0.0618484765291214,
-0.035122115164995193,
-0.014766396023333073,
0.03646257892251015,
-0.06196916475892067,
-0.14040371775627136,
-0.04715878888964653,
-0.04903033748269081,
-0.09165170788764954,
0.0008795703761279583,
0.20898091793060303,
0.038534410297870636,
0.021430728957057,
-0.04867259040474892,
0.10506581515073776,
0.013656972907483578,
-0.12359325587749481,
-0.03378298133611679,
-0.004812759812921286,
-0.09491255134344101,
-0.035156141966581345,
-0.05646946281194687,
0.03018687292933464,
0.03922801837325096,
0.2288445085287094,
-0.027032487094402313,
0.07356388866901398,
0.036445554345846176,
-0.012837025336921215,
-0.027829639613628387,
0.14856891334056854,
-0.02826724387705326,
-0.07626131176948547,
0.010066408663988113,
0.06370224058628082,
0.005517171695828438,
-0.006068473216146231,
-0.06325310468673706,
-0.04437333345413208,
0.0627940371632576,
0.057287245988845825,
-0.05054083839058876,
0.02723408304154873,
-0.009979022666811943,
-0.02387266419827938,
0.002428300678730011,
-0.11928845942020416,
0.009265684522688389,
-0.009241205640137196,
-0.07913385331630707,
-0.05405424162745476,
0.013825987465679646,
-0.011611184105277061,
0.009647334925830364,
0.09017722308635712,
-0.07353446632623672,
-0.03613167628645897,
-0.07977558672428131,
-0.07462967187166214,
-0.014785557053983212,
-0.16107024252414703,
0.0225005391985178,
-0.06974910199642181,
-0.15332208573818207,
-0.032270535826683044,
0.05027635395526886,
-0.079750657081604,
-0.03490389138460159,
-0.03250445798039436,
-0.08071595430374146,
0.017463672906160355,
0.001074913190677762,
0.21552544832229614,
-0.04878894239664078,
0.09481299668550491,
0.014403347857296467,
0.053885024040937424,
0.005503335501998663,
0.03835202008485794,
-0.07873708754777908,
0.013705069199204445,
-0.17734533548355103,
0.07589595019817352,
-0.08961532264947891,
0.030265413224697113,
-0.14474304020404816,
-0.08716268837451935,
-0.0018540540477260947,
-0.01943156309425831,
0.08242923021316528,
0.10473448038101196,
-0.1262761503458023,
-0.02280481904745102,
0.1240217462182045,
-0.056488651782274246,
-0.05707983300089836,
0.06545467674732208,
-0.07371643930673599,
0.0820823535323143,
0.048877306282520294,
0.1895265132188797,
0.09951853007078171,
-0.10795323550701141,
0.025870569050312042,
0.01863519847393036,
0.03830433264374733,
0.0014492090558633208,
0.05609968677163124,
0.0004157305520493537,
0.022888058796525,
0.016177713871002197,
-0.08204857259988785,
0.0157491322606802,
-0.09225873649120331,
-0.062404289841651917,
-0.042147472500801086,
-0.09011655300855637,
0.006458087358623743,
0.00947142019867897,
0.028873803094029427,
-0.07974280416965485,
-0.08723127096891403,
0.06329745799303055,
0.14026473462581635,
-0.04522141441702843,
0.011132122948765755,
-0.07928153872489929,
0.0037921047769486904,
-0.029349319636821747,
-0.022006805986166,
-0.1955030858516693,
-0.060316622257232666,
0.025435764342546463,
0.01104202214628458,
0.044183891266584396,
-0.00600671349093318,
0.07979811728000641,
0.021702107042074203,
-0.04960411414504051,
-0.0037484189961105585,
-0.08960315585136414,
-0.01052574161440134,
-0.09160987287759781,
-0.217259019613266,
-0.055492330342531204,
-0.03902740031480789,
0.15296009182929993,
-0.17343194782733917,
0.0007162298425100744,
-0.01991233229637146,
0.10871343314647675,
0.04167479649186134,
-0.05041460692882538,
-0.00047193674254231155,
0.029334647580981255,
0.015221468172967434,
-0.09965639561414719,
0.034500300884246826,
0.015960844233632088,
-0.09844808280467987,
-0.025446536019444466,
-0.10257513076066971,
0.003182763233780861,
0.07420189678668976,
0.07306872308254242,
-0.10779500007629395,
-0.01862812414765358,
-0.062299687415361404,
-0.028657661750912666,
-0.04617060720920563,
0.03427696228027344,
0.17348100244998932,
0.016821635887026787,
0.11347188800573349,
-0.07533484697341919,
-0.08589013665914536,
0.018747147172689438,
0.009162467904388905,
0.05962658300995827,
0.11379291862249374,
0.07670103758573532,
-0.09697328507900238,
0.05802800878882408,
0.0882202759385109,
-0.0490032322704792,
0.13884267210960388,
-0.04899415001273155,
-0.07883167266845703,
-0.029574433341622353,
0.001825522631406784,
-0.001049100887030363,
0.15036803483963013,
-0.04831719398498535,
0.009600451216101646,
0.03463956341147423,
0.029934823513031006,
0.00866115465760231,
-0.16199944913387299,
-0.024018175899982452,
0.020562054589390755,
-0.04797745123505592,
-0.027678195387125015,
0.011426541954278946,
0.012938033789396286,
0.09119073301553726,
0.049021895974874496,
0.00027061012224294245,
0.00559447193518281,
-0.013391788117587566,
-0.049095164984464645,
0.20150160789489746,
-0.09185749292373657,
-0.04539509862661362,
-0.08458293974399567,
-0.0017720804316923022,
-0.010966192930936813,
-0.0360492542386055,
0.017160391435027122,
-0.10130750387907028,
-0.023441985249519348,
-0.06566648185253143,
0.0038382788188755512,
-0.04791899025440216,
0.010523408651351929,
0.0009077252470888197,
0.018904533237218857,
0.05772647634148598,
-0.13437166810035706,
0.014433507807552814,
-0.06277674436569214,
-0.11335708945989609,
0.030962035059928894,
0.05248843505978584,
0.0879150927066803,
0.06328440457582474,
-0.026183949783444405,
0.01690380461513996,
-0.045341912657022476,
0.23328588902950287,
-0.08615683019161224,
0.005312077701091766,
0.124308742582798,
0.023685498163104057,
0.037302225828170776,
0.10134750604629517,
0.027808768674731255,
-0.10024159401655197,
0.04378780350089073,
0.07446494698524475,
-0.04437733069062233,
-0.25499027967453003,
0.008780800737440586,
-0.044002167880535126,
-0.08375908434391022,
0.08886554837226868,
0.0489811934530735,
-0.03911112621426582,
0.06529835611581802,
0.002826412906870246,
0.00891309417784214,
-0.022521289065480232,
0.08783849328756332,
0.08532794564962387,
0.05501227453351021,
0.1059509888291359,
-0.040489327162504196,
-0.017833726480603218,
0.0646096020936966,
0.0279867984354496,
0.3082430064678192,
-0.04781418666243553,
0.10080569237470627,
0.05261795595288277,
0.14739945530891418,
-0.02145645208656788,
0.03754621371626854,
0.011127009987831116,
-0.006054471712559462,
-0.02910328470170498,
-0.05340911075472832,
-0.025126835331320763,
0.005832923110574484,
-0.0701122134923935,
0.04216773808002472,
-0.056628573685884476,
0.045645572245121,
0.016230110079050064,
0.29051029682159424,
0.0014898310182616115,
-0.26436683535575867,
-0.0998016968369484,
-0.01266718003898859,
-0.038492150604724884,
-0.050098028033971786,
0.01324003841727972,
0.11851461231708527,
-0.13056057691574097,
0.02458794228732586,
-0.06511025130748749,
0.08527129143476486,
-0.027802271768450737,
-0.0052635520696640015,
0.0382511131465435,
0.16401615738868713,
-0.021561918780207634,
0.06174076348543167,
-0.22388136386871338,
0.23058542609214783,
0.008561399765312672,
0.12215426564216614,
-0.053769301623106,
0.006677819415926933,
0.023729264736175537,
0.0006661284132860601,
0.09528093785047531,
-0.002403741469606757,
-0.04697206988930702,
-0.14085015654563904,
-0.05237682908773422,
0.07088048756122589,
0.13883936405181885,
-0.05029356852173805,
0.10117098689079285,
-0.059329453855752945,
0.010271989740431309,
0.03702442720532417,
-0.0813392698764801,
-0.12055817991495132,
-0.10317949205636978,
-0.02013203874230385,
-0.0010792558314278722,
-0.06210354343056679,
-0.06394419819116592,
-0.06633282452821732,
0.023467492312192917,
0.11520074307918549,
0.0006325424765236676,
-0.034382399171590805,
-0.14895577728748322,
0.0736757218837738,
0.15443481504917145,
-0.0682108998298645,
0.033525802195072174,
0.002857823856174946,
0.07997865974903107,
0.03508531302213669,
-0.07705704122781754,
0.06376540660858154,
-0.06739559769630432,
-0.18003283441066742,
-0.048275191336870193,
0.10208557546138763,
0.07111804932355881,
0.04169575497508049,
-0.005612098146229982,
0.04762334004044533,
-0.028157249093055725,
-0.09139031916856766,
0.028844168409705162,
0.03138506039977074,
0.03495534136891365,
0.04257919639348984,
-0.07731073349714279,
0.08518952131271362,
-0.04411274567246437,
-0.01988474279642105,
0.12112016975879669,
0.23104530572891235,
-0.10476337373256683,
0.09534166753292084,
0.05769256129860878,
-0.061037588864564896,
-0.16578489542007446,
0.07297028601169586,
0.10546897351741791,
0.012601504102349281,
0.05989083647727966,
-0.21567685902118683,
0.12285500764846802,
0.10278241336345673,
-0.013713900931179523,
0.040095459669828415,
-0.2773910462856293,
-0.12022878974676132,
0.050492338836193085,
0.1249474361538887,
0.08652304112911224,
-0.12597507238388062,
-0.018442224711179733,
-0.015202295035123825,
-0.1282137930393219,
0.07713199406862259,
-0.11258784681558609,
0.13118460774421692,
-0.024368196725845337,
0.10965917259454727,
0.012317942455410957,
-0.027007868513464928,
0.1071203425526619,
0.05160406604409218,
0.0968041718006134,
-0.04280310124158859,
0.0009974086424335837,
0.058985959738492966,
-0.04867379367351532,
0.000052342376875458285,
-0.06699438393115997,
0.08890911191701889,
-0.1376911699771881,
-0.007457916624844074,
-0.0880306139588356,
0.04224095493555069,
-0.04115236923098564,
-0.06587381660938263,
-0.04137076810002327,
0.05641121789813042,
0.044343605637550354,
-0.03300477936863899,
0.0390789657831192,
-0.025287646800279617,
0.10350871831178665,
0.026329223066568375,
0.08617927879095078,
0.01607373170554638,
-0.05477985739707947,
0.02203836478292942,
-0.01205514371395111,
0.06363420188426971,
-0.16918039321899414,
0.009483755566179752,
0.09881393611431122,
0.06844610720872879,
0.10173878073692322,
0.041456859558820724,
-0.047490935772657394,
0.016994645819067955,
0.027508987113833427,
-0.09590573608875275,
-0.11471029371023178,
0.03977064788341522,
-0.037295740097761154,
-0.14725427329540253,
0.04253878816962242,
0.11817964166402817,
-0.04054833576083183,
-0.03115263022482395,
-0.020141901448369026,
0.0028902620542794466,
-0.020701313391327858,
0.18116457760334015,
0.06166934594511986,
0.060145508497953415,
-0.10319654643535614,
0.11964992433786392,
0.03465871140360832,
-0.026616420596837997,
0.0511050745844841,
0.08252044767141342,
-0.10079894214868546,
-0.006616515573114157,
0.07812238484621048,
0.12664078176021576,
-0.05466276407241821,
0.0007849952671676874,
-0.10199923068284988,
-0.08676699548959732,
0.058697961270809174,
0.1417839229106903,
0.050298500806093216,
-0.017209285870194435,
-0.04547687992453575,
0.04550414904952049,
-0.14013421535491943,
0.0740218460559845,
0.03249482065439224,
0.06459514051675797,
-0.07761073857545853,
0.06461259722709656,
0.003936434630304575,
0.01872156746685505,
-0.014506010338664055,
0.0031409289222210646,
-0.09675312042236328,
-0.007656671106815338,
-0.08383837342262268,
0.0016666611190885305,
0.0006319325766526163,
0.01876586303114891,
-0.024627340957522392,
-0.07102742791175842,
-0.04439385607838631,
0.036936305463314056,
-0.08621680736541748,
-0.05092566832900047,
0.0076248967088758945,
0.04271905869245529,
-0.12335021793842316,
-0.004862355999648571,
0.02827823907136917,
-0.09763700515031815,
0.09852009266614914,
0.07212615758180618,
0.020815176889300346,
0.02912437729537487,
-0.12143848091363907,
-0.03447943925857544,
-0.014132566750049591,
-0.00972711481153965,
0.060540180653333664,
-0.09768764674663544,
-0.004977663047611713,
-0.045671042054891586,
0.06283926963806152,
0.012561737559735775,
0.060849208384752274,
-0.14067493379116058,
0.014845182187855244,
-0.06997073441743851,
-0.04540560767054558,
-0.07898830622434616,
0.040166061371564865,
0.09272948652505875,
0.05897066369652748,
0.14054352045059204,
-0.07462476193904877,
0.025589918717741966,
-0.2047332227230072,
-0.03698755428195,
-0.013357259333133698,
-0.05629434064030647,
-0.14483685791492462,
-0.046988457441329956,
0.08250477910041809,
-0.03983870521187782,
0.09125298261642456,
-0.0260888934135437,
0.07369998097419739,
0.03700781241059303,
-0.04975702613592148,
-0.03291542828083038,
-0.008656565099954605,
0.20145845413208008,
0.0715239942073822,
-0.014307886362075806,
0.10238739848136902,
0.0009936373680830002,
0.031491249799728394,
0.04665624722838402,
0.16975581645965576,
0.2220512479543686,
0.04060947149991989,
0.04963763803243637,
0.06336470693349838,
-0.0773809477686882,
-0.06949321180582047,
0.17447611689567566,
-0.01109266746789217,
0.06796605885028839,
-0.04623553901910782,
0.19280867278575897,
0.1194143071770668,
-0.1668972223997116,
0.04831039905548096,
-0.04601018875837326,
-0.08188049495220184,
-0.1185380294919014,
-0.007712105754762888,
-0.08351694792509079,
-0.12403383105993271,
0.03654125705361366,
-0.1191638931632042,
0.046391576528549194,
0.11053867638111115,
0.01484399102628231,
0.03522884100675583,
0.1263372004032135,
-0.008523521013557911,
-0.0055083464831113815,
0.06778661906719208,
0.004319800529628992,
-0.00980112049728632,
-0.04115239903330803,
-0.07496143132448196,
0.05946394056081772,
0.001178314327262342,
0.08145392686128616,
-0.046348948031663895,
-0.014992552809417248,
0.03088892437517643,
-0.03025720827281475,
-0.07990012317895889,
0.02567409537732601,
0.04463065043091774,
0.05290041118860245,
0.049408555030822754,
0.042080022394657135,
-0.010770442895591259,
-0.03327565640211105,
0.3264743387699127,
-0.06611686944961548,
-0.10252929478883743,
-0.12426585704088211,
0.2147088199853897,
0.03303946182131767,
-0.027359995990991592,
0.03171057626605034,
-0.08598106354475021,
-0.0000069145753514021635,
0.1693010777235031,
0.17629779875278473,
-0.06664244085550308,
-0.020438270643353462,
-0.000166578363860026,
-0.016246693208813667,
-0.03175199776887894,
0.12613321840763092,
0.09772748500108719,
-0.013853035867214203,
-0.060872457921504974,
-0.016429590061306953,
-0.015711285173892975,
-0.0336148701608181,
-0.042860180139541626,
0.048323940485715866,
0.02134063094854355,
-0.02617153897881508,
-0.04222419112920761,
0.0759769156575203,
0.00454928120598197,
-0.25498902797698975,
0.06229766830801964,
-0.15581659972667694,
-0.1765640676021576,
-0.05156635120511055,
0.03053186647593975,
0.005830977112054825,
0.05678310617804527,
-0.014725148677825928,
0.0037773512303829193,
0.08776471763849258,
-0.011286494322121143,
-0.03404180705547333,
-0.12132053822278976,
0.12306404858827591,
-0.04788092151284218,
0.170121431350708,
-0.03144697844982147,
0.04221511632204056,
0.11619550734758377,
0.030338609591126442,
-0.1348983347415924,
0.0362924225628376,
0.06161404028534889,
-0.09673701226711273,
0.020943375304341316,
0.14855580031871796,
-0.047117095440626144,
0.09709218144416809,
0.045984022319316864,
-0.10941258817911148,
0.0013493925798684359,
-0.0683213546872139,
-0.033785946667194366,
-0.08634613454341888,
-0.010435055010020733,
-0.06475481390953064,
0.16643661260604858,
0.2219986915588379,
-0.037586405873298645,
0.00811247993260622,
-0.09853712469339371,
0.011327224783599377,
0.06978166848421097,
0.03685074299573898,
-0.049691662192344666,
-0.18182697892189026,
0.005838286597281694,
0.060057323426008224,
-0.001846238854341209,
-0.25204041600227356,
-0.0720270499587059,
0.036300115287303925,
-0.030021946877241135,
-0.03307727351784706,
0.10983352363109589,
0.04799850285053253,
0.050611454993486404,
-0.031236005946993828,
-0.15108317136764526,
-0.033992867916822433,
0.15664342045783997,
-0.1733272671699524,
-0.036146361380815506
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-256-finetuned-squad-seed-0
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
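For a lower-level usage sketch (again not part of the original card), the model and tokenizer can be loaded directly; the question and context below are illustrative:

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

inputs = tokenizer(
    "What is SQuAD?",  # illustrative question
    "SQuAD is a reading-comprehension dataset built from Wikipedia articles.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely answer span (greedy start/end; fine for a sketch).
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```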
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hypothetical `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
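Unlike the 200-step k-16 runs, this run is specified in epochs. A hypothetical `TrainingArguments` reconstruction (with an assumed `output_dir`) would be:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction; only values named in the card are set.
training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-256-finetuned-squad-seed-0",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```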
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-256-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-256-finetuned-squad-seed-0
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-256-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-256-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-256-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07800672203302383,
0.10225936770439148,
-0.0025845039635896683,
0.07721872627735138,
0.1317368596792221,
0.03332017734646797,
0.10688326507806778,
0.12336620688438416,
-0.1277405172586441,
0.0674908384680748,
0.09154348820447922,
0.0851438045501709,
0.03286808729171753,
0.13002271950244904,
-0.03910963982343674,
-0.23300927877426147,
0.007463006768375635,
-0.017066413536667824,
-0.0678229108452797,
0.10217814147472382,
0.08780647814273834,
-0.09788112342357635,
0.08093499392271042,
-0.003006358863785863,
-0.18468394875526428,
0.028195058926939964,
-0.020846785977482796,
-0.05209976062178612,
0.09860927611589432,
-0.004651134368032217,
0.08289556950330734,
0.00763333635404706,
0.11822425574064255,
-0.1920214593410492,
0.013423536904156208,
0.07358050346374512,
0.033951207995414734,
0.09502726048231125,
0.020134922116994858,
-0.003044219221919775,
0.16614067554473877,
-0.13582424819469452,
0.09836411476135254,
0.028821319341659546,
-0.08560315519571304,
-0.16798467934131622,
-0.0952569842338562,
0.014026900753378868,
0.03487040475010872,
0.08733347058296204,
0.00848271045833826,
0.17905782163143158,
-0.09203272312879562,
0.07981416583061218,
0.23622186481952667,
-0.27732986211776733,
-0.07873233407735825,
0.04161493107676506,
0.04674601927399635,
0.07760562002658844,
-0.11872677505016327,
-0.01560475304722786,
0.019967323169112206,
0.02972780168056488,
0.09339321404695511,
-0.025527456775307655,
-0.0879148319363594,
-0.006956905592232943,
-0.11906731873750687,
0.004192133899778128,
0.1121354028582573,
0.04411734640598297,
-0.05031663551926613,
-0.04939214885234833,
-0.058049216866493225,
-0.07049362361431122,
-0.03471026569604874,
-0.033994004130363464,
0.043172743171453476,
-0.057843081653118134,
-0.12233486771583557,
-0.034097351133823395,
-0.048184510320425034,
-0.07745031267404556,
-0.020854230970144272,
0.20718641579151154,
0.0491451695561409,
0.03291608393192291,
-0.050910353660583496,
0.08158791810274124,
0.0119320098310709,
-0.12920315563678741,
-0.029794437810778618,
0.002689796034246683,
-0.07982254028320312,
-0.03793743997812271,
-0.0559835322201252,
0.015021334402263165,
0.04344067722558975,
0.21058784425258636,
-0.055348291993141174,
0.0834747701883316,
0.031206853687763214,
-0.020529404282569885,
-0.02139771170914173,
0.12499166280031204,
-0.019650109112262726,
-0.0769800916314125,
0.02857978828251362,
0.05823469161987305,
0.025852683931589127,
0.0025900332257151604,
-0.05575761944055557,
-0.03192440792918205,
0.08218612521886826,
0.031471528112888336,
-0.06123236566781998,
0.026312755420804024,
0.0028670444153249264,
-0.020019015297293663,
0.005030014552175999,
-0.11393914371728897,
0.01683136448264122,
-0.006466306280344725,
-0.07522640377283096,
-0.014285795390605927,
0.008937866427004337,
-0.015955952927470207,
0.010155296884477139,
0.09985122084617615,
-0.08921109139919281,
-0.021294686943292618,
-0.07645150274038315,
-0.07033924758434296,
-0.0008787085535004735,
-0.14917325973510742,
0.012793418951332569,
-0.07379869371652603,
-0.15422414243221283,
-0.03211406245827675,
0.0426563061773777,
-0.07064593583345413,
-0.028641078621149063,
-0.03835519403219223,
-0.07654482126235962,
0.026336301118135452,
0.003033855464309454,
0.18640850484371185,
-0.05295064300298691,
0.08038045465946198,
0.020015377551317215,
0.0501866340637207,
-0.022318601608276367,
0.03125868737697601,
-0.08995363116264343,
0.003613400738686323,
-0.1693422645330429,
0.06247539073228836,
-0.07646699249744415,
0.01562226191163063,
-0.1295938938856125,
-0.08716829121112823,
-0.026911864057183266,
-0.02856355533003807,
0.08397065103054047,
0.10174284130334854,
-0.13821811974048615,
-0.027750827372074127,
0.11250318586826324,
-0.07718920707702637,
-0.057288289070129395,
0.06705307960510254,
-0.06719391793012619,
0.05058109760284424,
0.0571860671043396,
0.18408752977848053,
0.07495664805173874,
-0.11410631984472275,
-0.023295368999242783,
0.002527422271668911,
0.029860569164156914,
-0.010841771960258484,
0.05099882557988167,
0.010240238159894943,
0.02079443261027336,
0.016786346212029457,
-0.03919301927089691,
0.004026554059237242,
-0.09714383631944656,
-0.06249191612005234,
-0.057342011481523514,
-0.08213990181684494,
-0.034704919904470444,
0.0135071175172925,
0.0379941463470459,
-0.08080829679965973,
-0.08426616340875626,
0.08145315945148468,
0.14160683751106262,
-0.04373607039451599,
0.020982909947633743,
-0.07303290069103241,
0.019271880388259888,
-0.05329539626836777,
-0.03059256263077259,
-0.20457015931606293,
-0.06660483777523041,
0.0335550419986248,
-0.027738414704799652,
0.05741046369075775,
0.0025820983573794365,
0.07140097767114639,
0.037598080933094025,
-0.0373920239508152,
0.006147291976958513,
-0.08993794023990631,
-0.0071127950213849545,
-0.0932881310582161,
-0.2215956747531891,
-0.03817354515194893,
-0.03059385158121586,
0.12722188234329224,
-0.16183745861053467,
-0.009286727756261826,
-0.032653070986270905,
0.12040399760007858,
0.029637999832630157,
-0.06223716586828232,
-0.01451119314879179,
0.031485412269830704,
0.0023130958434194326,
-0.09228924661874771,
0.03206043690443039,
0.017375245690345764,
-0.07259120792150497,
-0.05731379613280296,
-0.11572571843862534,
0.0057244328781962395,
0.07866473495960236,
0.06267692148685455,
-0.09643501788377762,
0.006379707250744104,
-0.0646083801984787,
-0.033672261983156204,
-0.057725582271814346,
0.04153924435377121,
0.17789915204048157,
0.0061117480508983135,
0.10706009715795517,
-0.08014638721942902,
-0.0728408619761467,
0.021963227540254593,
0.005709188524633646,
0.04248011112213135,
0.09227990359067917,
0.11140455305576324,
-0.1270526796579361,
0.06486964970827103,
0.08241984248161316,
-0.06283439695835114,
0.12615327537059784,
-0.03935469314455986,
-0.07921804487705231,
-0.034268248826265335,
-0.018311146646738052,
-0.01474323496222496,
0.13733533024787903,
-0.05202512443065643,
0.025284437462687492,
0.030053094029426575,
0.0391962006688118,
0.020326167345046997,
-0.1529851108789444,
-0.002503100549802184,
0.00843567494302988,
-0.04356488212943077,
-0.017621716484427452,
0.022538434714078903,
0.019482025876641273,
0.09757302701473236,
0.03364189714193344,
-0.01627814956009388,
-0.006669192109256983,
-0.0037636368069797754,
-0.05211666598916054,
0.19125297665596008,
-0.09105036407709122,
-0.04285011440515518,
-0.07735612988471985,
0.0011712867999449372,
-0.039168741554021835,
-0.04288351163268089,
0.02688075229525566,
-0.08817027509212494,
-0.03812696784734726,
-0.0738954246044159,
-0.0003017642884515226,
-0.04709676653146744,
0.02590418979525566,
0.029123803600668907,
0.003658316796645522,
0.06176386773586273,
-0.13393598794937134,
0.00490594794973731,
-0.07247789204120636,
-0.10374826937913895,
0.016797229647636414,
0.06481671333312988,
0.09209024906158447,
0.05812736973166466,
-0.030178211629390717,
0.021605318412184715,
-0.03205633535981178,
0.25148600339889526,
-0.0556563101708889,
-0.001455457415431738,
0.1078583300113678,
0.02446012943983078,
0.04580707848072052,
0.0918811559677124,
0.03503842651844025,
-0.1008211150765419,
0.03070993721485138,
0.08604692667722702,
-0.03541106730699539,
-0.23802919685840607,
-0.006145552732050419,
-0.035236794501543045,
-0.11332448571920395,
0.08263682574033737,
0.050322338938713074,
-0.04363701492547989,
0.06449198722839355,
0.009407518431544304,
0.023440087214112282,
-0.051724083721637726,
0.0923539251089096,
0.10137689113616943,
0.07506328821182251,
0.10233888775110245,
-0.04814610630273819,
-0.021240277215838432,
0.0695979967713356,
-0.004439075011759996,
0.2922068238258362,
-0.028658397495746613,
0.06913888454437256,
0.053251199424266815,
0.14037834107875824,
-0.02161249704658985,
0.03614475578069687,
0.006530426908284426,
-0.003716247621923685,
-0.02547358348965645,
-0.058054640889167786,
-0.02881595492362976,
-0.0016398148145526648,
-0.07827887684106827,
0.05568530037999153,
-0.06179637461900711,
0.0651247426867485,
0.020118726417422295,
0.26245418190956116,
-0.0025927622336894274,
-0.2807259261608124,
-0.07909581810235977,
-0.02181602269411087,
-0.03824927657842636,
-0.044207409024238586,
0.013453575782477856,
0.1000497043132782,
-0.10370311141014099,
0.05177287757396698,
-0.058509424328804016,
0.08081280440092087,
-0.028534352779388428,
-0.0043068621307611465,
0.032318148761987686,
0.1846705675125122,
-0.015649979934096336,
0.0500909760594368,
-0.19975098967552185,
0.21578387916088104,
0.01847701147198677,
0.1325729638338089,
-0.05097654461860657,
0.008038468658924103,
0.02415861375629902,
-0.0013842687476426363,
0.07535786926746368,
-0.005846268963068724,
-0.07620719820261002,
-0.12579302489757538,
-0.07325536757707596,
0.08111783117055893,
0.14055226743221283,
-0.015364636667072773,
0.10265272855758667,
-0.049021270126104355,
0.01916327141225338,
0.04078218713402748,
-0.06805813312530518,
-0.15689454972743988,
-0.09724682569503784,
-0.01747671514749527,
0.03841447830200195,
-0.0948728546500206,
-0.04662425071001053,
-0.07560817152261734,
-0.010788053274154663,
0.11439413577318192,
0.024435900151729584,
-0.019412804394960403,
-0.13718119263648987,
0.08837238699197769,
0.1495472937822342,
-0.07359297573566437,
0.023189686238765717,
-0.007849818095564842,
0.06448496133089066,
0.04336687549948692,
-0.09547756612300873,
0.04740587994456291,
-0.05768342688679695,
-0.1601824164390564,
-0.04619917646050453,
0.08972310274839401,
0.07195397466421127,
0.039756275713443756,
-0.005911511834710836,
0.04974231868982315,
-0.020455680787563324,
-0.0999801754951477,
0.01224841084331274,
0.03518061712384224,
0.05020906403660774,
0.03715069219470024,
-0.08316083997488022,
0.060648903250694275,
-0.03306252136826515,
-0.00475799897685647,
0.11388582736253738,
0.2419988363981247,
-0.08963356167078018,
0.08532404154539108,
0.05873668193817139,
-0.06751914322376251,
-0.1442885845899582,
0.06376736611127853,
0.10342554748058319,
0.00011852945317514241,
0.05601217597723007,
-0.1968754529953003,
0.1415814310312271,
0.11357633769512177,
-0.012154651805758476,
0.03710774704813957,
-0.2740436792373657,
-0.11809521913528442,
0.059390123933553696,
0.1318347454071045,
0.11858437210321426,
-0.13242384791374207,
-0.013534250669181347,
-0.017352184280753136,
-0.12329516559839249,
0.10859999805688858,
-0.11469127237796783,
0.13443879783153534,
-0.033810343593358994,
0.11005197465419769,
0.004888499155640602,
-0.030043404549360275,
0.10710491240024567,
0.04926195740699768,
0.09750523418188095,
-0.041566185653209686,
0.012916278094053268,
0.05988967418670654,
-0.04817540943622589,
0.014037348330020905,
-0.07422438263893127,
0.08441169559955597,
-0.12118180841207504,
-0.006677201949059963,
-0.07936465740203857,
0.05145619437098503,
-0.036451589316129684,
-0.05288419872522354,
-0.05349843204021454,
0.03682694584131241,
0.055661894381046295,
-0.03703819587826729,
0.0546269454061985,
-0.00020921956456732005,
0.09144553542137146,
0.02390259876847267,
0.06828057765960693,
0.0005175601108931005,
-0.047422055155038834,
0.020479535683989525,
-0.00874223094433546,
0.060991447418928146,
-0.13925085961818695,
0.005285789258778095,
0.10690641403198242,
0.05244826525449753,
0.09596865624189377,
0.04310653731226921,
-0.04742608219385147,
0.01251245941966772,
0.03738115727901459,
-0.11417794227600098,
-0.10134880244731903,
0.04714600741863251,
-0.03939148783683777,
-0.13886894285678864,
0.04685831069946289,
0.11486299335956573,
-0.04946766793727875,
-0.022206101566553116,
-0.018340134993195534,
0.005662701558321714,
-0.021459022536873817,
0.18510274589061737,
0.04276663810014725,
0.04067656397819519,
-0.10208074003458023,
0.13190899789333344,
0.02934652753174305,
-0.02396462671458721,
0.05910515412688255,
0.08553171157836914,
-0.09545121341943741,
0.0023411514703184366,
0.09330868721008301,
0.17376792430877686,
-0.07204730808734894,
-0.015499729663133621,
-0.10448800027370453,
-0.07152210175991058,
0.06171703338623047,
0.15906988084316254,
0.0559961199760437,
-0.01768953539431095,
-0.051292240619659424,
0.0407424196600914,
-0.14177174866199493,
0.06300480663776398,
0.031659964472055435,
0.0700208768248558,
-0.08742039650678635,
0.05774618312716484,
0.007694486994296312,
0.005257704760879278,
-0.016897404566407204,
0.014123204164206982,
-0.09263341873884201,
-0.02910575643181801,
-0.07633289694786072,
0.01068426575511694,
-0.013368427753448486,
0.01591932587325573,
-0.010275989770889282,
-0.067447230219841,
-0.0700102299451828,
0.03672854229807854,
-0.076960489153862,
-0.05271568149328232,
0.012259600684046745,
0.04343514144420624,
-0.1335192620754242,
0.0059754205867648125,
0.015720343217253685,
-0.0891558900475502,
0.08596930652856827,
0.08910129964351654,
0.026006311178207397,
0.03429507091641426,
-0.128076434135437,
-0.03251941129565239,
0.013662472367286682,
0.002672768197953701,
0.06480336934328079,
-0.09481322765350342,
-0.003558133030310273,
-0.02128974162042141,
0.0762256607413292,
0.010139629244804382,
0.0818251371383667,
-0.13136322796344757,
0.00870649702847004,
-0.08364242315292358,
-0.04309651628136635,
-0.06605789810419083,
0.016873257234692574,
0.10191860795021057,
0.052827391773462296,
0.16357718408107758,
-0.07748238742351532,
0.01809750497341156,
-0.20785856246948242,
-0.02784031629562378,
-0.004957671742886305,
-0.054128292948007584,
-0.13755927979946136,
-0.04011814668774605,
0.07743506878614426,
-0.03869186341762543,
0.0988655686378479,
-0.020522497594356537,
0.06271863728761673,
0.039292555302381516,
-0.03305970877408981,
-0.0628407672047615,
-0.028349537402391434,
0.19770027697086334,
0.07777154445648193,
-0.016177063807845116,
0.10778378695249557,
-0.005842720158398151,
0.051668211817741394,
0.03197834640741348,
0.20413321256637573,
0.2083648145198822,
0.007396540138870478,
0.07109919935464859,
0.06206300109624863,
-0.08104204386472702,
-0.0683097392320633,
0.1804201304912567,
-0.026639718562364578,
0.07408735156059265,
-0.029279401525855064,
0.18721066415309906,
0.11145743727684021,
-0.15033848583698273,
0.03162262216210365,
-0.03217759728431702,
-0.07701067626476288,
-0.14028477668762207,
0.0002861622197087854,
-0.09635642170906067,
-0.11898075044155121,
0.04472830891609192,
-0.12026321142911911,
0.055273137986660004,
0.0818696916103363,
0.01224046666175127,
0.03297588974237442,
0.12793949246406555,
-0.026052411645650864,
0.003911532927304506,
0.042503226548433304,
0.0074637774378061295,
-0.029052818194031715,
-0.040473781526088715,
-0.07725971192121506,
0.05037815496325493,
0.006516792345792055,
0.08783672749996185,
-0.04516245797276497,
-0.009310093708336353,
0.042384784668684006,
-0.029337013140320778,
-0.0762895792722702,
0.025352979078888893,
0.0363219752907753,
0.05724254250526428,
0.045620061457157135,
0.04441911727190018,
-0.006788288708776236,
-0.033183399587869644,
0.28052207827568054,
-0.05894406512379646,
-0.09522764384746552,
-0.11471950262784958,
0.20627672970294952,
0.040380749851465225,
-0.02935570478439331,
0.03894908353686333,
-0.0834268108010292,
-0.012894882820546627,
0.15524809062480927,
0.1558327078819275,
-0.06307315826416016,
-0.02482568472623825,
-0.012418752536177635,
-0.01675749197602272,
-0.04044617712497711,
0.11571679264307022,
0.09525801986455917,
0.001787399291060865,
-0.05367812514305115,
-0.028083493933081627,
-0.036142196506261826,
-0.014594447799026966,
-0.04065786674618721,
0.025067923590540886,
0.014640436507761478,
-0.023067597299814224,
-0.035142682492733,
0.062265899032354355,
-0.0022960465867072344,
-0.24155636131763458,
0.06154067814350128,
-0.14415466785430908,
-0.16838455200195312,
-0.025666555389761925,
0.04763016477227211,
-0.01027949620038271,
0.05024024844169617,
-0.023523878306150436,
-0.003706259187310934,
0.08028218150138855,
-0.01994950696825981,
-0.056329939514398575,
-0.12370803207159042,
0.11167098581790924,
-0.057045191526412964,
0.17958815395832062,
-0.01730264723300934,
0.0697442963719368,
0.11812880635261536,
0.04354031756520271,
-0.13954439759254456,
0.046252988278865814,
0.047519657760858536,
-0.11371850222349167,
0.018184734508395195,
0.1434333622455597,
-0.045539986342191696,
0.0880122035741806,
0.04450581967830658,
-0.09450887888669968,
-0.009906685911118984,
-0.04731505736708641,
-0.025782445445656776,
-0.0711614117026329,
-0.012813249602913857,
-0.06856255233287811,
0.17033281922340393,
0.19717423617839813,
-0.024601314216852188,
0.013696021400392056,
-0.09343873709440231,
0.028973234817385674,
0.06781232357025146,
0.038356103003025055,
-0.05003027245402336,
-0.20765437185764313,
0.018465150147676468,
0.049651920795440674,
-0.003219098784029484,
-0.23446591198444366,
-0.07863464951515198,
0.04153510555624962,
-0.03406267240643501,
-0.055379293859004974,
0.09689565002918243,
0.03570907562971115,
0.04812879487872124,
-0.03726647049188614,
-0.14999723434448242,
-0.03816652297973633,
0.1541234254837036,
-0.17911072075366974,
-0.049851831048727036
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-256-finetuned-squad-seed-10
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
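The hyperparameters above map directly onto the Hugging Face `Trainer` API. Below is a minimal sketch of how this few-shot run could be reproduced; the k=256 sampling via `shuffle(seed=10)`, the `output_dir` name, and the 384/128 max-length/stride choices are assumptions rather than confirmed settings of this run, while the Adam betas and epsilon listed in the card match the `TrainingArguments` defaults.
```python
from datasets import load_dataset
from transformers import (AutoModelForQuestionAnswering, AutoTokenizer,
                          Trainer, TrainingArguments, default_data_collator)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForQuestionAnswering.from_pretrained("roberta-base")

# k=256 few-shot split; the exact sampling behind "seed-10" is an assumption.
train_set = load_dataset("squad")["train"].shuffle(seed=10).select(range(256))

MAX_LEN, STRIDE = 384, 128  # assumed sequence settings

def prepare_train_features(examples):
    # Tokenize question/context pairs, keeping character offsets so the
    # answer span can be mapped to token start/end positions.
    tokenized = tokenizer(
        examples["question"], examples["context"],
        truncation="only_second", max_length=MAX_LEN, stride=STRIDE,
        return_overflowing_tokens=True, return_offsets_mapping=True,
        padding="max_length",
    )
    sample_map = tokenized.pop("overflow_to_sample_mapping")
    offsets = tokenized.pop("offset_mapping")
    starts, ends = [], []
    for i, offset in enumerate(offsets):
        answer = examples["answers"][sample_map[i]]
        start_char = answer["answer_start"][0]
        end_char = start_char + len(answer["text"][0])
        seq_ids = tokenized.sequence_ids(i)
        ctx_start = seq_ids.index(1)                         # first context token
        ctx_end = len(seq_ids) - 1 - seq_ids[::-1].index(1)  # last context token
        if not (offset[ctx_start][0] <= start_char and offset[ctx_end][1] >= end_char):
            starts.append(0); ends.append(0)  # answer truncated out: point at <s>
        else:
            tok = ctx_start
            while tok <= ctx_end and offset[tok][0] <= start_char:
                tok += 1
            starts.append(tok - 1)
            tok = ctx_end
            while tok >= ctx_start and offset[tok][1] >= end_char:
                tok -= 1
            ends.append(tok + 1)
    tokenized["start_positions"] = starts
    tokenized["end_positions"] = ends
    return tokenized

features = train_set.map(prepare_train_features, batched=True,
                         remove_columns=train_set.column_names)

args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-256-squad",  # hypothetical name
    learning_rate=3e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    num_train_epochs=10.0,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
)

Trainer(model=model, args=args, train_dataset=features,
        data_collator=default_data_collator, tokenizer=tokenizer).train()
```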
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-256-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-256-finetuned-squad-seed-10
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07857115566730499,
0.0995291993021965,
-0.002512991428375244,
0.07720760256052017,
0.1320650726556778,
0.03357489034533501,
0.10722466558218002,
0.1215781643986702,
-0.12644945085048676,
0.06649031490087509,
0.09044034779071808,
0.08635005354881287,
0.03307117894291878,
0.1304900199174881,
-0.03848272189497948,
-0.23383140563964844,
0.00732323806732893,
-0.016733212396502495,
-0.06658872961997986,
0.10213585942983627,
0.08811930567026138,
-0.09842346608638763,
0.0803515687584877,
-0.003057358553633094,
-0.18505246937274933,
0.028352348133921623,
-0.020460912957787514,
-0.050875891000032425,
0.09898851811885834,
-0.006293712183833122,
0.08269225805997849,
0.0071686445735394955,
0.12031349539756775,
-0.19012843072414398,
0.01347630750387907,
0.07316257059574127,
0.03411687910556793,
0.09459874033927917,
0.019791487604379654,
-0.0020146742463111877,
0.16548694670200348,
-0.13628539443016052,
0.09866844117641449,
0.028147317469120026,
-0.08569592237472534,
-0.16694945096969604,
-0.09536970406770706,
0.013402318581938744,
0.036490537226200104,
0.08638148009777069,
0.00815226137638092,
0.179075226187706,
-0.09246789664030075,
0.0799817368388176,
0.23690500855445862,
-0.27839794754981995,
-0.07872989028692245,
0.0434994138777256,
0.0485604852437973,
0.07862316071987152,
-0.11835338920354843,
-0.016363093629479408,
0.020471371710300446,
0.029496334493160248,
0.09440633654594421,
-0.026043744757771492,
-0.08795607089996338,
-0.00729382922872901,
-0.12050538510084152,
0.0060594407841563225,
0.1111225113272667,
0.044393863528966904,
-0.04999321699142456,
-0.049202580004930496,
-0.058802198618650436,
-0.070428267121315,
-0.036142755299806595,
-0.034876223653554916,
0.04366951435804367,
-0.05807918682694435,
-0.12101195752620697,
-0.03376428037881851,
-0.047204308211803436,
-0.0779697522521019,
-0.020692875608801842,
0.2064579427242279,
0.04945015907287598,
0.032600585371255875,
-0.05075102671980858,
0.08079444617033005,
0.008907876908779144,
-0.12917490303516388,
-0.028459206223487854,
0.002968032844364643,
-0.0810713991522789,
-0.03887408226728439,
-0.056749649345874786,
0.015061602927744389,
0.04263504222035408,
0.2103932797908783,
-0.052760906517505646,
0.08366618305444717,
0.03285878151655197,
-0.02078249678015709,
-0.021371686831116676,
0.1254025250673294,
-0.020027803257107735,
-0.07882662117481232,
0.02847268246114254,
0.0585787259042263,
0.025935566052794456,
0.0035748265217989683,
-0.05660266801714897,
-0.03206373006105423,
0.08250156044960022,
0.0311068557202816,
-0.0631859302520752,
0.026633245870471,
0.0032245845068246126,
-0.019724177196621895,
0.006240164861083031,
-0.11450307071208954,
0.01742682047188282,
-0.007018020376563072,
-0.07603629678487778,
-0.014039570465683937,
0.00736479414626956,
-0.015049098059535027,
0.010603656992316246,
0.09898164123296738,
-0.08937793225049973,
-0.020679989829659462,
-0.0772453248500824,
-0.07146895676851273,
-0.0010461276397109032,
-0.15171079337596893,
0.01341897901147604,
-0.0726465955376625,
-0.15655399858951569,
-0.03262549266219139,
0.04194488003849983,
-0.07017408311367035,
-0.028677156195044518,
-0.039356209337711334,
-0.07668070495128632,
0.024969760328531265,
0.0038073230534791946,
0.18804293870925903,
-0.05239412561058998,
0.07906144857406616,
0.019960129633545876,
0.05157819390296936,
-0.023163899779319763,
0.031441379338502884,
-0.08915600925683975,
0.0035802172496914864,
-0.17035004496574402,
0.0621865950524807,
-0.07664244621992111,
0.017637910321354866,
-0.1288856416940689,
-0.08679507672786713,
-0.027308762073516846,
-0.028194047510623932,
0.08397476375102997,
0.10134061425924301,
-0.14099884033203125,
-0.02659505046904087,
0.11301884800195694,
-0.07686746120452881,
-0.056446392089128494,
0.06560097634792328,
-0.06752610206604004,
0.05142636224627495,
0.058623362332582474,
0.18417462706565857,
0.07497746497392654,
-0.11362799257040024,
-0.02262463979423046,
0.002707377541810274,
0.02960553765296936,
-0.012694412842392921,
0.05043092742562294,
0.011050628498196602,
0.02176740951836109,
0.016821764409542084,
-0.03802664205431938,
0.003368748351931572,
-0.09689841419458389,
-0.06232045963406563,
-0.05761811137199402,
-0.08309493213891983,
-0.0348518043756485,
0.013513821177184582,
0.038225091993808746,
-0.0812424048781395,
-0.08344040811061859,
0.08297792077064514,
0.14142240583896637,
-0.04356420785188675,
0.02017979696393013,
-0.07264392077922821,
0.01866038329899311,
-0.05371963605284691,
-0.03080431930720806,
-0.20491346716880798,
-0.0655796155333519,
0.033894266933202744,
-0.02852640300989151,
0.05743313580751419,
0.004994167014956474,
0.07194827497005463,
0.03737044706940651,
-0.036692287772893906,
0.00650907913222909,
-0.08909883350133896,
-0.006950327195227146,
-0.09515393525362015,
-0.2211424559354782,
-0.03863990306854248,
-0.03072606958448887,
0.12654046714305878,
-0.16224442422389984,
-0.008998263627290726,
-0.035144608467817307,
0.1200094223022461,
0.028774762526154518,
-0.06183139979839325,
-0.0147626381367445,
0.031052427366375923,
0.003151813754811883,
-0.09249773621559143,
0.03230268508195877,
0.016177948564291,
-0.07178280502557755,
-0.058874815702438354,
-0.11664170771837234,
0.00400208355858922,
0.07809653133153915,
0.0631568655371666,
-0.09616223722696304,
0.007313213311135769,
-0.06421902030706406,
-0.03291511908173561,
-0.056615978479385376,
0.04133932292461395,
0.17666979134082794,
0.005398293491452932,
0.10665538907051086,
-0.08037562668323517,
-0.07359519600868225,
0.021748507395386696,
0.00645951833575964,
0.043971966952085495,
0.09216751158237457,
0.11206629127264023,
-0.12665782868862152,
0.06410479545593262,
0.08447727560997009,
-0.06296468526124954,
0.1243441104888916,
-0.03950118646025658,
-0.07947413623332977,
-0.034111566841602325,
-0.01959298551082611,
-0.01571729965507984,
0.1370495706796646,
-0.05265205353498459,
0.023823825642466545,
0.029360594227910042,
0.03874368965625763,
0.02001481130719185,
-0.15383797883987427,
-0.002169556450098753,
0.007877438329160213,
-0.04275032505393028,
-0.018441475927829742,
0.022173121571540833,
0.018606694415211678,
0.09762069582939148,
0.032712485641241074,
-0.01752951182425022,
-0.00741634052246809,
-0.003940120805054903,
-0.05148947611451149,
0.19153697788715363,
-0.09062764048576355,
-0.04034334793686867,
-0.07537547498941422,
0.000588557100854814,
-0.038032542914152145,
-0.04296153411269188,
0.025940382853150368,
-0.08819644898176193,
-0.03818833455443382,
-0.0737476795911789,
-0.0008345829555764794,
-0.046486373990774155,
0.025899237021803856,
0.030297037214040756,
0.003454966004937887,
0.06082191318273544,
-0.13409541547298431,
0.004859926179051399,
-0.07373741269111633,
-0.10402270406484604,
0.016193171963095665,
0.06484408676624298,
0.09153218567371368,
0.05976584181189537,
-0.03021438978612423,
0.021518394351005554,
-0.03171154484152794,
0.25217553973197937,
-0.05550939217209816,
0.00030736031476408243,
0.10789626836776733,
0.024637071415781975,
0.04573385417461395,
0.0929899737238884,
0.03501603752374649,
-0.10125402361154556,
0.030375901609659195,
0.0853915885090828,
-0.034857917577028275,
-0.23831778764724731,
-0.006193847395479679,
-0.03482861816883087,
-0.11555887013673782,
0.08229847997426987,
0.04983648285269737,
-0.043385330587625504,
0.06555529683828354,
0.01019413210451603,
0.02448471449315548,
-0.05173787474632263,
0.09148934483528137,
0.10003571212291718,
0.07444258779287338,
0.10285157710313797,
-0.04857012256979942,
-0.022101914510130882,
0.0683693140745163,
-0.004694959614425898,
0.29302194714546204,
-0.027157682925462723,
0.06796494126319885,
0.05296490713953972,
0.1397315263748169,
-0.02190747670829296,
0.03804484009742737,
0.006994210183620453,
-0.004698783624917269,
-0.025805912911891937,
-0.05785024166107178,
-0.0269645806401968,
-0.0016783615574240685,
-0.07880723476409912,
0.05556359514594078,
-0.06102626398205757,
0.06448496878147125,
0.019653569906949997,
0.26075279712677,
-0.001013750210404396,
-0.2811518907546997,
-0.07783323526382446,
-0.02152146026492119,
-0.03920277953147888,
-0.043597180396318436,
0.013228489086031914,
0.0984041765332222,
-0.1030513346195221,
0.05263359472155571,
-0.058262087404727936,
0.08112183213233948,
-0.027492567896842957,
-0.0036503123119473457,
0.031504422426223755,
0.18616227805614471,
-0.015383629128336906,
0.050104036927223206,
-0.20034079253673553,
0.2151772379875183,
0.01854247972369194,
0.13328105211257935,
-0.051553890109062195,
0.007518360391259193,
0.024259870871901512,
-0.0014919622335582972,
0.07531587034463882,
-0.005801195744425058,
-0.07687510550022125,
-0.12579315900802612,
-0.07294704765081406,
0.08112574368715286,
0.14118492603302002,
-0.013925982639193535,
0.1029033362865448,
-0.048368941992521286,
0.01831216923892498,
0.041268281638622284,
-0.06912332773208618,
-0.15809620916843414,
-0.09618105739355087,
-0.018351076170802116,
0.03747459873557091,
-0.09647826105356216,
-0.045917440205812454,
-0.07560741901397705,
-0.013252835720777512,
0.11258021742105484,
0.026672115549445152,
-0.019370343536138535,
-0.1366555243730545,
0.0887279212474823,
0.15004314482212067,
-0.07333199679851532,
0.02403515949845314,
-0.007253821473568678,
0.06460419297218323,
0.04350707679986954,
-0.09535286575555801,
0.047750022262334824,
-0.05769865959882736,
-0.15948592126369476,
-0.045655928552150726,
0.09050819277763367,
0.07252474129199982,
0.03987604007124901,
-0.004695604555308819,
0.04985640570521355,
-0.020867548882961273,
-0.10013808310031891,
0.011553244665265083,
0.03513545170426369,
0.04918818920850754,
0.03733106330037117,
-0.08408188819885254,
0.059358298778533936,
-0.032966192811727524,
-0.0031203278340399265,
0.11341802030801773,
0.23963336646556854,
-0.0895833969116211,
0.0845586284995079,
0.0581236369907856,
-0.06777788698673248,
-0.14416538178920746,
0.06503661721944809,
0.10356981307268143,
0.00026770669501274824,
0.05660867318511009,
-0.19528500735759735,
0.1431216448545456,
0.11345517635345459,
-0.011694665998220444,
0.03672868758440018,
-0.27460381388664246,
-0.11803906410932541,
0.05984973907470703,
0.13248880207538605,
0.11908333748579025,
-0.1319638192653656,
-0.013098087161779404,
-0.018512215465307236,
-0.12326588481664658,
0.1085963174700737,
-0.11581834405660629,
0.13408192992210388,
-0.0335947722196579,
0.10956189036369324,
0.004639487247914076,
-0.030109236016869545,
0.10577038675546646,
0.05142433941364288,
0.09815900772809982,
-0.04174520820379257,
0.011909362860023975,
0.061464615166187286,
-0.0475037656724453,
0.014665180817246437,
-0.07340759038925171,
0.08391384035348892,
-0.11971389502286911,
-0.006602378562092781,
-0.07885142415761948,
0.05070088058710098,
-0.0361286997795105,
-0.052448734641075134,
-0.05320320278406143,
0.03650086745619774,
0.05497503653168678,
-0.03707059100270271,
0.05290243774652481,
0.0007926475373096764,
0.09085476398468018,
0.020420659333467484,
0.06906701624393463,
-0.0002924890723079443,
-0.046120163053274155,
0.020965613424777985,
-0.008356409147381783,
0.06002090126276016,
-0.14008474349975586,
0.004803364630788565,
0.10700321942567825,
0.05249477177858353,
0.09564359486103058,
0.04317006468772888,
-0.04722989723086357,
0.011910505592823029,
0.03699066489934921,
-0.1125960424542427,
-0.10301288962364197,
0.04815465584397316,
-0.04050423577427864,
-0.13881412148475647,
0.0487828254699707,
0.11327957361936569,
-0.04955689236521721,
-0.022841060534119606,
-0.01929314434528351,
0.005228416528552771,
-0.0215643011033535,
0.18654665350914001,
0.04370186850428581,
0.04052860662341118,
-0.10281805694103241,
0.13109280169010162,
0.029081838205456734,
-0.02237502485513687,
0.058033935725688934,
0.08637121319770813,
-0.09630367159843445,
0.0022742890287190676,
0.09498190134763718,
0.1751888394355774,
-0.070380300283432,
-0.015090999193489552,
-0.10450322180986404,
-0.07226244360208511,
0.061831723898649216,
0.16020211577415466,
0.056784093379974365,
-0.019125016406178474,
-0.05109403282403946,
0.0410185270011425,
-0.1414838433265686,
0.062399741262197495,
0.03176209703087807,
0.07034514099359512,
-0.08706200122833252,
0.05837366729974747,
0.007772509474307299,
0.0057004583068192005,
-0.017061999067664146,
0.015340304002165794,
-0.09244368970394135,
-0.02971608377993107,
-0.07631068676710129,
0.009115341119468212,
-0.01388424914330244,
0.01580769568681717,
-0.010579701513051987,
-0.06750016659498215,
-0.06980443000793457,
0.03619217500090599,
-0.07715613394975662,
-0.05280027538537979,
0.011954146437346935,
0.042219918221235275,
-0.1328398585319519,
0.005853477865457535,
0.01492292620241642,
-0.0882803425192833,
0.08537091314792633,
0.08824890851974487,
0.026697445660829544,
0.03543923422694206,
-0.1290377825498581,
-0.032451946288347244,
0.014278982765972614,
0.00308838183991611,
0.06547222286462784,
-0.09364030510187149,
-0.003624900011345744,
-0.020962359383702278,
0.07832169532775879,
0.009473978541791439,
0.07946054637432098,
-0.13035894930362701,
0.009009555913507938,
-0.08483925461769104,
-0.04312526434659958,
-0.06627975404262543,
0.0165487639605999,
0.10109781473875046,
0.05215216800570488,
0.16413068771362305,
-0.07634526491165161,
0.018029849976301193,
-0.2086159735918045,
-0.028171846643090248,
-0.0055110035464167595,
-0.05476381629705429,
-0.13736148178577423,
-0.04100094735622406,
0.07793467491865158,
-0.03863581642508507,
0.10043196380138397,
-0.019724350422620773,
0.06366199254989624,
0.03860565274953842,
-0.02901754528284073,
-0.06285784393548965,
-0.027800260111689568,
0.19770926237106323,
0.07773570716381073,
-0.016191888600587845,
0.10708453506231308,
-0.0047667017206549644,
0.05194834619760513,
0.02997933328151703,
0.20379605889320374,
0.2082032561302185,
0.006192685104906559,
0.07094884663820267,
0.06280338019132614,
-0.08086460828781128,
-0.0667833462357521,
0.1821836233139038,
-0.02780863456428051,
0.07267347723245621,
-0.029315916821360588,
0.18981653451919556,
0.10999523848295212,
-0.15013694763183594,
0.03192107379436493,
-0.0318940170109272,
-0.07771560549736023,
-0.1397451013326645,
-0.00022170724696479738,
-0.09693748503923416,
-0.11866258829832077,
0.04463914781808853,
-0.12000562250614166,
0.05486057698726654,
0.08302223682403564,
0.012054359540343285,
0.03247437998652458,
0.12855076789855957,
-0.026175200939178467,
0.00472252769395709,
0.04237266629934311,
0.007242466323077679,
-0.028647134080529213,
-0.04024842754006386,
-0.0764416754245758,
0.0504741445183754,
0.0059895901940763,
0.0875886008143425,
-0.046613991260528564,
-0.010684490203857422,
0.04182474687695503,
-0.0286557599902153,
-0.0760030522942543,
0.02561814896762371,
0.036052100360393524,
0.05674850940704346,
0.04422531649470329,
0.045008737593889236,
-0.007083727978169918,
-0.033519454300403595,
0.2789667844772339,
-0.05886776000261307,
-0.09689751267433167,
-0.1138029471039772,
0.20503012835979462,
0.040720365941524506,
-0.02914864383637905,
0.039214685559272766,
-0.08318758755922318,
-0.01158177200704813,
0.1562732458114624,
0.1560022234916687,
-0.06343196332454681,
-0.025096677243709564,
-0.012626047246158123,
-0.01728888601064682,
-0.04102502763271332,
0.11623768508434296,
0.09576810151338577,
-0.00012988645175937563,
-0.053449928760528564,
-0.027806557714939117,
-0.035534657537937164,
-0.015099394135177135,
-0.04146614298224449,
0.02373272180557251,
0.01588710956275463,
-0.02305733598768711,
-0.03345830366015434,
0.06333375722169876,
-0.0008813735330477357,
-0.24132928252220154,
0.05962963402271271,
-0.14331643283367157,
-0.1686410754919052,
-0.026066318154335022,
0.04801677539944649,
-0.008775142952799797,
0.05020170286297798,
-0.023872319608926773,
-0.0033487274777144194,
0.0798589214682579,
-0.02001078799366951,
-0.05642447620630264,
-0.1240670457482338,
0.11215180903673172,
-0.059578586369752884,
0.17837552726268768,
-0.017265427857637405,
0.07073259353637695,
0.1178826168179512,
0.04242881387472153,
-0.13874094188213348,
0.04707074910402298,
0.047017812728881836,
-0.11300971359014511,
0.019370393827557564,
0.14288334548473358,
-0.04505617171525955,
0.0845082551240921,
0.0439501591026783,
-0.09482884407043457,
-0.009237382560968399,
-0.04648584499955177,
-0.02611572854220867,
-0.0715450793504715,
-0.011642360128462315,
-0.06854595243930817,
0.17064562439918518,
0.19714803993701935,
-0.02462238259613514,
0.014425000175833702,
-0.09381499141454697,
0.0280157383531332,
0.06773727387189865,
0.038594700396060944,
-0.05058317631483078,
-0.208017036318779,
0.018059832975268364,
0.04854461923241615,
-0.003212078008800745,
-0.23235340416431427,
-0.07779455184936523,
0.039885520935058594,
-0.03515893220901489,
-0.05556362867355347,
0.0959298387169838,
0.03690638393163681,
0.04827573150396347,
-0.03707360476255417,
-0.15088960528373718,
-0.03838245943188667,
0.1547456830739975,
-0.17965735495090485,
-0.04940100386738777
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-256-finetuned-squad-seed-2
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
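Once trained, the checkpoint is a standard extractive QA model and loads with the `question-answering` pipeline. A short usage sketch follows; the question and context strings are made up for illustration:
```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-2",
)
out = qa(
    question="How many training examples were used?",
    context="The model was fine-tuned on a 256-example subset of SQuAD.",
)
print(out["answer"], round(out["score"], 3))  # predicted span plus confidence
```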
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-256-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-256-finetuned-squad-seed-2
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07763452082872391,
0.10131627321243286,
-0.00262382160872221,
0.07735952734947205,
0.1311645656824112,
0.03391896188259125,
0.10704534500837326,
0.12260692566633224,
-0.1272030770778656,
0.06684882193803787,
0.09093352407217026,
0.0852588415145874,
0.032711613923311234,
0.12985217571258545,
-0.03863689675927162,
-0.2340312898159027,
0.007202270440757275,
-0.018133612349629402,
-0.06858200579881668,
0.10218631476163864,
0.08868930488824844,
-0.09728636592626572,
0.08037887513637543,
-0.0027885353192687035,
-0.18539871275424957,
0.028375931084156036,
-0.019638968631625175,
-0.05063241347670555,
0.09872696548700333,
-0.005383909679949284,
0.08285216987133026,
0.008248614147305489,
0.11996427923440933,
-0.19127902388572693,
0.013159165158867836,
0.07237877696752548,
0.03431396186351776,
0.09487965703010559,
0.02038010209798813,
-0.0019782758317887783,
0.16557522118091583,
-0.1353139728307724,
0.09807024896144867,
0.02830614149570465,
-0.08544953912496567,
-0.16748180985450745,
-0.09561719000339508,
0.014352651312947273,
0.03669770434498787,
0.08554313331842422,
0.008924520574510098,
0.17809706926345825,
-0.09218002110719681,
0.07960614562034607,
0.23600271344184875,
-0.27908793091773987,
-0.07899387925863266,
0.042012955993413925,
0.04740038514137268,
0.0790691152215004,
-0.11747010797262192,
-0.015907704830169678,
0.020247215405106544,
0.029567541554570198,
0.09328994154930115,
-0.026098402217030525,
-0.08793734014034271,
-0.007452081888914108,
-0.12023531645536423,
0.0041311862878501415,
0.11148487031459808,
0.043915875256061554,
-0.050995901226997375,
-0.04790712893009186,
-0.05919985845685005,
-0.06829923391342163,
-0.03484594076871872,
-0.03542506694793701,
0.04360958933830261,
-0.05717494338750839,
-0.12082114815711975,
-0.03567393496632576,
-0.04859023541212082,
-0.07846899330615997,
-0.020953970029950142,
0.20781727135181427,
0.04953451082110405,
0.0339212492108345,
-0.05048780143260956,
0.08209861814975739,
0.011090440675616264,
-0.12886224687099457,
-0.029127279296517372,
0.0031584668904542923,
-0.08057413250207901,
-0.03833629935979843,
-0.056254174560308456,
0.01583690568804741,
0.04302925243973732,
0.20986559987068176,
-0.05528905615210533,
0.08315588533878326,
0.03172118589282036,
-0.01998128555715084,
-0.021072272211313248,
0.1249396950006485,
-0.018459603190422058,
-0.0761328786611557,
0.028532614931464195,
0.05818672478199005,
0.02639600820839405,
0.00339300069026649,
-0.056398872286081314,
-0.032485973089933395,
0.08340834826231003,
0.03226618468761444,
-0.06244784593582153,
0.024685900658369064,
0.00188945431727916,
-0.02000344730913639,
0.004798715468496084,
-0.1145344227552414,
0.017418144270777702,
-0.006429366301745176,
-0.0747104063630104,
-0.015233571641147137,
0.008142640814185143,
-0.01495977584272623,
0.011036819778382778,
0.09813929349184036,
-0.08808229118585587,
-0.020089630037546158,
-0.07572738826274872,
-0.06948982924222946,
-0.0009427565964870155,
-0.14884328842163086,
0.012807857245206833,
-0.07344299554824829,
-0.1544175148010254,
-0.03119904361665249,
0.042725928127765656,
-0.07155702263116837,
-0.031030116602778435,
-0.038245730102062225,
-0.07569511979818344,
0.026173608377575874,
0.002994018141180277,
0.18760302662849426,
-0.05245635285973549,
0.08010168373584747,
0.02004770003259182,
0.05120674520730972,
-0.022610455751419067,
0.030537379905581474,
-0.08845602720975876,
0.0045234751887619495,
-0.17041239142417908,
0.06269454956054688,
-0.07581768929958344,
0.01437497790902853,
-0.1300613433122635,
-0.08624684810638428,
-0.02622394450008869,
-0.029173636808991432,
0.08314661681652069,
0.10157332569360733,
-0.13905617594718933,
-0.026532767340540886,
0.11225271224975586,
-0.0785352885723114,
-0.0562467984855175,
0.06678542494773865,
-0.06731942296028137,
0.052859481424093246,
0.05675370991230011,
0.18400344252586365,
0.07538371533155441,
-0.11403826624155045,
-0.020981932058930397,
0.004124464001506567,
0.029312949627637863,
-0.010892483405768871,
0.05200204998254776,
0.010213250294327736,
0.019469808787107468,
0.016554173082113266,
-0.040153663605451584,
0.0032650029752403498,
-0.09750142693519592,
-0.06310286372900009,
-0.05754393711686134,
-0.08250388503074646,
-0.03367017209529877,
0.012152132578194141,
0.03847675770521164,
-0.08047878742218018,
-0.08298803120851517,
0.08204789459705353,
0.1419244259595871,
-0.043057411909103394,
0.02137358859181404,
-0.07301471382379532,
0.018699323758482933,
-0.05428118631243706,
-0.031200913712382317,
-0.20466488599777222,
-0.06642118841409683,
0.03452694043517113,
-0.028622455894947052,
0.05741099640727043,
0.004948028828948736,
0.07120022922754288,
0.037639789283275604,
-0.0370456799864769,
0.006149537395685911,
-0.08980654925107956,
-0.007504240144044161,
-0.09444160759449005,
-0.22187000513076782,
-0.03834352269768715,
-0.03076786920428276,
0.12737080454826355,
-0.1616523712873459,
-0.010030661709606647,
-0.03389260545372963,
0.11999859660863876,
0.029130151495337486,
-0.061601508408784866,
-0.015418441034853458,
0.03014158084988594,
0.002521762391552329,
-0.09170511364936829,
0.03225473314523697,
0.01746201701462269,
-0.07244516164064407,
-0.056628771126270294,
-0.1147584617137909,
0.005207796581089497,
0.07679019868373871,
0.06301701813936234,
-0.0962878167629242,
0.006430336274206638,
-0.06460987776517868,
-0.0336737297475338,
-0.05784790590405464,
0.041577182710170746,
0.17721419036388397,
0.005245919805020094,
0.10687419027090073,
-0.0804959163069725,
-0.07304399460554123,
0.021681219339370728,
0.004618486389517784,
0.04267747327685356,
0.09197728335857391,
0.1118222177028656,
-0.12832054495811462,
0.06358098983764648,
0.08402594178915024,
-0.06279413402080536,
0.12412947416305542,
-0.03950805589556694,
-0.07920617610216141,
-0.035467445850372314,
-0.017255444079637527,
-0.015175867825746536,
0.13665151596069336,
-0.05159486457705498,
0.025706376880407333,
0.029561197385191917,
0.03946604207158089,
0.019872164353728294,
-0.1543971449136734,
-0.002328547416254878,
0.008253630250692368,
-0.04436193406581879,
-0.016582628712058067,
0.02166835032403469,
0.019617237150669098,
0.09803424030542374,
0.03346271812915802,
-0.017089270055294037,
-0.006134229712188244,
-0.00392485037446022,
-0.05263303592801094,
0.19084271788597107,
-0.09048981964588165,
-0.04253675788640976,
-0.07719884067773819,
0.0013238027459010482,
-0.038130879402160645,
-0.04275460168719292,
0.02656910941004753,
-0.08718360215425491,
-0.03758024051785469,
-0.07394775748252869,
0.00024977661087177694,
-0.04738657549023628,
0.026460492983460426,
0.030634749680757523,
0.003744886489585042,
0.06294673681259155,
-0.1334538459777832,
0.00492143165320158,
-0.07315099239349365,
-0.10497675836086273,
0.01719135046005249,
0.06517758965492249,
0.09110576659440994,
0.05869004875421524,
-0.029501620680093765,
0.021353956311941147,
-0.031243227422237396,
0.25239261984825134,
-0.05484010651707649,
-0.0005625178455375135,
0.10820179432630539,
0.023232536390423775,
0.046364910900592804,
0.09277375042438507,
0.03436867520213127,
-0.1010437086224556,
0.03057900071144104,
0.08509909361600876,
-0.035243645310401917,
-0.23825319111347198,
-0.0064867036417126656,
-0.03419563174247742,
-0.11454727500677109,
0.08229655027389526,
0.05013071373105049,
-0.04657382145524025,
0.06444063037633896,
0.010477579198777676,
0.023299815133213997,
-0.05140675976872444,
0.09156221896409988,
0.10099246352910995,
0.07454193383455276,
0.10238045454025269,
-0.04786955192685127,
-0.022038675844669342,
0.0698913186788559,
-0.003702564164996147,
0.2920796871185303,
-0.027667509391903877,
0.06729652732610703,
0.0532359704375267,
0.1410570591688156,
-0.0220661461353302,
0.03623754903674126,
0.007288065738976002,
-0.003803573315963149,
-0.026048006489872932,
-0.05784285441040993,
-0.028107529506087303,
-0.0006255954504013062,
-0.0775592252612114,
0.05536169558763504,
-0.06193749979138374,
0.066841721534729,
0.019361495971679688,
0.26262709498405457,
-0.0008370282594114542,
-0.2791512608528137,
-0.07777796685695648,
-0.02085677720606327,
-0.03908611461520195,
-0.04342971369624138,
0.013239394873380661,
0.10048985481262207,
-0.10402751713991165,
0.05144635587930679,
-0.058043576776981354,
0.08032338321208954,
-0.028252117335796356,
-0.00455103674903512,
0.0308523066341877,
0.18418455123901367,
-0.014355838298797607,
0.05069946497678757,
-0.19837063550949097,
0.215473935008049,
0.01836223155260086,
0.13178597390651703,
-0.05020689591765404,
0.008098466321825981,
0.023507598787546158,
-0.001484308042563498,
0.07595875859260559,
-0.004892537370324135,
-0.07766976207494736,
-0.12600591778755188,
-0.07433158904314041,
0.08088740706443787,
0.1413889229297638,
-0.015381881967186928,
0.10257167369127274,
-0.049049053341150284,
0.019045906141400337,
0.040663763880729675,
-0.06784043461084366,
-0.15744008123874664,
-0.09578553587198257,
-0.017810216173529625,
0.036208558827638626,
-0.09678009152412415,
-0.0470929853618145,
-0.07568006962537766,
-0.009220664389431477,
0.11478176712989807,
0.025418326258659363,
-0.019635967910289764,
-0.13641777634620667,
0.08862186968326569,
0.14949552714824677,
-0.07420787960290909,
0.022847428917884827,
-0.007525166962295771,
0.06515111774206161,
0.0433969721198082,
-0.09559629112482071,
0.04764080047607422,
-0.05752193555235863,
-0.16096505522727966,
-0.04581074044108391,
0.09118271619081497,
0.07216646522283554,
0.04015858471393585,
-0.0046619391068816185,
0.04933112487196922,
-0.019597776234149933,
-0.09986119717359543,
0.012898539192974567,
0.035387031733989716,
0.04910873994231224,
0.0370943583548069,
-0.08371423184871674,
0.06191646307706833,
-0.03270838409662247,
-0.004493026062846184,
0.11493942141532898,
0.24297943711280823,
-0.09000682085752487,
0.08656849712133408,
0.05810121074318886,
-0.06837195158004761,
-0.14447849988937378,
0.06273411214351654,
0.10523257404565811,
-0.0005053016357123852,
0.05782735347747803,
-0.19607645273208618,
0.1421061009168625,
0.11322334408760071,
-0.012912408448755741,
0.0357995368540287,
-0.2755998969078064,
-0.1180969625711441,
0.05874814838171005,
0.13212648034095764,
0.12064164131879807,
-0.1314307153224945,
-0.014061611145734787,
-0.01694273203611374,
-0.1232391968369484,
0.10783781856298447,
-0.11389650404453278,
0.13410457968711853,
-0.033585354685783386,
0.11020335555076599,
0.004899301566183567,
-0.029442835599184036,
0.10763842612504959,
0.049684684723615646,
0.09656451642513275,
-0.04124557599425316,
0.012466197833418846,
0.05946363881230354,
-0.04834766685962677,
0.014021876268088818,
-0.07313219457864761,
0.08459161967039108,
-0.12081349641084671,
-0.006956850178539753,
-0.07879441976547241,
0.05052397772669792,
-0.03699062019586563,
-0.052430346608161926,
-0.053111482411623,
0.03632063791155815,
0.05529845878481865,
-0.036961380392313004,
0.05264510586857796,
0.0013365221675485373,
0.08964820951223373,
0.02520955726504326,
0.06794679164886475,
0.0015705586411058903,
-0.04716996103525162,
0.019476016983389854,
-0.008570127189159393,
0.060389116406440735,
-0.13914822041988373,
0.006166979670524597,
0.10628101229667664,
0.05188479274511337,
0.09596052020788193,
0.04285140335559845,
-0.04795383661985397,
0.012965267524123192,
0.036915142089128494,
-0.113885797560215,
-0.10367521643638611,
0.047400329262018204,
-0.03950170800089836,
-0.13916060328483582,
0.04656374827027321,
0.1158246174454689,
-0.04894295707345009,
-0.02278944104909897,
-0.01902664825320244,
0.006298219785094261,
-0.021837947890162468,
0.18524706363677979,
0.04244229942560196,
0.04115898534655571,
-0.10164476186037064,
0.131278395652771,
0.02916533686220646,
-0.021877333521842957,
0.058141715824604034,
0.08562988042831421,
-0.09533807635307312,
0.002713143825531006,
0.09457681328058243,
0.1739223599433899,
-0.07205914705991745,
-0.014822614379227161,
-0.1042875200510025,
-0.07246241718530655,
0.061279457062482834,
0.15849724411964417,
0.057049620896577835,
-0.018164383247494698,
-0.050961319357156754,
0.04073427990078926,
-0.1404801607131958,
0.06310944259166718,
0.03252185136079788,
0.07025611400604248,
-0.08800925314426422,
0.057970643043518066,
0.007388454396277666,
0.007218943443149328,
-0.01728816144168377,
0.014217043295502663,
-0.09236522018909454,
-0.029633846133947372,
-0.07783140987157822,
0.01125736441463232,
-0.012684048153460026,
0.015762513503432274,
-0.009825113229453564,
-0.06806037575006485,
-0.0697348341345787,
0.037032581865787506,
-0.07706792652606964,
-0.05282876268029213,
0.011223534122109413,
0.04284550994634628,
-0.13307619094848633,
0.00514640100300312,
0.016146866604685783,
-0.08930865675210953,
0.08669041842222214,
0.08907578885555267,
0.02668733149766922,
0.034887686371803284,
-0.12700334191322327,
-0.032678619027137756,
0.014582882635295391,
0.0030526428017765284,
0.06485900282859802,
-0.09549403190612793,
-0.004275203682482243,
-0.02078910358250141,
0.07740544527769089,
0.009666641242802143,
0.08267930895090103,
-0.13118606805801392,
0.009184535592794418,
-0.08508671075105667,
-0.044867634773254395,
-0.06597530841827393,
0.015991756692528725,
0.10053064674139023,
0.05357666686177254,
0.16413092613220215,
-0.07757723331451416,
0.0183730348944664,
-0.20831362903118134,
-0.027900708839297295,
-0.005360706243664026,
-0.05266194045543671,
-0.1378163993358612,
-0.040736615657806396,
0.07683146744966507,
-0.03856243938207626,
0.09881911426782608,
-0.02037920616567135,
0.06232428923249245,
0.038924627006053925,
-0.031920675188302994,
-0.06138106435537338,
-0.027984468266367912,
0.19609664380550385,
0.0778299942612648,
-0.01587076298892498,
0.1074431985616684,
-0.006435849703848362,
0.05249602720141411,
0.029962208122015,
0.20516768097877502,
0.2082124948501587,
0.0069220298901200294,
0.07100413739681244,
0.06274782121181488,
-0.08041351288557053,
-0.06896402686834335,
0.1806400716304779,
-0.027407493442296982,
0.07286003232002258,
-0.028332507237792015,
0.18785372376441956,
0.11109022796154022,
-0.15089881420135498,
0.030702224001288414,
-0.03193676844239235,
-0.07778274267911911,
-0.14101697504520416,
0.0005747857503592968,
-0.09765848517417908,
-0.11889077723026276,
0.04450086131691933,
-0.1199798732995987,
0.0553300678730011,
0.08197461813688278,
0.01140151359140873,
0.03331703320145607,
0.12618914246559143,
-0.025429124012589455,
0.004471505526453257,
0.041705772280693054,
0.007573411799967289,
-0.027911003679037094,
-0.03865095227956772,
-0.0770648792386055,
0.05030889809131622,
0.00786665640771389,
0.08733706176280975,
-0.045338574796915054,
-0.009467074647545815,
0.04135502129793167,
-0.029506610706448555,
-0.0763331726193428,
0.025189630687236786,
0.03590871021151543,
0.05693957582116127,
0.04330293834209442,
0.045272815972566605,
-0.006720621604472399,
-0.03309809789061546,
0.28033149242401123,
-0.05862616002559662,
-0.09608869254589081,
-0.11408165842294693,
0.20720413327217102,
0.039109159260988235,
-0.02825601026415825,
0.040349092334508896,
-0.08301892131567001,
-0.013349991291761398,
0.15448948740959167,
0.1553475707769394,
-0.06501545757055283,
-0.024910438805818558,
-0.012806575745344162,
-0.017000248655676842,
-0.04007235914468765,
0.11713593453168869,
0.09495551884174347,
0.001789228874258697,
-0.05418949946761131,
-0.028381843119859695,
-0.03681323304772377,
-0.014440136030316353,
-0.041674572974443436,
0.02486429549753666,
0.01463642530143261,
-0.021743375808000565,
-0.034963298588991165,
0.06235259398818016,
-0.0009161303169094026,
-0.24069681763648987,
0.060261379927396774,
-0.1427571326494217,
-0.16911357641220093,
-0.025071581825613976,
0.048697054386138916,
-0.010667615570127964,
0.050286032259464264,
-0.024720191955566406,
-0.003785064211115241,
0.08038586378097534,
-0.020129753276705742,
-0.056831348687410355,
-0.12298955768346786,
0.11287573724985123,
-0.058781832456588745,
0.17966873943805695,
-0.016390003263950348,
0.07132772356271744,
0.11733588576316833,
0.04294774681329727,
-0.13972902297973633,
0.04584416747093201,
0.047243472188711166,
-0.11323995888233185,
0.018531113862991333,
0.1438630074262619,
-0.04533648490905762,
0.0865369662642479,
0.045461639761924744,
-0.09439826011657715,
-0.010981541126966476,
-0.04624298959970474,
-0.02609952725470066,
-0.07092337310314178,
-0.013434569351375103,
-0.06975290179252625,
0.16964425146579742,
0.19647540152072906,
-0.02489010989665985,
0.014756796881556511,
-0.09329258650541306,
0.028865577653050423,
0.06772947311401367,
0.0402328297495842,
-0.049376752227544785,
-0.20810389518737793,
0.017797265201807022,
0.049460481852293015,
-0.002724174177274108,
-0.23362977802753448,
-0.0796213373541832,
0.040552038699388504,
-0.03403262421488762,
-0.05537790432572365,
0.09740979969501495,
0.03532468527555466,
0.047683682292699814,
-0.03675813227891922,
-0.15189482271671295,
-0.038932688534259796,
0.1540987193584442,
-0.17908763885498047,
-0.049549564719200134
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-256-finetuned-squad-seed-4
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
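With `lr_scheduler_type: linear` and `warmup_ratio: 0.1`, the learning rate ramps up over the first 10% of optimizer steps and then decays linearly to zero. A small sketch of the rule that `transformers.get_linear_schedule_with_warmup` applies; the total step count below is illustrative, not taken from this run:
```python
def linear_warmup_lr(step, total_steps, base_lr=3e-05, warmup_ratio=0.1):
    """Learning rate at `step` under linear warmup followed by linear decay."""
    warmup_steps = int(warmup_ratio * total_steps)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)           # warmup ramp
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)                       # linear decay

# Example: peak LR is reached at step 110 of a hypothetical 1100-step run.
print(linear_warmup_lr(110, 1100))  # 3e-05
```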
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-256-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-256-finetuned-squad-seed-4
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07799851894378662,
0.10011155158281326,
-0.002582166576758027,
0.07830294221639633,
0.13243073225021362,
0.034323520958423615,
0.10743747651576996,
0.1218603253364563,
-0.12725605070590973,
0.06637881696224213,
0.09100772440433502,
0.08527607470750809,
0.0319521389901638,
0.12948033213615417,
-0.03853053227066994,
-0.23399026691913605,
0.006752045825123787,
-0.017520667985081673,
-0.0680660754442215,
0.10222575813531876,
0.08780426532030106,
-0.09789375960826874,
0.08091052621603012,
-0.0030807447619736195,
-0.1864018440246582,
0.028770316392183304,
-0.020244110375642776,
-0.05036713182926178,
0.09895728528499603,
-0.005295535083860159,
0.0832478478550911,
0.007470903918147087,
0.11983887851238251,
-0.18976932764053345,
0.013441700488328934,
0.07262472063302994,
0.033891018480062485,
0.09447340667247772,
0.019881876185536385,
-0.0023764106445014477,
0.16435259580612183,
-0.1356995850801468,
0.09797044098377228,
0.02821146324276924,
-0.08570771664381027,
-0.16817247867584229,
-0.09508395940065384,
0.01293761283159256,
0.035743001848459244,
0.08649186789989471,
0.008588379248976707,
0.17754553258419037,
-0.09235930442810059,
0.08000578731298447,
0.23444944620132446,
-0.2797504961490631,
-0.07937777042388916,
0.04256241023540497,
0.047451041638851166,
0.07937397807836533,
-0.11796946078538895,
-0.015122023411095142,
0.020488444715738297,
0.03041793219745159,
0.09357341378927231,
-0.026458553969860077,
-0.08805236965417862,
-0.007004170212894678,
-0.12007833272218704,
0.005694457795470953,
0.11208085715770721,
0.04410398006439209,
-0.0506364107131958,
-0.04804873466491699,
-0.05849277973175049,
-0.06957581639289856,
-0.03528785705566406,
-0.034452103078365326,
0.04374207183718681,
-0.05806542932987213,
-0.12121457606554031,
-0.03408181667327881,
-0.048144519329071045,
-0.07743723690509796,
-0.02112806960940361,
0.20693330466747284,
0.04932722821831703,
0.0336538664996624,
-0.05028904229402542,
0.08134376257658005,
0.010650013573467731,
-0.1286628544330597,
-0.028068462386727333,
0.002646060660481453,
-0.08015873283147812,
-0.038276687264442444,
-0.056931231170892715,
0.017041852697730064,
0.04323938488960266,
0.21006745100021362,
-0.05537814274430275,
0.08378628641366959,
0.03233691677451134,
-0.02059190906584263,
-0.02108832076191902,
0.12397568672895432,
-0.019885659217834473,
-0.07766953855752945,
0.028612608090043068,
0.05816052109003067,
0.02577722631394863,
0.0032087985891848803,
-0.05669105798006058,
-0.03165169805288315,
0.08209668099880219,
0.03175463154911995,
-0.06354574114084244,
0.02664562501013279,
0.002835439285263419,
-0.01958448626101017,
0.004630800802260637,
-0.11431204527616501,
0.01712442748248577,
-0.006698674988001585,
-0.0748567208647728,
-0.014310210943222046,
0.0076797292567789555,
-0.015772666782140732,
0.010764684528112411,
0.09865686297416687,
-0.08897508680820465,
-0.02064334787428379,
-0.07635416835546494,
-0.07054038345813751,
-0.0015412474749609828,
-0.148790180683136,
0.013808360323309898,
-0.0731959342956543,
-0.15476040542125702,
-0.03239870071411133,
0.042559221386909485,
-0.07137303054332733,
-0.029669301584362984,
-0.03825845569372177,
-0.07651720196008682,
0.025813139975070953,
0.003169011790305376,
0.18825064599514008,
-0.05283357948064804,
0.07903961837291718,
0.021010149270296097,
0.05120709910988808,
-0.023258309811353683,
0.03068414330482483,
-0.08823707699775696,
0.0038353749550879,
-0.17102432250976562,
0.06195734441280365,
-0.07649676501750946,
0.016256429255008698,
-0.12890422344207764,
-0.08732829988002777,
-0.025302467867732048,
-0.02800997532904148,
0.08280986547470093,
0.1009359359741211,
-0.13888561725616455,
-0.02655823901295662,
0.11151629686355591,
-0.07725891470909119,
-0.05635080859065056,
0.06667938828468323,
-0.06763772666454315,
0.051871467381715775,
0.05717160552740097,
0.18428510427474976,
0.07534770667552948,
-0.11356619745492935,
-0.021751202642917633,
0.003305671503767371,
0.029589641839265823,
-0.011515891179442406,
0.05042409524321556,
0.011187535710632801,
0.019532546401023865,
0.017074793577194214,
-0.038919419050216675,
0.00323889241553843,
-0.09709302335977554,
-0.06261289864778519,
-0.0569571852684021,
-0.08240213245153427,
-0.034380316734313965,
0.012592434883117676,
0.03834546357393265,
-0.08140712976455688,
-0.08334150910377502,
0.08140625059604645,
0.1415822058916092,
-0.04351837560534477,
0.020251698791980743,
-0.07326658815145493,
0.01923774555325508,
-0.05424055829644203,
-0.030712543055415154,
-0.20441275835037231,
-0.06727471947669983,
0.033484481275081635,
-0.026496637612581253,
0.05760518088936806,
0.005224874243140221,
0.07177504152059555,
0.037852998822927475,
-0.03731266409158707,
0.005963706877082586,
-0.08852028846740723,
-0.0070066070184111595,
-0.09425543248653412,
-0.22234012186527252,
-0.0385468527674675,
-0.030579719692468643,
0.12741097807884216,
-0.16220302879810333,
-0.009704498574137688,
-0.03383014351129532,
0.11945170164108276,
0.028757372871041298,
-0.06101977452635765,
-0.014769277535378933,
0.03162737563252449,
0.0033224087674170732,
-0.09170077741146088,
0.03281401842832565,
0.017144715413451195,
-0.07128660380840302,
-0.05797329545021057,
-0.11520165950059891,
0.006549069192260504,
0.07787923514842987,
0.061774738132953644,
-0.09678611159324646,
0.006793313194066286,
-0.06410002708435059,
-0.03349911794066429,
-0.056114792823791504,
0.04156628996133804,
0.17828112840652466,
0.004670122172683477,
0.1067817434668541,
-0.07980020344257355,
-0.07261280715465546,
0.021811919286847115,
0.005708799697458744,
0.04390138015151024,
0.09196639060974121,
0.11113759875297546,
-0.12671135365962982,
0.06386492401361465,
0.08347062766551971,
-0.06326887011528015,
0.12478786706924438,
-0.039621759206056595,
-0.07858321815729141,
-0.03493497148156166,
-0.018440978601574898,
-0.01556699350476265,
0.1373569518327713,
-0.05204739421606064,
0.024665486067533493,
0.02919781766831875,
0.039039209485054016,
0.02016414701938629,
-0.15346987545490265,
-0.002306433627381921,
0.00752777885645628,
-0.04332950338721275,
-0.01731189899146557,
0.02229277789592743,
0.019085213541984558,
0.09769868850708008,
0.03332475945353508,
-0.018194589763879776,
-0.006133678834885359,
-0.0038014953024685383,
-0.0517236702144146,
0.19118666648864746,
-0.09095345437526703,
-0.04132873937487602,
-0.07690749317407608,
-0.0005813681636936963,
-0.03919944539666176,
-0.04319486767053604,
0.026231346651911736,
-0.08842141181230545,
-0.03794512525200844,
-0.0736197754740715,
-0.0004022752109449357,
-0.04730253666639328,
0.026309330016374588,
0.03001578338444233,
0.003191941184923053,
0.062438949942588806,
-0.13327957689762115,
0.005049822852015495,
-0.07345597445964813,
-0.10498537868261337,
0.0176592655479908,
0.06581711769104004,
0.09179544448852539,
0.05834576115012169,
-0.029508953914046288,
0.021417303010821342,
-0.031115006655454636,
0.2532626986503601,
-0.05499355494976044,
-0.0003387394535820931,
0.10798339545726776,
0.024005426093935966,
0.04538753628730774,
0.09285344928503036,
0.03535760939121246,
-0.10146838426589966,
0.03025694563984871,
0.0855979472398758,
-0.03485991805791855,
-0.23815035820007324,
-0.006060242187231779,
-0.03465544804930687,
-0.1147361546754837,
0.08183895796537399,
0.05003364756703377,
-0.046054430305957794,
0.06504595279693604,
0.01141416560858488,
0.024066144600510597,
-0.05219211429357529,
0.09122844785451889,
0.10264992713928223,
0.07407915592193604,
0.10312250256538391,
-0.048212930560112,
-0.022329378873109818,
0.06920722126960754,
-0.003590768203139305,
0.2927989363670349,
-0.027909405529499054,
0.06698773056268692,
0.05371631681919098,
0.14049533009529114,
-0.021479107439517975,
0.03683127835392952,
0.006674715783447027,
-0.004505302291363478,
-0.026031775400042534,
-0.057596445083618164,
-0.027025338262319565,
-0.001638896414078772,
-0.0791013091802597,
0.05511821061372757,
-0.061908308416604996,
0.0654543861746788,
0.019707292318344116,
0.2618490159511566,
-0.001432169578038156,
-0.2812874913215637,
-0.07827134430408478,
-0.021573506295681,
-0.03856739401817322,
-0.0430489256978035,
0.013554001227021217,
0.09988700598478317,
-0.10346804559230804,
0.052085258066654205,
-0.057956498116254807,
0.08020849525928497,
-0.028265908360481262,
-0.003498875303193927,
0.03282751888036728,
0.18578240275382996,
-0.015317688696086407,
0.05018460750579834,
-0.19904598593711853,
0.2144569754600525,
0.018655473366379738,
0.13238529860973358,
-0.05031025409698486,
0.007866945117712021,
0.02434609644114971,
-0.000032171014026971534,
0.07551631331443787,
-0.005485346540808678,
-0.07745079696178436,
-0.12608766555786133,
-0.07317448407411575,
0.08186732977628708,
0.14091657102108002,
-0.014238541945815086,
0.10320372879505157,
-0.04831261560320854,
0.01855984888970852,
0.04055766761302948,
-0.06904866546392441,
-0.15752272307872772,
-0.09634815901517868,
-0.018713295459747314,
0.03771305829286575,
-0.09656031429767609,
-0.04621336609125137,
-0.07597922533750534,
-0.01090418640524149,
0.1139785647392273,
0.027009448036551476,
-0.019678283482789993,
-0.13684609532356262,
0.08773612231016159,
0.14949935674667358,
-0.0733199492096901,
0.023017441853880882,
-0.00715038413181901,
0.06402281671762466,
0.04436922073364258,
-0.0954386442899704,
0.04788980633020401,
-0.05815718322992325,
-0.15962888300418854,
-0.04591451585292816,
0.09018252789974213,
0.0717439129948616,
0.039440177381038666,
-0.005179577507078648,
0.049350589513778687,
-0.02042578160762787,
-0.10047703981399536,
0.013003585860133171,
0.03386309742927551,
0.04989086836576462,
0.03709733486175537,
-0.08467955887317657,
0.06178012862801552,
-0.0320420041680336,
-0.003599872812628746,
0.11350507289171219,
0.2405543327331543,
-0.08957506716251373,
0.08446025848388672,
0.05814872309565544,
-0.06791797280311584,
-0.1441516876220703,
0.06419919431209564,
0.1037420928478241,
-0.00013585647684521973,
0.05633491277694702,
-0.1966618001461029,
0.14315001666545868,
0.11272645741701126,
-0.011875146068632603,
0.037599943578243256,
-0.2731051743030548,
-0.11737246811389923,
0.058977313339710236,
0.13280758261680603,
0.1218208372592926,
-0.13198566436767578,
-0.013347131200134754,
-0.01716669276356697,
-0.12297041714191437,
0.10718371719121933,
-0.1164252907037735,
0.13430465757846832,
-0.033978551626205444,
0.11083894222974777,
0.004185252357274294,
-0.029450125992298126,
0.10674722492694855,
0.05056885629892349,
0.09774886816740036,
-0.04166841879487038,
0.013467174954712391,
0.05943044647574425,
-0.04759368672966957,
0.013677709735929966,
-0.07391276210546494,
0.08407721668481827,
-0.12100663036108017,
-0.006773077417165041,
-0.07933766394853592,
0.05048930644989014,
-0.03680223599076271,
-0.05226665362715721,
-0.052473537623882294,
0.03644656389951706,
0.054239530116319656,
-0.03722254931926727,
0.05149118974804878,
0.00040751867345534265,
0.09015525877475739,
0.023869037628173828,
0.06857778877019882,
0.000012606166819750797,
-0.04756445065140724,
0.020780419930815697,
-0.009079056791961193,
0.0603911429643631,
-0.13883599638938904,
0.00487684179097414,
0.10686548054218292,
0.051173679530620575,
0.09552380442619324,
0.04362351447343826,
-0.047008316963911057,
0.012334423139691353,
0.037719592452049255,
-0.11417271196842194,
-0.10260903090238571,
0.0476728230714798,
-0.04263342544436455,
-0.1385163813829422,
0.04789385944604874,
0.11562340706586838,
-0.048743654042482376,
-0.022461840882897377,
-0.018997475504875183,
0.005567850545048714,
-0.02186429128050804,
0.18577784299850464,
0.0429970845580101,
0.040618132799863815,
-0.102627232670784,
0.1308954358100891,
0.029007326811552048,
-0.022284887731075287,
0.058100540190935135,
0.08670847117900848,
-0.09621798992156982,
0.0018003262812271714,
0.09385278075933456,
0.1757415235042572,
-0.07157200574874878,
-0.015499116852879524,
-0.10537712275981903,
-0.07282207161188126,
0.06114831194281578,
0.15814484655857086,
0.05687318742275238,
-0.019380517303943634,
-0.05117398872971535,
0.04027082026004791,
-0.14121344685554504,
0.062723807990551,
0.03140304237604141,
0.07069994509220123,
-0.08799407631158829,
0.057259365916252136,
0.007251985836774111,
0.006480289623141289,
-0.017258156090974808,
0.014818659983575344,
-0.09298529475927353,
-0.02971099130809307,
-0.07778653502464294,
0.010331816971302032,
-0.012644157744944096,
0.016338126733899117,
-0.010121689178049564,
-0.0670907124876976,
-0.07035940140485764,
0.03711789473891258,
-0.07720271497964859,
-0.05243483558297157,
0.012955568730831146,
0.04330655559897423,
-0.13237610459327698,
0.005453029647469521,
0.015001729130744934,
-0.08887997269630432,
0.08637936413288116,
0.08888290822505951,
0.02638450264930725,
0.03510396182537079,
-0.12623043358325958,
-0.03318706527352333,
0.014023132622241974,
0.002430941676720977,
0.06566459685564041,
-0.09438002109527588,
-0.0037585098762065172,
-0.021067913621664047,
0.07793377339839935,
0.009860904887318611,
0.08182282745838165,
-0.13052044808864594,
0.009821772575378418,
-0.08428330719470978,
-0.04342839866876602,
-0.06659665703773499,
0.016313424333930016,
0.1013321503996849,
0.05316418036818504,
0.16408506035804749,
-0.07754141837358475,
0.01776871085166931,
-0.20831115543842316,
-0.02826438844203949,
-0.0055008539929986,
-0.05417421832680702,
-0.13732333481311798,
-0.04082278162240982,
0.07746313512325287,
-0.03922443091869354,
0.1005273088812828,
-0.020181354135274887,
0.062456972897052765,
0.0384640172123909,
-0.03158573433756828,
-0.06243107467889786,
-0.027684679254889488,
0.19563817977905273,
0.07726721465587616,
-0.016410144045948982,
0.10723886638879776,
-0.005125104449689388,
0.051793988794088364,
0.0311985332518816,
0.20362085103988647,
0.20763951539993286,
0.0077979727648198605,
0.07099631428718567,
0.06324449181556702,
-0.08084625005722046,
-0.0675334632396698,
0.181590735912323,
-0.02703113853931427,
0.07340435683727264,
-0.02934085763990879,
0.18664276599884033,
0.11016160249710083,
-0.14984358847141266,
0.031000645831227303,
-0.03286316618323326,
-0.07784510403871536,
-0.13995927572250366,
0.0018650422571226954,
-0.09709405153989792,
-0.11947315186262131,
0.04451964423060417,
-0.12041649222373962,
0.0545012392103672,
0.08312332630157471,
0.011918371543288231,
0.03281576186418533,
0.12770716845989227,
-0.02464352920651436,
0.005262316204607487,
0.04191502556204796,
0.007190992124378681,
-0.028328148648142815,
-0.03866128250956535,
-0.076602503657341,
0.04967273399233818,
0.006386360619217157,
0.08663994073867798,
-0.04585675150156021,
-0.009961193427443504,
0.04193371906876564,
-0.029147207736968994,
-0.07595831155776978,
0.0250401608645916,
0.03652302920818329,
0.05655933544039726,
0.04426758736371994,
0.04486677795648575,
-0.007588587701320648,
-0.03317451477050781,
0.2794242799282074,
-0.05862506851553917,
-0.0951036661863327,
-0.11441953480243683,
0.2068721204996109,
0.038820281624794006,
-0.028779756277799606,
0.039155542850494385,
-0.08216186612844467,
-0.012318000197410583,
0.155678391456604,
0.1572461873292923,
-0.06537596881389618,
-0.025168946012854576,
-0.012439525686204433,
-0.01701868511736393,
-0.040285468101501465,
0.1175813153386116,
0.09534592181444168,
0.0009377124952152371,
-0.05412089079618454,
-0.028257960453629494,
-0.03627894073724747,
-0.014181581325829029,
-0.04177144169807434,
0.02426527440547943,
0.015693124383687973,
-0.02216929942369461,
-0.034200169146060944,
0.062312062829732895,
-0.0015640902565792203,
-0.2406623363494873,
0.06091172993183136,
-0.142507866024971,
-0.1693558394908905,
-0.026129957288503647,
0.04826170951128006,
-0.009913568384945393,
0.05080800876021385,
-0.024780571460723877,
-0.0039966413751244545,
0.0809566006064415,
-0.0205573458224535,
-0.05586385354399681,
-0.12378431111574173,
0.11268088221549988,
-0.05921437591314316,
0.17859290540218353,
-0.01690680719912052,
0.07098536938428879,
0.11797428131103516,
0.042633045464754105,
-0.13940629363059998,
0.046441130340099335,
0.0465814471244812,
-0.11384280771017075,
0.018979834392666817,
0.14290551841259003,
-0.04521559178829193,
0.0858636349439621,
0.04437074065208435,
-0.09358254075050354,
-0.01010148786008358,
-0.04618123918771744,
-0.026051118969917297,
-0.07088097184896469,
-0.013476723805069923,
-0.06956391781568527,
0.17004644870758057,
0.1978384554386139,
-0.02468613162636757,
0.014694761484861374,
-0.09326376765966415,
0.02876228466629982,
0.06820397078990936,
0.03885526582598686,
-0.050181273370981216,
-0.20793978869915009,
0.018149036914110184,
0.0503927506506443,
-0.003157437779009342,
-0.23447996377944946,
-0.07846594601869583,
0.040395066142082214,
-0.03384670242667198,
-0.05557958781719208,
0.0971875712275505,
0.03631213307380676,
0.048603605479002,
-0.03721830993890762,
-0.14983168244361877,
-0.03890904411673546,
0.153950035572052,
-0.1791551411151886,
-0.050196655094623566
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-256-finetuned-squad-seed-6
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
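Since the card gives no usage instructions, here is a minimal inference sketch for this checkpoint. It assumes the standard `transformers` question-answering pipeline; the question/context pair is illustrative, not from the original card.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint directly from the Hub.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-6",
)

# Illustrative inputs; any SQuAD-style question/context pair works.
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This checkpoint is a roberta-base model fine-tuned on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```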
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-256-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-256-finetuned-squad-seed-6
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07794936001300812,
0.09976553171873093,
-0.002547850599512458,
0.07803469151258469,
0.1326141208410263,
0.0338125117123127,
0.1072097048163414,
0.12209510803222656,
-0.12671734392642975,
0.0667562410235405,
0.09053675830364227,
0.08585679531097412,
0.0324351042509079,
0.12925034761428833,
-0.03840840235352516,
-0.2335662543773651,
0.007274987176060677,
-0.017316026613116264,
-0.06767795979976654,
0.10210557281970978,
0.0878438651561737,
-0.09813012927770615,
0.08026673644781113,
-0.003435867838561535,
-0.1865406036376953,
0.029047993943095207,
-0.020511476323008537,
-0.05013324320316315,
0.09875036031007767,
-0.005734311416745186,
0.08284489810466766,
0.007688915356993675,
0.11963530629873276,
-0.19016408920288086,
0.013401688076555729,
0.07298579066991806,
0.03404629975557327,
0.09463559091091156,
0.020802132785320282,
-0.0019239152316004038,
0.16557739675045013,
-0.13519945740699768,
0.09771210700273514,
0.02862728200852871,
-0.08561892807483673,
-0.16655512154102325,
-0.09560035914182663,
0.012351509183645248,
0.03629429638385773,
0.08708737045526505,
0.008102993480861187,
0.1783384382724762,
-0.09295165538787842,
0.08004265278577805,
0.23598651587963104,
-0.27850398421287537,
-0.07930732518434525,
0.043136559426784515,
0.047350745648145676,
0.0784936249256134,
-0.11876054853200912,
-0.016245806589722633,
0.02059955708682537,
0.030083635821938515,
0.09276456385850906,
-0.025835106149315834,
-0.08916337043046951,
-0.007388478610664606,
-0.12042661011219025,
0.005398558918386698,
0.11105015873908997,
0.04425540566444397,
-0.0502295084297657,
-0.047881316393613815,
-0.0588434673845768,
-0.06897737830877304,
-0.03496820107102394,
-0.034336671233177185,
0.04386604204773903,
-0.05792348459362984,
-0.12077003717422485,
-0.03396758809685707,
-0.04809899628162384,
-0.07835546135902405,
-0.020733991637825966,
0.20654310286045074,
0.04942038282752037,
0.03348929062485695,
-0.05083634704351425,
0.081277996301651,
0.010947276838123798,
-0.1288294941186905,
-0.02892548218369484,
0.0033273056615144014,
-0.0802183672785759,
-0.03838970512151718,
-0.05669356882572174,
0.016017993912100792,
0.042723432183265686,
0.208348348736763,
-0.05512743443250656,
0.08402266353368759,
0.03165937960147858,
-0.020536018535494804,
-0.02171149104833603,
0.12408796697854996,
-0.018934456631541252,
-0.07581563293933868,
0.027943843975663185,
0.058351144194602966,
0.025443589314818382,
0.0036178366281092167,
-0.056129712611436844,
-0.03182140365242958,
0.08271727710962296,
0.031401246786117554,
-0.06323537975549698,
0.026118382811546326,
0.002415923634544015,
-0.019771935418248177,
0.0052833156660199165,
-0.11419245600700378,
0.017032045871019363,
-0.007114851847290993,
-0.0751420110464096,
-0.015109044499695301,
0.007796265184879303,
-0.016016066074371338,
0.010521396063268185,
0.09873548895120621,
-0.08946722745895386,
-0.020605476573109627,
-0.07710077613592148,
-0.07024253904819489,
-0.001190885086543858,
-0.15027980506420135,
0.01346423476934433,
-0.07237912714481354,
-0.1549665629863739,
-0.03242937847971916,
0.042093776166439056,
-0.07138808071613312,
-0.029487719759345055,
-0.038792096078395844,
-0.07698583602905273,
0.02606542967259884,
0.0034988129045814276,
0.18897351622581482,
-0.052485108375549316,
0.07970336824655533,
0.02096834033727646,
0.05107899010181427,
-0.02258303575217724,
0.031255755573511124,
-0.08908261358737946,
0.003706460352987051,
-0.17034274339675903,
0.0621144063770771,
-0.07716376334428787,
0.016101490706205368,
-0.12961764633655548,
-0.08671724051237106,
-0.026491234079003334,
-0.028628552332520485,
0.08350472897291183,
0.10164761543273926,
-0.13884305953979492,
-0.02630334533751011,
0.11187553405761719,
-0.07802385091781616,
-0.05629146471619606,
0.0657728835940361,
-0.06736189126968384,
0.05159248411655426,
0.05668090283870697,
0.18430346250534058,
0.07483319193124771,
-0.11310087144374847,
-0.022892383858561516,
0.002839887049049139,
0.03016825020313263,
-0.01211770810186863,
0.05041803792119026,
0.010996459051966667,
0.02036367729306221,
0.01702866703271866,
-0.03901011869311333,
0.003113086801022291,
-0.0971662700176239,
-0.061991699039936066,
-0.057567257434129715,
-0.08250351995229721,
-0.034707289189100266,
0.013564287684857845,
0.03819773718714714,
-0.08105337619781494,
-0.08279578387737274,
0.0815301164984703,
0.14155761897563934,
-0.04300027713179588,
0.020451711490750313,
-0.0726868212223053,
0.018136322498321533,
-0.05509888753294945,
-0.030775798484683037,
-0.2053523063659668,
-0.06774963438510895,
0.03410141542553902,
-0.026771875098347664,
0.0575016513466835,
0.005045863799750805,
0.0715838149189949,
0.03716456890106201,
-0.03727615624666214,
0.005835407879203558,
-0.0893888920545578,
-0.007604388985782862,
-0.09456861019134521,
-0.22207289934158325,
-0.038761019706726074,
-0.031115107238292694,
0.125573068857193,
-0.16175000369548798,
-0.009883254766464233,
-0.03381733223795891,
0.11982782185077667,
0.02872982993721962,
-0.06154733523726463,
-0.014767080545425415,
0.03097440116107464,
0.002958497731015086,
-0.09203527122735977,
0.03269493579864502,
0.016531741246581078,
-0.07091232389211655,
-0.05788174271583557,
-0.11542201042175293,
0.005051909945905209,
0.07756824791431427,
0.06315626949071884,
-0.09682904928922653,
0.006893878802657127,
-0.06423311680555344,
-0.03387467935681343,
-0.057233769446611404,
0.04222162812948227,
0.17761318385601044,
0.004922271706163883,
0.10636325925588608,
-0.08024050295352936,
-0.07287933677434921,
0.02238723635673523,
0.005795445293188095,
0.04329020902514458,
0.09262006729841232,
0.11263065040111542,
-0.12817804515361786,
0.06450590491294861,
0.08406417071819305,
-0.06262216717004776,
0.12513862550258636,
-0.03978780284523964,
-0.0791097953915596,
-0.03389574587345123,
-0.017835697159171104,
-0.015440446324646473,
0.1369692087173462,
-0.051707420498132706,
0.025017714127898216,
0.029086625203490257,
0.0394291877746582,
0.020199287682771683,
-0.15376952290534973,
-0.0023963560815900564,
0.007368029095232487,
-0.043308623135089874,
-0.017733845859766006,
0.022210977971553802,
0.019026078283786774,
0.09782485663890839,
0.032994795590639114,
-0.01794142834842205,
-0.006531302351504564,
-0.0038210884667932987,
-0.05180098116397858,
0.19189482927322388,
-0.09066139161586761,
-0.040342725813388824,
-0.0760970339179039,
0.00016486515232827514,
-0.038978271186351776,
-0.04319682717323303,
0.02636118233203888,
-0.08939094841480255,
-0.03791818395256996,
-0.07357964664697647,
-0.00041804462671279907,
-0.0474296435713768,
0.025569912046194077,
0.029531758278608322,
0.003438932355493307,
0.0618881918489933,
-0.1339133232831955,
0.005097632296383381,
-0.07385678589344025,
-0.1053539291024208,
0.017348507419228554,
0.06529242545366287,
0.09146702289581299,
0.05844492092728615,
-0.02971719019114971,
0.02129238285124302,
-0.03126673772931099,
0.2522682547569275,
-0.05513015389442444,
-0.0008324691443704069,
0.10813511162996292,
0.0248945914208889,
0.0459878072142601,
0.09292855858802795,
0.03463907167315483,
-0.10154999047517776,
0.03082503378391266,
0.08598265796899796,
-0.035138215869665146,
-0.23913677036762238,
-0.006042200140655041,
-0.03484063968062401,
-0.11503410339355469,
0.08218296617269516,
0.050288207828998566,
-0.046252258121967316,
0.06511587649583817,
0.010481557808816433,
0.023108378052711487,
-0.05233609676361084,
0.09168864041566849,
0.10107836872339249,
0.07472534477710724,
0.10304083675146103,
-0.04818427562713623,
-0.021807797253131866,
0.06853874027729034,
-0.0034485117066651583,
0.2939909100532532,
-0.027907973155379295,
0.06721167266368866,
0.05335652828216553,
0.14067858457565308,
-0.021941984072327614,
0.03743250295519829,
0.006551275961101055,
-0.004542797803878784,
-0.025961147621273994,
-0.057466086000204086,
-0.027249492704868317,
-0.001688309945166111,
-0.0796639695763588,
0.05542939156293869,
-0.06195249408483505,
0.06580910831689835,
0.0188409686088562,
0.26248428225517273,
-0.0012598747853189707,
-0.280504047870636,
-0.07770427316427231,
-0.021773111075162888,
-0.038908038288354874,
-0.04366867244243622,
0.013330871239304543,
0.09963371604681015,
-0.10311296582221985,
0.0515163391828537,
-0.058292318135499954,
0.08087923377752304,
-0.02714601531624794,
-0.0041695330291986465,
0.03185589239001274,
0.1858973503112793,
-0.015135597437620163,
0.05072320997714996,
-0.1995549201965332,
0.21624329686164856,
0.01855219341814518,
0.13225893676280975,
-0.05059404671192169,
0.00789498258382082,
0.02365376427769661,
-0.001945378608070314,
0.07583419978618622,
-0.00539950979873538,
-0.0780300423502922,
-0.1256544291973114,
-0.07264050096273422,
0.08161770552396774,
0.14159606397151947,
-0.01460442878305912,
0.10267717391252518,
-0.04842990264296532,
0.018803421407938004,
0.04127190262079239,
-0.06872326135635376,
-0.15776947140693665,
-0.09644758701324463,
-0.0183255672454834,
0.03710184618830681,
-0.09737496823072433,
-0.04613811522722244,
-0.075862355530262,
-0.009891732595860958,
0.1138487458229065,
0.02704518474638462,
-0.019292855635285378,
-0.1366962194442749,
0.08834585547447205,
0.1497071087360382,
-0.07363594323396683,
0.02298366278409958,
-0.007467408198863268,
0.06393589824438095,
0.04352527856826782,
-0.09565619379281998,
0.048538077622652054,
-0.058151617646217346,
-0.16012413799762726,
-0.045843929052352905,
0.08997668325901031,
0.07211961597204208,
0.03970950469374657,
-0.005316943395882845,
0.04972407594323158,
-0.020386284217238426,
-0.10026707500219345,
0.013257413171231747,
0.03408236429095268,
0.04994695633649826,
0.037389494478702545,
-0.08430132269859314,
0.060858000069856644,
-0.032719049602746964,
-0.004155560862272978,
0.11324380338191986,
0.24160631000995636,
-0.08943910151720047,
0.08471672981977463,
0.05847851559519768,
-0.0682113915681839,
-0.14443369209766388,
0.06457899510860443,
0.10432937741279602,
-0.0005287467502057552,
0.05705222114920616,
-0.19663476943969727,
0.14321762323379517,
0.11271672695875168,
-0.012168439105153084,
0.03677056357264519,
-0.27353736758232117,
-0.11739014834165573,
0.0588906966149807,
0.13264207541942596,
0.12045235931873322,
-0.13165047764778137,
-0.013240478932857513,
-0.016831815242767334,
-0.12245114892721176,
0.10789041966199875,
-0.11482460796833038,
0.1346203237771988,
-0.034310102462768555,
0.11089765280485153,
0.004315542988479137,
-0.029839826747775078,
0.10575170814990997,
0.05076270177960396,
0.09784567356109619,
-0.04129771143198013,
0.013184725306928158,
0.05968521535396576,
-0.04774002358317375,
0.013849682174623013,
-0.07404157519340515,
0.08450479805469513,
-0.12017378956079483,
-0.006432706024497747,
-0.07982248812913895,
0.05079558119177818,
-0.036735933274030685,
-0.052146222442388535,
-0.052539799362421036,
0.03664268180727959,
0.05475442856550217,
-0.03742070868611336,
0.05255885049700737,
0.000778075132984668,
0.09145176410675049,
0.02418501488864422,
0.0688801258802414,
0.0005385145777836442,
-0.04672825336456299,
0.02001224458217621,
-0.008396592922508717,
0.06048581376671791,
-0.13961724936962128,
0.004639995750039816,
0.10671783238649368,
0.05197231099009514,
0.0954279750585556,
0.04406629502773285,
-0.047642532736063004,
0.012370234355330467,
0.03690722957253456,
-0.11358623951673508,
-0.10335268825292587,
0.047839224338531494,
-0.039316147565841675,
-0.13919726014137268,
0.04807602986693382,
0.11430314183235168,
-0.049578942358493805,
-0.022565139457583427,
-0.018922174349427223,
0.0059522418305277824,
-0.021308599039912224,
0.1864260584115982,
0.04295939579606056,
0.04117584601044655,
-0.10241138190031052,
0.13134139776229858,
0.02873811312019825,
-0.02287992648780346,
0.05822078883647919,
0.08640238642692566,
-0.09566840529441833,
0.0019914607983082533,
0.09496353566646576,
0.17450730502605438,
-0.07133762538433075,
-0.014301842078566551,
-0.10457542538642883,
-0.07245376706123352,
0.06146654859185219,
0.15909208357334137,
0.05677233263850212,
-0.019164342433214188,
-0.05097215995192528,
0.04066072404384613,
-0.1416299045085907,
0.06284777075052261,
0.031099170446395874,
0.07085957378149033,
-0.08764726668596268,
0.05603983253240585,
0.0074813952669501305,
0.0067101879976689816,
-0.017042815685272217,
0.015125708654522896,
-0.09264975041151047,
-0.02982085756957531,
-0.07578878849744797,
0.0102412523701787,
-0.012651816941797733,
0.0158094372600317,
-0.010532533749938011,
-0.06735146045684814,
-0.06954292953014374,
0.03723164275288582,
-0.07742972671985626,
-0.05289356783032417,
0.01238833088427782,
0.043021999299526215,
-0.13266931474208832,
0.005609192885458469,
0.015320935286581516,
-0.08863313496112823,
0.08557306975126266,
0.08843645453453064,
0.026711316779255867,
0.03533979877829552,
-0.12794306874275208,
-0.03286302462220192,
0.014309252612292767,
0.0025249584577977657,
0.06539270281791687,
-0.09392372518777847,
-0.004079279489815235,
-0.0211196206510067,
0.07801180332899094,
0.009479472413659096,
0.08109191805124283,
-0.13109764456748962,
0.008956719189882278,
-0.08528363704681396,
-0.04406864568591118,
-0.0660230740904808,
0.016594743356108665,
0.10148068517446518,
0.053320981562137604,
0.16394135355949402,
-0.07726778835058212,
0.01836857758462429,
-0.2084577977657318,
-0.028101569041609764,
-0.0053879874758422375,
-0.05412646010518074,
-0.13769011199474335,
-0.040194205939769745,
0.07761341333389282,
-0.03903799131512642,
0.09833803027868271,
-0.0200689435005188,
0.06295198947191238,
0.03859490156173706,
-0.03120320849120617,
-0.06260283291339874,
-0.027869155630469322,
0.1957666575908661,
0.07714861631393433,
-0.015963828191161156,
0.10870268940925598,
-0.005097614135593176,
0.05169743299484253,
0.03220994025468826,
0.20515196025371552,
0.2084578275680542,
0.006525406613945961,
0.07124265283346176,
0.06318297237157822,
-0.08129395544528961,
-0.06748919188976288,
0.18165355920791626,
-0.026449499651789665,
0.07325482368469238,
-0.029399381950497627,
0.18715257942676544,
0.11033625900745392,
-0.14970487356185913,
0.031403325498104095,
-0.033051393926143646,
-0.07753963768482208,
-0.14000871777534485,
0.0019257699605077505,
-0.09701784700155258,
-0.11940789222717285,
0.044976856559515,
-0.12073848396539688,
0.05499919876456261,
0.08343624323606491,
0.011814870871603489,
0.03293650969862938,
0.12833142280578613,
-0.02406427264213562,
0.005079064983874559,
0.04212227836251259,
0.007120397873222828,
-0.028459329158067703,
-0.038929857313632965,
-0.07647596299648285,
0.050846196711063385,
0.0063995434902608395,
0.08683335781097412,
-0.04582304134964943,
-0.01062504667788744,
0.04161985591053963,
-0.02896539680659771,
-0.07614030689001083,
0.02519977279007435,
0.03664879500865936,
0.056581199169158936,
0.04482075572013855,
0.04471909627318382,
-0.006948661990463734,
-0.03331043943762779,
0.2804725766181946,
-0.058892298489809036,
-0.09461770206689835,
-0.11352406442165375,
0.20730890333652496,
0.03978331387042999,
-0.028774499893188477,
0.039130281656980515,
-0.08280519396066666,
-0.012626112438738346,
0.15433254837989807,
0.15535323321819305,
-0.06421466171741486,
-0.02494325488805771,
-0.012718322686851025,
-0.017004186287522316,
-0.040059227496385574,
0.1174105778336525,
0.09562027454376221,
0.0012837001122534275,
-0.054372455924749374,
-0.028177451342344284,
-0.03628426790237427,
-0.014748483896255493,
-0.04097532853484154,
0.024142717942595482,
0.015921084210276604,
-0.02244274504482746,
-0.0342836007475853,
0.062272846698760986,
-0.0014233059482648969,
-0.2401382029056549,
0.06020908057689667,
-0.1432594656944275,
-0.16891597211360931,
-0.025999052450060844,
0.04800527170300484,
-0.009971209801733494,
0.05082609876990318,
-0.024763595312833786,
-0.0034205911215394735,
0.07933096587657928,
-0.02045273780822754,
-0.05628274381160736,
-0.12448906898498535,
0.11215090751647949,
-0.05961890146136284,
0.17887789011001587,
-0.016976453363895416,
0.07031750679016113,
0.11786755174398422,
0.04288628324866295,
-0.14007630944252014,
0.04623840004205704,
0.04676241800189018,
-0.11402896791696548,
0.018779097124934196,
0.14374089241027832,
-0.04513787478208542,
0.08601706475019455,
0.044024694710969925,
-0.09521331638097763,
-0.009876211173832417,
-0.04678067937493324,
-0.02537612058222294,
-0.07137414813041687,
-0.01203152071684599,
-0.06950382143259048,
0.1701977699995041,
0.19803518056869507,
-0.02464929223060608,
0.014103802852332592,
-0.0935637429356575,
0.028556667268276215,
0.06740210950374603,
0.03968053683638573,
-0.04986898973584175,
-0.2079949676990509,
0.01829865388572216,
0.05025823041796684,
-0.003341470379382372,
-0.23434774577617645,
-0.0781913697719574,
0.04064011201262474,
-0.03480996564030647,
-0.055731531232595444,
0.096546970307827,
0.036183442920446396,
0.04810049757361412,
-0.037246253341436386,
-0.1514635682106018,
-0.038778871297836304,
0.1543208807706833,
-0.17928031086921692,
-0.04982885345816612
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-256-finetuned-squad-seed-8
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
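The hyperparameters above map one-to-one onto `transformers.TrainingArguments`. The sketch below is one way to reproduce that configuration, not the author's actual training script; the output directory is a hypothetical path, and the dataset loading and `Trainer` setup are omitted.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in the card. The Adam betas/epsilon and
# the linear scheduler shown above are already the TrainingArguments defaults,
# but they are spelled out here for clarity.
args = TrainingArguments(
    output_dir="./roberta-base-few-shot-k-256-seed-8",  # hypothetical path
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```

Pairing `warmup_ratio=0.1` with the linear scheduler means the learning rate ramps up over the first 10% of training steps and then decays linearly to zero.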
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-256-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-256-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-256-finetuned-squad-seed-8
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07847407460212708,
0.10049767792224884,
-0.0025630139280110598,
0.07848971337080002,
0.13280680775642395,
0.03429205343127251,
0.10644543915987015,
0.12196626514196396,
-0.1268104761838913,
0.06651689112186432,
0.09063773602247238,
0.08545344322919846,
0.032331615686416626,
0.12867344915866852,
-0.03877321258187294,
-0.23347921669483185,
0.006594144273549318,
-0.0172086451202631,
-0.06640029698610306,
0.10167934745550156,
0.08789556473493576,
-0.09830693900585175,
0.08026155829429626,
-0.0038018205668777227,
-0.1861306130886078,
0.028808625414967537,
-0.020138004794716835,
-0.050210971385240555,
0.09902256727218628,
-0.00514856306836009,
0.08283329755067825,
0.0069067711010575294,
0.11986790597438812,
-0.19066761434078217,
0.013255957514047623,
0.07322720438241959,
0.034040387719869614,
0.09463435411453247,
0.019901039078831673,
-0.0012168304529041052,
0.16510644555091858,
-0.13562433421611786,
0.09775028377771378,
0.028395600616931915,
-0.08539223670959473,
-0.16689984500408173,
-0.09510459005832672,
0.013229217380285263,
0.036108750849962234,
0.0864666998386383,
0.00897032581269741,
0.17884282767772675,
-0.09192803502082825,
0.08062929660081863,
0.23639999330043793,
-0.2779447138309479,
-0.07866157591342926,
0.042593296617269516,
0.04721233993768692,
0.07965458929538727,
-0.1179642304778099,
-0.01611141487956047,
0.02038973942399025,
0.02985263243317604,
0.09320142865180969,
-0.026525424793362617,
-0.09077539294958115,
-0.007454110309481621,
-0.1201775074005127,
0.005248700734227896,
0.11136040836572647,
0.04455746337771416,
-0.050351861864328384,
-0.047362327575683594,
-0.05925983563065529,
-0.06962461769580841,
-0.03542288392782211,
-0.03436395153403282,
0.04353959485888481,
-0.05774426832795143,
-0.11975084990262985,
-0.034249547868967056,
-0.04759090021252632,
-0.07744500786066055,
-0.020356042310595512,
0.20646505057811737,
0.049668196588754654,
0.03332702815532684,
-0.05008837580680847,
0.08103181421756744,
0.009402678348124027,
-0.12869615852832794,
-0.028531255200505257,
0.003787221619859338,
-0.08034905046224594,
-0.03859493136405945,
-0.056737612932920456,
0.01501611527055502,
0.0426432341337204,
0.2093675136566162,
-0.054820578545331955,
0.08388108015060425,
0.03184255212545395,
-0.019956840202212334,
-0.021630359813570976,
0.1248885989189148,
-0.01946215331554413,
-0.07700113952159882,
0.028960561379790306,
0.05807230994105339,
0.026011833921074867,
0.0030017257668077946,
-0.05711796134710312,
-0.03216997906565666,
0.08283129334449768,
0.031156163662672043,
-0.06316185742616653,
0.02565392293035984,
0.0023457841016352177,
-0.019908012822270393,
0.005593206267803907,
-0.1142326295375824,
0.01726928912103176,
-0.007098200730979443,
-0.07500948011875153,
-0.01468206662684679,
0.00838614720851183,
-0.015316583216190338,
0.010453945957124233,
0.09823135286569595,
-0.0886797308921814,
-0.02007581666111946,
-0.07702885568141937,
-0.0704180970788002,
-0.0010172250913456082,
-0.14977748692035675,
0.013959634117782116,
-0.07337193936109543,
-0.15518836677074432,
-0.032318614423274994,
0.0422191247344017,
-0.07088863849639893,
-0.02933361940085888,
-0.03817138075828552,
-0.07612252980470657,
0.02563171088695526,
0.003292290261015296,
0.18786892294883728,
-0.05266940966248512,
0.07930118590593338,
0.020754896104335785,
0.05067102611064911,
-0.023690879344940186,
0.0314788818359375,
-0.08836212754249573,
0.0036223279312253,
-0.17059996724128723,
0.0621730275452137,
-0.07668961584568024,
0.016687309369444847,
-0.12860476970672607,
-0.08655211329460144,
-0.025328291580080986,
-0.028204739093780518,
0.08390694111585617,
0.1007196456193924,
-0.1390698403120041,
-0.026057248935103416,
0.1113055944442749,
-0.07731351256370544,
-0.05640186741948128,
0.06719325482845306,
-0.0674908459186554,
0.05121937021613121,
0.05717480182647705,
0.18411116302013397,
0.07578887045383453,
-0.11284786462783813,
-0.02163492888212204,
0.003970704041421413,
0.030807681381702423,
-0.013023304753005505,
0.050204284489154816,
0.011134369298815727,
0.02051484026014805,
0.017205165699124336,
-0.03821278363466263,
0.0037167652044445276,
-0.09721700847148895,
-0.062325622886419296,
-0.05778869614005089,
-0.08249689638614655,
-0.035490889102220535,
0.01390075497329235,
0.038177479058504105,
-0.08082354813814163,
-0.0830182358622551,
0.08257942646741867,
0.14153191447257996,
-0.04324071481823921,
0.02066086418926716,
-0.07216348499059677,
0.01867966167628765,
-0.053919531404972076,
-0.03078235313296318,
-0.2050308883190155,
-0.06653441488742828,
0.03406968712806702,
-0.02684374712407589,
0.057176101952791214,
0.00519341928884387,
0.0706852450966835,
0.03763185814023018,
-0.03681149706244469,
0.006767757702618837,
-0.08875937759876251,
-0.007236899808049202,
-0.09539365023374557,
-0.22155024111270905,
-0.03873439505696297,
-0.03068581596016884,
0.12601706385612488,
-0.16283009946346283,
-0.009556842036545277,
-0.034685488790273666,
0.11935009807348251,
0.028305619955062866,
-0.06147536635398865,
-0.015002721920609474,
0.03168323263525963,
0.002774636959657073,
-0.09225121885538101,
0.03271765261888504,
0.01711120270192623,
-0.07167859375476837,
-0.058946382254362106,
-0.11537019163370132,
0.005556879565119743,
0.07710739970207214,
0.06172104924917221,
-0.09688092768192291,
0.006379980593919754,
-0.06374742835760117,
-0.033793769776821136,
-0.056602559983730316,
0.041429854929447174,
0.17822317779064178,
0.00512633565813303,
0.10763420909643173,
-0.07998519390821457,
-0.07243708521127701,
0.02216850034892559,
0.005797354504466057,
0.04405539110302925,
0.09238999336957932,
0.11244069039821625,
-0.12635648250579834,
0.06391195952892303,
0.08323375880718231,
-0.06369419395923615,
0.12422583997249603,
-0.03956161439418793,
-0.07875947654247284,
-0.034082431346178055,
-0.01799658127129078,
-0.015378052368760109,
0.13671226799488068,
-0.052955493330955505,
0.02434636279940605,
0.029223240911960602,
0.039022136479616165,
0.020371083170175552,
-0.15381263196468353,
-0.0023711444810032845,
0.008018550463020802,
-0.04290856048464775,
-0.016560401767492294,
0.021492980420589447,
0.01879703253507614,
0.09749703854322433,
0.03281028941273689,
-0.01830482855439186,
-0.0060531096532940865,
-0.003770118113607168,
-0.051963210105895996,
0.19146089255809784,
-0.09078726917505264,
-0.04126840457320213,
-0.07714280486106873,
0.00019903508655261248,
-0.03902966529130936,
-0.043418038636446,
0.026638614013791084,
-0.08792781084775925,
-0.037944067269563675,
-0.07397687435150146,
-0.0017509142635390162,
-0.04731110855937004,
0.025314969941973686,
0.029647374525666237,
0.0030761915259063244,
0.062281202524900436,
-0.1337786614894867,
0.005008141975849867,
-0.07327607274055481,
-0.10479526966810226,
0.017596010118722916,
0.06544949859380722,
0.0918220803141594,
0.05915703997015953,
-0.030209993943572044,
0.020920461043715477,
-0.03083420917391777,
0.2520211637020111,
-0.05466263368725777,
-0.0005001317476853728,
0.10819149017333984,
0.023890385404229164,
0.046221885830163956,
0.09236655384302139,
0.03509300574660301,
-0.10154739767313004,
0.030561910942196846,
0.08537203818559647,
-0.03549152985215187,
-0.23863546550273895,
-0.006094730459153652,
-0.035092275589704514,
-0.11490244418382645,
0.08187147229909897,
0.05029423162341118,
-0.04574236273765564,
0.06518049538135529,
0.01087406650185585,
0.024470198899507523,
-0.053142331540584564,
0.09142401069402695,
0.10161007940769196,
0.07430541515350342,
0.10273347795009613,
-0.04787196218967438,
-0.021581709384918213,
0.0689009353518486,
-0.004292864818125963,
0.2925480008125305,
-0.02805548720061779,
0.06782001256942749,
0.05277140811085701,
0.14112424850463867,
-0.022107508033514023,
0.03689183294773102,
0.006196259520947933,
-0.004727722145617008,
-0.02611546777188778,
-0.05746544897556305,
-0.028168966993689537,
-0.00124235893599689,
-0.08028101921081543,
0.05603930354118347,
-0.06191584840416908,
0.0663546472787857,
0.01860693097114563,
0.26232853531837463,
-0.001505268388427794,
-0.2802633047103882,
-0.07783804833889008,
-0.02191237546503544,
-0.03908393532037735,
-0.04398150369524956,
0.01339290663599968,
0.10039286315441132,
-0.10301907360553741,
0.050807882100343704,
-0.05764084681868553,
0.08111977577209473,
-0.02822643704712391,
-0.0038399414625018835,
0.031458236277103424,
0.18636584281921387,
-0.015032917261123657,
0.05084212124347687,
-0.2006053924560547,
0.21514591574668884,
0.018770476803183556,
0.13235940039157867,
-0.05056298151612282,
0.008330619893968105,
0.023690395057201385,
-0.0003225051914341748,
0.07502895593643188,
-0.005044286604970694,
-0.0771014541387558,
-0.126919686794281,
-0.07291596382856369,
0.08149600028991699,
0.1406404823064804,
-0.013149118050932884,
0.10227163136005402,
-0.0487123504281044,
0.0188596248626709,
0.04129769653081894,
-0.06811337172985077,
-0.15773025155067444,
-0.09671362489461899,
-0.01873595640063286,
0.03796492516994476,
-0.09706021845340729,
-0.046004608273506165,
-0.07548639923334122,
-0.011473871767520905,
0.11412354558706284,
0.027528464794158936,
-0.01933850161731243,
-0.1366766393184662,
0.08907855302095413,
0.14897994697093964,
-0.0738837718963623,
0.02315218187868595,
-0.007575604133307934,
0.06377467513084412,
0.04361947998404503,
-0.09489857405424118,
0.04850776493549347,
-0.05803651362657547,
-0.15965056419372559,
-0.04613744467496872,
0.08929663151502609,
0.07207802683115005,
0.03981993347406387,
-0.005062983371317387,
0.049481283873319626,
-0.020731894299387932,
-0.10027868300676346,
0.012097600847482681,
0.03498375415802002,
0.0495278462767601,
0.03775455802679062,
-0.08455459773540497,
0.06163787841796875,
-0.03245658800005913,
-0.004448975436389446,
0.1142885684967041,
0.24125798046588898,
-0.08979640156030655,
0.08387021720409393,
0.058820758014917374,
-0.06836950778961182,
-0.14392228424549103,
0.06463354825973511,
0.10388865321874619,
-0.0004486620891839266,
0.05726506561040878,
-0.1961136907339096,
0.14345382153987885,
0.11378180980682373,
-0.011863785795867443,
0.036868613213300705,
-0.2736418545246124,
-0.11756523698568344,
0.059574831277132034,
0.13265001773834229,
0.12187428772449493,
-0.13236719369888306,
-0.013243031688034534,
-0.017448430880904198,
-0.12346039712429047,
0.10675492137670517,
-0.11422161012887955,
0.13442115485668182,
-0.034148167818784714,
0.10976281017065048,
0.004380637779831886,
-0.029883740469813347,
0.10615915805101395,
0.05134228616952896,
0.09785739332437515,
-0.04171181097626686,
0.013419026508927345,
0.05910392478108406,
-0.04790404811501503,
0.014181378297507763,
-0.07375319302082062,
0.0845930278301239,
-0.1218462660908699,
-0.00647423230111599,
-0.07871556282043457,
0.05091971158981323,
-0.03660736605525017,
-0.05216272547841072,
-0.052538733929395676,
0.035912130028009415,
0.05461382493376732,
-0.03719169646501541,
0.05348336324095726,
0.0008414069307036698,
0.0909472182393074,
0.024984370917081833,
0.06768535822629929,
-0.0019457923481240869,
-0.04673963785171509,
0.019932515919208527,
-0.00843821745365858,
0.060400962829589844,
-0.13955488801002502,
0.005209051072597504,
0.10708307474851608,
0.052138883620500565,
0.09605088829994202,
0.043093785643577576,
-0.04702970013022423,
0.012586005963385105,
0.03648149222135544,
-0.11317121982574463,
-0.10248465836048126,
0.047508399933576584,
-0.040056012570858,
-0.13883362710475922,
0.04723459109663963,
0.11453871428966522,
-0.050239019095897675,
-0.022096293047070503,
-0.01918407715857029,
0.005136324092745781,
-0.021412251517176628,
0.18579745292663574,
0.04384390264749527,
0.04088233783841133,
-0.10230869054794312,
0.13096477091312408,
0.029158342629671097,
-0.021836474537849426,
0.0583001971244812,
0.08648353815078735,
-0.0957530215382576,
0.0017466775607317686,
0.09508643299341202,
0.17455753684043884,
-0.07204966247081757,
-0.015140264295041561,
-0.10497574508190155,
-0.0715959221124649,
0.061180323362350464,
0.15840445458889008,
0.05706850066781044,
-0.019452357664704323,
-0.051155637949705124,
0.040248069912195206,
-0.1416555494070053,
0.06272736191749573,
0.031046733260154724,
0.07119327038526535,
-0.08776574581861496,
0.05770406872034073,
0.007973933592438698,
0.006822666618973017,
-0.01693449541926384,
0.014879414811730385,
-0.09267548471689224,
-0.029575573280453682,
-0.07755187153816223,
0.010470760054886341,
-0.012315199710428715,
0.016108548268675804,
-0.010566681623458862,
-0.0672791451215744,
-0.06938805431127548,
0.03727826476097107,
-0.07698878645896912,
-0.052823811769485474,
0.012702234089374542,
0.04295777902007103,
-0.13210901618003845,
0.005262884311378002,
0.01495827455073595,
-0.0882529765367508,
0.0858420580625534,
0.08774538338184357,
0.026389073580503464,
0.035079363733530045,
-0.12851402163505554,
-0.03262600675225258,
0.014261918142437935,
0.0032469644211232662,
0.06555581837892532,
-0.09256969392299652,
-0.0035572166088968515,
-0.020926302298903465,
0.07786204665899277,
0.00922755990177393,
0.08117325603961945,
-0.13130244612693787,
0.009113911539316177,
-0.08526556938886642,
-0.04410579800605774,
-0.06636051833629608,
0.016437726095318794,
0.10094032436609268,
0.05280757695436478,
0.16364938020706177,
-0.07713579386472702,
0.01823701709508896,
-0.2084597945213318,
-0.028321504592895508,
-0.005257160868495703,
-0.053966421633958817,
-0.13772736489772797,
-0.04115733131766319,
0.07748693227767944,
-0.03869815915822983,
0.09987061470746994,
-0.01981833390891552,
0.06321427971124649,
0.03836071491241455,
-0.03181666508316994,
-0.06243421137332916,
-0.02831334061920643,
0.19595088064670563,
0.07777060568332672,
-0.015787966549396515,
0.1079498901963234,
-0.004881834611296654,
0.052594952285289764,
0.031682468950748444,
0.20338410139083862,
0.20866207778453827,
0.00607614079490304,
0.07099776715040207,
0.06290095299482346,
-0.0809878259897232,
-0.06764129549264908,
0.18122205138206482,
-0.026604320853948593,
0.07394012808799744,
-0.029522256925702095,
0.18744122982025146,
0.10999105870723724,
-0.14962150156497955,
0.031278736889362335,
-0.032508138567209244,
-0.07775532454252243,
-0.14010973274707794,
0.002672145375981927,
-0.09711694717407227,
-0.11927285045385361,
0.044398482888936996,
-0.12028791010379791,
0.05522390082478523,
0.08411990851163864,
0.01154742669314146,
0.032752763479948044,
0.1273568719625473,
-0.023984616622328758,
0.0052417246624827385,
0.04213116690516472,
0.007066449616104364,
-0.028127919882535934,
-0.04004518687725067,
-0.0771697610616684,
0.050534747540950775,
0.006600784137845039,
0.08727168291807175,
-0.04591349884867668,
-0.009257277473807335,
0.04204142093658447,
-0.02886541374027729,
-0.0764436200261116,
0.02529929392039776,
0.036031678318977356,
0.05646263062953949,
0.043685127049684525,
0.044990405440330505,
-0.00688124168664217,
-0.03338417783379555,
0.2797430157661438,
-0.058417610824108124,
-0.09501496702432632,
-0.11324846744537354,
0.20645862817764282,
0.04000512510538101,
-0.028752541169524193,
0.03902490809559822,
-0.0828913077712059,
-0.011790349148213863,
0.15484605729579926,
0.1546514481306076,
-0.06515821069478989,
-0.02492580935359001,
-0.012390711344778538,
-0.017134476453065872,
-0.04087870195508003,
0.1178615391254425,
0.09588655829429626,
0.00006467298226198182,
-0.053431130945682526,
-0.02838888205587864,
-0.03642889857292175,
-0.014679031446576118,
-0.04222980514168739,
0.023360472172498703,
0.01593668945133686,
-0.02186688594520092,
-0.033820297569036484,
0.06188883259892464,
-0.0009040150907821953,
-0.2411041259765625,
0.060216426849365234,
-0.14322763681411743,
-0.16905640065670013,
-0.026092762127518654,
0.04821169376373291,
-0.009310920722782612,
0.0500895231962204,
-0.024644318968057632,
-0.002998755779117346,
0.08021891117095947,
-0.02083868533372879,
-0.056356608867645264,
-0.12393838167190552,
0.11176474392414093,
-0.05901198461651802,
0.17873039841651917,
-0.016936924308538437,
0.07055871933698654,
0.11760096251964569,
0.043081048876047134,
-0.13922344148159027,
0.04645586758852005,
0.0464823953807354,
-0.11347667127847672,
0.018744124099612236,
0.1425505429506302,
-0.04509136080741882,
0.08586849272251129,
0.04444495961070061,
-0.09468886256217957,
-0.010347390547394753,
-0.04762958362698555,
-0.025826267898082733,
-0.07109658420085907,
-0.013149624690413475,
-0.06920411437749863,
0.17037905752658844,
0.19771035015583038,
-0.024504901841282845,
0.013819887302815914,
-0.09373825788497925,
0.028081022202968597,
0.06773246824741364,
0.039924997836351395,
-0.05022956803441048,
-0.2079043686389923,
0.018870582804083824,
0.04989565908908844,
-0.003150674281641841,
-0.23369187116622925,
-0.0785973072052002,
0.04042933136224747,
-0.03492206707596779,
-0.05545482784509659,
0.0966842845082283,
0.036453742533922195,
0.048271797597408295,
-0.03732091933488846,
-0.1517672836780548,
-0.03923807293176651,
0.15444345772266388,
-0.17914408445358276,
-0.05004579573869705
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-32-finetuned-squad-seed-0
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
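Unlike the k-256 cards above, this run is capped at 200 optimizer steps (`training_steps: 200`) rather than a fixed number of epochs; in `TrainingArguments` terms that corresponds to `max_steps=200`. For completeness, here is a lower-level inference sketch without the pipeline helper; the question and context strings are placeholders.

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_id = "anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

inputs = tokenizer(
    "How many labeled examples were used?",           # placeholder question
    "The model was trained on 32 labeled examples.",  # placeholder context
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Greedy span extraction: most likely start and end token positions.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```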
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-32-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-32-finetuned-squad-seed-0
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08864407241344452,
0.09784232825040817,
-0.00256283744238317,
0.07781636714935303,
0.13956768810749054,
0.029662882909178734,
0.09692434221506119,
0.13775894045829773,
-0.11378856003284454,
0.0459609255194664,
0.0967826321721077,
0.07696039229631424,
0.02955995686352253,
0.1449601799249649,
-0.030807897448539734,
-0.24399776756763458,
-0.007745009381324053,
-0.02193952351808548,
-0.10543512552976608,
0.1134868711233139,
0.09968055784702301,
-0.09868741780519485,
0.06709038466215134,
-0.019971759989857674,
-0.17466408014297485,
0.01436395850032568,
-0.018334094434976578,
-0.05374659225344658,
0.11690156906843185,
-0.0016207268927246332,
0.07865128666162491,
0.009228293783962727,
0.1157296672463417,
-0.19539065659046173,
0.01831740140914917,
0.07220780849456787,
0.04449032247066498,
0.09419722855091095,
0.004327976610511541,
-0.02183959260582924,
0.12065167725086212,
-0.13060827553272247,
0.09767971187829971,
0.032469846308231354,
-0.09809523075819016,
-0.2060137689113617,
-0.09684836864471436,
0.01401386596262455,
0.04170791432261467,
0.08989401161670685,
0.006969246082007885,
0.14951717853546143,
-0.10647694021463394,
0.08027603477239609,
0.23023681342601776,
-0.2662705183029175,
-0.08158142119646072,
0.046028949320316315,
0.06087154522538185,
0.08466355502605438,
-0.1228194460272789,
-0.012886526994407177,
0.010034988634288311,
0.019984114915132523,
0.10067155957221985,
-0.02732682228088379,
-0.0747516006231308,
0.013012521900236607,
-0.11086197942495346,
-0.0007921176147647202,
0.11642817407846451,
0.03795453906059265,
-0.05487893149256706,
-0.08024127781391144,
-0.03773960471153259,
-0.060565583407878876,
-0.034301016479730606,
-0.013886809349060059,
0.03559210151433945,
-0.0617915503680706,
-0.14369787275791168,
-0.046560801565647125,
-0.04942861199378967,
-0.09216044098138809,
0.0003670846053864807,
0.21092990040779114,
0.03837208449840546,
0.021448982879519463,
-0.04949463903903961,
0.10583887249231339,
0.016246477141976357,
-0.12444843351840973,
-0.034261636435985565,
-0.006187611725181341,
-0.0934901013970375,
-0.034284163266420364,
-0.05603623762726784,
0.03029094636440277,
0.040087729692459106,
0.2285301834344864,
-0.027531849220395088,
0.0733555257320404,
0.03655265271663666,
-0.013331491500139236,
-0.027269231155514717,
0.14692023396492004,
-0.02783242240548134,
-0.07789505273103714,
0.009610055014491081,
0.06400682032108307,
0.00622128788381815,
-0.006224116776138544,
-0.06169171631336212,
-0.04458543285727501,
0.061297714710235596,
0.057482197880744934,
-0.04982795938849449,
0.027303962036967278,
-0.009358425624668598,
-0.023639706894755363,
-0.0003783086722251028,
-0.11884212493896484,
0.008712545968592167,
-0.008441358804702759,
-0.07978150248527527,
-0.05328932777047157,
0.014117554761469364,
-0.012435603886842728,
0.008859481662511826,
0.09040189534425735,
-0.07369542866945267,
-0.03727478161454201,
-0.07947509735822678,
-0.07482670992612839,
-0.014721004292368889,
-0.15994898974895477,
0.021435976028442383,
-0.06982365250587463,
-0.15211418271064758,
-0.031904615461826324,
0.050818055868148804,
-0.07927175611257553,
-0.034065477550029755,
-0.03263991326093674,
-0.08062121272087097,
0.018556833267211914,
0.0006777389207854867,
0.2133820354938507,
-0.049251947551965714,
0.09634438902139664,
0.014347203075885773,
0.05344314128160477,
0.004868769086897373,
0.037952374666929245,
-0.07963069528341293,
0.013737081550061703,
-0.1737469732761383,
0.07609042525291443,
-0.08939612656831741,
0.0273971538990736,
-0.14574605226516724,
-0.08793266862630844,
-0.0027273152954876423,
-0.01967528648674488,
0.08297263830900192,
0.10524384677410126,
-0.1252318173646927,
-0.024089915677905083,
0.12686510384082794,
-0.056039582937955856,
-0.05725488439202309,
0.06418859958648682,
-0.07383856922388077,
0.0814836174249649,
0.04892770200967789,
0.18745313584804535,
0.09882435202598572,
-0.10742183774709702,
0.024402914568781853,
0.01830768771469593,
0.03794723376631737,
0.0022157200146466494,
0.057293765246868134,
-0.0001875434973044321,
0.02181876450777054,
0.01519798580557108,
-0.0834275633096695,
0.016348714008927345,
-0.09251649677753448,
-0.06223874166607857,
-0.04159877449274063,
-0.08974589407444,
0.007015796843916178,
0.009082915261387825,
0.028856150805950165,
-0.07957777380943298,
-0.08862336724996567,
0.06317241489887238,
0.14079231023788452,
-0.04621821641921997,
0.010818012058734894,
-0.07993358373641968,
0.0034718778915703297,
-0.028770748525857925,
-0.022319884970784187,
-0.19564644992351532,
-0.06059151887893677,
0.024917012080550194,
0.011259682476520538,
0.044729627668857574,
-0.0075352732092142105,
0.08014345914125443,
0.02222706563770771,
-0.05021326616406441,
-0.004348761402070522,
-0.0912383645772934,
-0.010512182489037514,
-0.08986970037221909,
-0.21701116859912872,
-0.05539451912045479,
-0.039097029715776443,
0.15330757200717926,
-0.17170436680316925,
0.0005148820928297937,
-0.018406078219413757,
0.11008185148239136,
0.042292915284633636,
-0.050940800458192825,
-0.00021939596626907587,
0.030217904597520828,
0.015109039843082428,
-0.09945381432771683,
0.03459492325782776,
0.016546251252293587,
-0.09964775294065475,
-0.02498781308531761,
-0.10276665538549423,
0.0019347632769495249,
0.07522745430469513,
0.07420825958251953,
-0.10671789944171906,
-0.01878899522125721,
-0.06286929547786713,
-0.028579087927937508,
-0.04723767563700676,
0.034786757081747055,
0.17235605418682098,
0.018105270341038704,
0.11264261603355408,
-0.07496019452810287,
-0.08572084456682205,
0.017957625910639763,
0.008737805299460888,
0.058628782629966736,
0.11341266334056854,
0.07696588337421417,
-0.10017827153205872,
0.058862511068582535,
0.08738578855991364,
-0.04908990114927292,
0.14098067581653595,
-0.049352601170539856,
-0.07956066727638245,
-0.03038983792066574,
0.0015850591007620096,
-0.0009958044392988086,
0.1515493392944336,
-0.047154445201158524,
0.010343740694224834,
0.03536411002278328,
0.02997482195496559,
0.008513862267136574,
-0.1618170291185379,
-0.02424849011003971,
0.02066221460700035,
-0.048402413725852966,
-0.029220610857009888,
0.012456510215997696,
0.013225789181888103,
0.09124094247817993,
0.0499076172709465,
0.0015248944982886314,
0.005446381401270628,
-0.013869750313460827,
-0.04905027523636818,
0.20125220715999603,
-0.0917687714099884,
-0.046002406626939774,
-0.08511870354413986,
-0.00041827227687463164,
-0.010993880219757557,
-0.03559720143675804,
0.01681559346616268,
-0.10067292302846909,
-0.02296740561723709,
-0.0650702640414238,
0.005161145236343145,
-0.0476163774728775,
0.011143064126372337,
0.0005074051441624761,
0.01927180401980877,
0.05699591711163521,
-0.13411211967468262,
0.014163713902235031,
-0.06197468936443329,
-0.11066345870494843,
0.030228300020098686,
0.05141833424568176,
0.08752278983592987,
0.06263597309589386,
-0.026298394426703453,
0.017662763595581055,
-0.04626720771193504,
0.23142008483409882,
-0.08647258579730988,
0.004175597336143255,
0.12448040395975113,
0.02421015128493309,
0.037110257893800735,
0.10003145039081573,
0.02816995047032833,
-0.0992683693766594,
0.044346898794174194,
0.07550747692584991,
-0.04424731805920601,
-0.25332021713256836,
0.008720523677766323,
-0.044189825654029846,
-0.08258120715618134,
0.08903659135103226,
0.048257481306791306,
-0.03901390731334686,
0.06499283760786057,
0.002256476553156972,
0.007653376087546349,
-0.022204674780368805,
0.08830732107162476,
0.08439584076404572,
0.05539507418870926,
0.10551503300666809,
-0.041330303996801376,
-0.017955342307686806,
0.06548389792442322,
0.027634194120764732,
0.30681857466697693,
-0.04847605153918266,
0.10078930854797363,
0.05297311022877693,
0.14732244610786438,
-0.02119525708258152,
0.036857083439826965,
0.012191184796392918,
-0.00462328502908349,
-0.028488565236330032,
-0.054117925465106964,
-0.026487838476896286,
0.0061135911382734776,
-0.06749244034290314,
0.042362578213214874,
-0.05623878538608551,
0.04564923048019409,
0.01637236215174198,
0.2922035753726959,
0.0018120546592399478,
-0.26490849256515503,
-0.1014583632349968,
-0.012158144265413284,
-0.037405773997306824,
-0.050349220633506775,
0.013532178476452827,
0.1192038506269455,
-0.13064910471439362,
0.024936456233263016,
-0.06614450365304947,
0.0846901535987854,
-0.02828093059360981,
-0.00607228884473443,
0.0392582044005394,
0.16266058385372162,
-0.02164488285779953,
0.06077911704778671,
-0.2218470573425293,
0.23058541119098663,
0.007936200127005577,
0.12179610133171082,
-0.05436166003346443,
0.006769230123609304,
0.024451306089758873,
-0.0008643636247143149,
0.0951826348900795,
-0.00317506049759686,
-0.04559830576181412,
-0.13899332284927368,
-0.052373819053173065,
0.07105099409818649,
0.13974182307720184,
-0.05240297317504883,
0.10224728286266327,
-0.05999571457505226,
0.011181918904185295,
0.037043988704681396,
-0.08125726878643036,
-0.12020495533943176,
-0.10286101698875427,
-0.019354023039340973,
0.0008230130770243704,
-0.06075986847281456,
-0.06478069722652435,
-0.0666278526186943,
0.025688331574201584,
0.11616834253072739,
-0.002204709453508258,
-0.03446604311466217,
-0.14901438355445862,
0.0740949958562851,
0.1547592282295227,
-0.06783957779407501,
0.03274663910269737,
0.0023831722792237997,
0.07974593341350555,
0.03532900661230087,
-0.07747786492109299,
0.06298493593931198,
-0.06644658744335175,
-0.17953284084796906,
-0.04799830541014671,
0.10283250361680984,
0.07124915719032288,
0.04136540740728378,
-0.006696648430079222,
0.04750434681773186,
-0.027309367433190346,
-0.09116373211145401,
0.028934650123119354,
0.03112991340458393,
0.03670824319124222,
0.04308873414993286,
-0.07667835056781769,
0.08458705246448517,
-0.04370874539017677,
-0.019863862544298172,
0.12216692417860031,
0.23095540702342987,
-0.10428136587142944,
0.09531548619270325,
0.05665893107652664,
-0.05984436348080635,
-0.16662749648094177,
0.07187940925359726,
0.10487248003482819,
0.014151203446090221,
0.058228831738233566,
-0.21714022755622864,
0.12075797468423843,
0.1017771065235138,
-0.013385512866079807,
0.03973018005490303,
-0.27901193499565125,
-0.11983589082956314,
0.0504537932574749,
0.12525585293769836,
0.08487514406442642,
-0.12577109038829803,
-0.01889139786362648,
-0.01549072377383709,
-0.12831692397594452,
0.07933223247528076,
-0.11390722543001175,
0.13178211450576782,
-0.024050770327448845,
0.11012432724237442,
0.012681386433541775,
-0.027520224452018738,
0.10787320882081985,
0.04899272695183754,
0.09662759304046631,
-0.04275589436292648,
0.0010175135685130954,
0.060005370527505875,
-0.04895162582397461,
0.00014458743680734187,
-0.06881872564554214,
0.08911973237991333,
-0.13738171756267548,
-0.007689903024584055,
-0.08854804933071136,
0.04237891733646393,
-0.040582627058029175,
-0.0664578527212143,
-0.0413687564432621,
0.05739028751850128,
0.04503615200519562,
-0.03305838629603386,
0.040765173733234406,
-0.026328597217798233,
0.10318753123283386,
0.02516166865825653,
0.08668675273656845,
0.021040156483650208,
-0.05527082085609436,
0.02187936380505562,
-0.0126187764108181,
0.06416648626327515,
-0.1696183681488037,
0.009354747831821442,
0.09852340072393417,
0.0690511018037796,
0.1017366498708725,
0.041292402893304825,
-0.04693710058927536,
0.017250020056962967,
0.028589162975549698,
-0.09678571671247482,
-0.1136201024055481,
0.039511583745479584,
-0.03688264265656471,
-0.14694145321846008,
0.042486101388931274,
0.11947400122880936,
-0.0390293151140213,
-0.030778560787439346,
-0.019657449796795845,
0.0025873659178614616,
-0.02151319570839405,
0.18021732568740845,
0.06054641306400299,
0.05961396545171738,
-0.10230900347232819,
0.12060334533452988,
0.03390192613005638,
-0.02721443399786949,
0.05225576087832451,
0.08186683058738708,
-0.10068869590759277,
-0.005808683577924967,
0.07689748704433441,
0.12520228326320648,
-0.05553784593939781,
0.00019617275393102318,
-0.10217360407114029,
-0.08652643859386444,
0.05872217193245888,
0.14305399358272552,
0.04910176992416382,
-0.015373635105788708,
-0.046880096197128296,
0.045557454228401184,
-0.14025822281837463,
0.07418794929981232,
0.03310380503535271,
0.0631815642118454,
-0.07720158249139786,
0.06558053940534592,
0.003424703376367688,
0.016446365043520927,
-0.01460103690624237,
0.002368328860029578,
-0.09653706103563309,
-0.007146604359149933,
-0.0860045775771141,
0.001170040457509458,
-0.0005905759171582758,
0.01829317770898342,
-0.02468947134912014,
-0.0704532191157341,
-0.044591013342142105,
0.037055015563964844,
-0.0858222022652626,
-0.050933729857206345,
0.007225187495350838,
0.04301803186535835,
-0.12431107461452484,
-0.00396468723192811,
0.028492063283920288,
-0.09799177944660187,
0.0984954833984375,
0.0736762210726738,
0.02075456827878952,
0.02823779173195362,
-0.1187606230378151,
-0.033752404153347015,
-0.014369807206094265,
-0.009735196828842163,
0.059095948934555054,
-0.09920170903205872,
-0.004534258972853422,
-0.046276163309812546,
0.06171487644314766,
0.01365596242249012,
0.0627467930316925,
-0.1401141732931137,
0.014445709995925426,
-0.0687832236289978,
-0.04429515451192856,
-0.07889734208583832,
0.04069012030959129,
0.09444978088140488,
0.05964626371860504,
0.14001132547855377,
-0.07510800659656525,
0.025104442611336708,
-0.20300287008285522,
-0.03622276708483696,
-0.012911025434732437,
-0.05566999316215515,
-0.14527377486228943,
-0.04655720666050911,
0.0824638232588768,
-0.03958054259419441,
0.08899229019880295,
-0.0276944637298584,
0.07297864556312561,
0.037563081830739975,
-0.05168696865439415,
-0.03131691738963127,
-0.008760298602283001,
0.20423810184001923,
0.07161695510149002,
-0.015036491677165031,
0.10222168266773224,
0.0004599676176439971,
0.03038969077169895,
0.04592592269182205,
0.1709226369857788,
0.2222365140914917,
0.043091848492622375,
0.04959990456700325,
0.06287022680044174,
-0.07739373296499252,
-0.0703001394867897,
0.17309877276420593,
-0.010226615704596043,
0.06747176498174667,
-0.0465773306787014,
0.19338996708393097,
0.11853992938995361,
-0.16699723899364471,
0.04752297326922417,
-0.04578106477856636,
-0.08174643665552139,
-0.11836151778697968,
-0.009646723046898842,
-0.08259943127632141,
-0.12293816357851028,
0.03708041459321976,
-0.11888650804758072,
0.04575340077280998,
0.10838087648153305,
0.015024027787148952,
0.03491898253560066,
0.12650971114635468,
-0.011373440735042095,
-0.007468742318451405,
0.06867103278636932,
0.004887626506388187,
-0.011022606864571571,
-0.041796233505010605,
-0.0751602053642273,
0.05938882753252983,
0.0009728763834573328,
0.08174893260002136,
-0.04590649902820587,
-0.015048683620989323,
0.03103523887693882,
-0.030399981886148453,
-0.07918225973844528,
0.025243369862437248,
0.04557127133011818,
0.05444492772221565,
0.05164002254605293,
0.04192344844341278,
-0.010989886708557606,
-0.03319127485156059,
0.3258166015148163,
-0.06743109226226807,
-0.1031891480088234,
-0.12543916702270508,
0.21612440049648285,
0.032540541142225266,
-0.027447490021586418,
0.032130166888237,
-0.08651480823755264,
-0.0010332614183425903,
0.16699637472629547,
0.17622190713882446,
-0.06613525748252869,
-0.02008405141532421,
0.00040069103124551475,
-0.015945756807923317,
-0.031999215483665466,
0.12404046952724457,
0.09730277955532074,
-0.012169444933533669,
-0.06111849844455719,
-0.01674445904791355,
-0.015823841094970703,
-0.03347295895218849,
-0.04075295850634575,
0.04954598471522331,
0.019951054826378822,
-0.027761582285165787,
-0.043764930218458176,
0.07633571326732635,
0.0030194211285561323,
-0.2548550069332123,
0.06445083022117615,
-0.1559857726097107,
-0.17628876864910126,
-0.050634920597076416,
0.029463069513440132,
0.004640849307179451,
0.05652492120862007,
-0.014220990240573883,
0.0036445145960897207,
0.08887779712677002,
-0.010041608475148678,
-0.03449932485818863,
-0.12049316614866257,
0.1230430155992508,
-0.046800240874290466,
0.16888996958732605,
-0.03186154365539551,
0.042896345257759094,
0.11674642562866211,
0.030692383646965027,
-0.13422469794750214,
0.03720245882868767,
0.06284024566411972,
-0.09719192981719971,
0.019612908363342285,
0.1486099511384964,
-0.047948163002729416,
0.10098210722208023,
0.04585620015859604,
-0.10868619382381439,
0.0009951312094926834,
-0.06708969175815582,
-0.03431606665253639,
-0.08727108687162399,
-0.010921305976808071,
-0.06472951918840408,
0.16699489951133728,
0.2215237021446228,
-0.037648726254701614,
0.008174135349690914,
-0.09783699363470078,
0.012911401689052582,
0.07021857798099518,
0.035346873104572296,
-0.04968913272023201,
-0.1812686324119568,
0.00533993449062109,
0.058828551322221756,
-0.0015500761801376939,
-0.2532603442668915,
-0.07262808084487915,
0.03749667853116989,
-0.028567921370267868,
-0.03270462527871132,
0.10973606258630753,
0.04684299975633621,
0.05065753310918808,
-0.03134768083691597,
-0.15146523714065552,
-0.033448848873376846,
0.15582744777202606,
-0.1734907478094101,
-0.03617288917303085
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-32-finetuned-squad-seed-10

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
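
As a hedged reconstruction (assumptions flagged in comments), the hyperparameters listed above map onto `transformers.TrainingArguments` roughly as follows; the actual training script and output directory are not documented in this record. Note that the card's training seed is 42 even though the repository name says seed-10, so the name presumably refers to the few-shot data split rather than the Trainer seed.

```python
# Hedged reconstruction of the configuration listed above. The output_dir is
# hypothetical; Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer
# defaults, so they are not set explicitly here.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-32-finetuned-squad-seed-10",  # hypothetical
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```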
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-32-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-32-finetuned-squad-seed-10

This model is a fine-tuned version of roberta-base on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08913476020097733,
0.09544762969017029,
-0.0024918289855122566,
0.07797182351350784,
0.13971301913261414,
0.029876135289669037,
0.09744683653116226,
0.13623309135437012,
-0.11241256445646286,
0.04504432901740074,
0.09562183171510696,
0.07830988615751266,
0.029696308076381683,
0.1455574929714203,
-0.030198708176612854,
-0.24465151131153107,
-0.007827820256352425,
-0.021566811949014664,
-0.10393026471138,
0.11343705654144287,
0.09980769455432892,
-0.09920801967382431,
0.06688990443944931,
-0.01990699954330921,
-0.17488795518875122,
0.014705432578921318,
-0.018066851422190666,
-0.05264003574848175,
0.11725584417581558,
-0.003245994448661804,
0.07825826108455658,
0.008716678246855736,
0.11769183725118637,
-0.19380992650985718,
0.01835167407989502,
0.07162576168775558,
0.044421687722206116,
0.09382418543100357,
0.004030486103147268,
-0.020749414339661598,
0.12029780447483063,
-0.13134709000587463,
0.09798453003168106,
0.031801458448171616,
-0.09815775603055954,
-0.2047460526227951,
-0.09707007557153702,
0.013438967056572437,
0.04307997226715088,
0.08877576142549515,
0.006722795311361551,
0.14984405040740967,
-0.10680267959833145,
0.08060847967863083,
0.23141933977603912,
-0.26768240332603455,
-0.08125988394021988,
0.04777728393673897,
0.06266796588897705,
0.08551643788814545,
-0.12253543734550476,
-0.01364986039698124,
0.010672672651708126,
0.019660938531160355,
0.10162712633609772,
-0.02796580083668232,
-0.07479242980480194,
0.01268053986132145,
-0.11225005984306335,
0.000963796628639102,
0.11553234606981277,
0.03848208487033844,
-0.05467139557003975,
-0.08020160347223282,
-0.03854086995124817,
-0.061004262417554855,
-0.0355030782520771,
-0.014713468961417675,
0.03617348149418831,
-0.062180619686841965,
-0.1426309198141098,
-0.046001136302948,
-0.04838100075721741,
-0.0924815982580185,
0.00050178705714643,
0.21018297970294952,
0.038896095007658005,
0.02115764282643795,
-0.04940551146864891,
0.1049661636352539,
0.013454906642436981,
-0.12438953667879105,
-0.033017028123140335,
-0.0057421582750976086,
-0.09456213563680649,
-0.03529275208711624,
-0.05671033635735512,
0.03020109422504902,
0.039321377873420715,
0.22859837114810944,
-0.025329148396849632,
0.07364820688962936,
0.03814990445971489,
-0.0136901019141078,
-0.02709881216287613,
0.14736224710941315,
-0.028395947068929672,
-0.07960230857133865,
0.009543024934828281,
0.06427111476659775,
0.006687615532428026,
-0.005167899187654257,
-0.06269107013940811,
-0.04486256092786789,
0.061806827783584595,
0.057035915553569794,
-0.05206169933080673,
0.02777203731238842,
-0.00905192643404007,
-0.023267775774002075,
0.0009372358908876777,
-0.11940661817789078,
0.009414758533239365,
-0.008953831158578396,
-0.08074281364679337,
-0.053227659314870834,
0.012616285122931004,
-0.011558299884200096,
0.00934243481606245,
0.08947191387414932,
-0.07400782406330109,
-0.03656425327062607,
-0.0800844132900238,
-0.07596397399902344,
-0.014696566388010979,
-0.1624152809381485,
0.022072747349739075,
-0.06864394247531891,
-0.15446758270263672,
-0.032446619123220444,
0.04992302134633064,
-0.07865946739912033,
-0.03393344208598137,
-0.03383408859372139,
-0.08065156638622284,
0.017274046316742897,
0.0016360909212380648,
0.21465560793876648,
-0.04878990724682808,
0.09501230716705322,
0.01436309702694416,
0.05504132807254791,
0.003910001367330551,
0.038044724613428116,
-0.0791732594370842,
0.01365233026444912,
-0.1743859201669693,
0.07561617344617844,
-0.08928097039461136,
0.0293433740735054,
-0.14484530687332153,
-0.08764485269784927,
-0.0032609652262181044,
-0.019344091415405273,
0.08285707235336304,
0.1047930046916008,
-0.12797501683235168,
-0.02305656485259533,
0.12752166390419006,
-0.055812232196331024,
-0.05651693418622017,
0.06285831332206726,
-0.07409150898456573,
0.08206132054328918,
0.05047786980867386,
0.1876334697008133,
0.09884658455848694,
-0.1069740578532219,
0.024546490982174873,
0.018365267664194107,
0.037484657019376755,
0.0003479876322671771,
0.05660302937030792,
0.000749311177060008,
0.02324337139725685,
0.015245006419718266,
-0.08167381584644318,
0.01571519859135151,
-0.09213583171367645,
-0.06219383329153061,
-0.042034294456243515,
-0.09058548510074615,
0.006593972444534302,
0.009083684533834457,
0.029158547520637512,
-0.08017555624246597,
-0.08779294788837433,
0.06451254338026047,
0.14063319563865662,
-0.046067625284194946,
0.010246578603982925,
-0.07962595671415329,
0.002926238812506199,
-0.029203934594988823,
-0.022526849061250687,
-0.19582197070121765,
-0.05939050391316414,
0.02546926774084568,
0.009601493366062641,
0.04485104978084564,
-0.005168223287910223,
0.08070025593042374,
0.022038785740733147,
-0.04951335862278938,
-0.004037461243569851,
-0.09051468968391418,
-0.010431419126689434,
-0.09161555022001266,
-0.21669672429561615,
-0.05576787143945694,
-0.039197273552417755,
0.1522262543439865,
-0.17199356853961945,
0.0008286038064397871,
-0.02060595713555813,
0.10972364991903305,
0.041421983391046524,
-0.05067989602684975,
-0.00022510666167363524,
0.029836339876055717,
0.015753058716654778,
-0.09961748123168945,
0.03479796648025513,
0.01532234251499176,
-0.09887926280498505,
-0.026461970061063766,
-0.1041388213634491,
0.00022262804850470275,
0.07460544258356094,
0.07506453990936279,
-0.10644450038671494,
-0.018110590055584908,
-0.06254885345697403,
-0.028039557859301567,
-0.046128615736961365,
0.034605760127305984,
0.17123301327228546,
0.017269199714064598,
0.1122766062617302,
-0.07529464364051819,
-0.0866328701376915,
0.017769554629921913,
0.009456248953938484,
0.059921011328697205,
0.11322232335805893,
0.07755019515752792,
-0.09986578673124313,
0.058156050741672516,
0.08921235054731369,
-0.049060240387916565,
0.13956771790981293,
-0.04946143180131912,
-0.07979178428649902,
-0.030352193862199783,
0.0005106287426315248,
-0.002077146666124463,
0.1511155366897583,
-0.04765719547867775,
0.008902234025299549,
0.034639522433280945,
0.029502546414732933,
0.008205022662878036,
-0.16259054839611053,
-0.02379823662340641,
0.020220236852765083,
-0.04776537045836449,
-0.03009628877043724,
0.01232309639453888,
0.012431193143129349,
0.0912468358874321,
0.04892878979444504,
0.0002889569732360542,
0.0047555845230817795,
-0.013894224539399147,
-0.04841964319348335,
0.20146621763706207,
-0.09135783463716507,
-0.043906185775995255,
-0.08333415538072586,
-0.0009001709404401481,
-0.010104852728545666,
-0.03589553385972977,
0.015990346670150757,
-0.10085029155015945,
-0.023048564791679382,
-0.06500735878944397,
0.004492027685046196,
-0.046895623207092285,
0.011246786452829838,
0.0018766539869830012,
0.019085563719272614,
0.056019335985183716,
-0.13448412716388702,
0.01403612270951271,
-0.06315223127603531,
-0.1109173446893692,
0.02955249883234501,
0.05154819414019585,
0.08705531060695648,
0.06388043612241745,
-0.026416735723614693,
0.017628589645028114,
-0.045842040330171585,
0.23199449479579926,
-0.08599097281694412,
0.005840931087732315,
0.12420619279146194,
0.024408157914876938,
0.03723769634962082,
0.10115998983383179,
0.02815401926636696,
-0.09967995434999466,
0.04402288794517517,
0.07503701746463776,
-0.04369296878576279,
-0.2535930275917053,
0.00875521544367075,
-0.04360680654644966,
-0.08471663296222687,
0.08879439532756805,
0.04795355722308159,
-0.038476020097732544,
0.06596001237630844,
0.0030915962997823954,
0.00885040033608675,
-0.02238932065665722,
0.08761801570653915,
0.08311334252357483,
0.05505133792757988,
0.10585054010152817,
-0.04177876561880112,
-0.01889660395681858,
0.06421550363302231,
0.02751900628209114,
0.30740293860435486,
-0.04692083224654198,
0.09998823702335358,
0.05251263827085495,
0.14659588038921356,
-0.02154146321117878,
0.03881044313311577,
0.012448916211724281,
-0.005578945390880108,
-0.028765860944986343,
-0.05395881459116936,
-0.024385491386055946,
0.006163540296256542,
-0.06816088408231735,
0.04249485954642296,
-0.055476777255535126,
0.04513945430517197,
0.01609547808766365,
0.2902465760707855,
0.003411072539165616,
-0.2653742730617523,
-0.10001667588949203,
-0.011993380263447762,
-0.038547974079847336,
-0.04967978596687317,
0.013329851441085339,
0.11767508089542389,
-0.13008145987987518,
0.025962991639971733,
-0.06589408963918686,
0.08489341288805008,
-0.027575857937335968,
-0.005487607792019844,
0.038366831839084625,
0.16417710483074188,
-0.021430715918540955,
0.060849107801914215,
-0.22261404991149902,
0.2298145592212677,
0.008086256682872772,
0.12245360016822815,
-0.05501233786344528,
0.006251305807381868,
0.024549078196287155,
-0.0011490487959235907,
0.09515424817800522,
-0.0030810609459877014,
-0.046097464859485626,
-0.13914285600185394,
-0.05227081850171089,
0.07101879268884659,
0.1403331756591797,
-0.050570663064718246,
0.10248871147632599,
-0.05922742933034897,
0.010309925302863121,
0.03742160275578499,
-0.08211734890937805,
-0.12147238105535507,
-0.10176357626914978,
-0.02017735131084919,
0.0002826714189723134,
-0.062291987240314484,
-0.0639856681227684,
-0.06676425039768219,
0.02300890162587166,
0.11427634209394455,
0.00016461646009702235,
-0.034531768411397934,
-0.14828340709209442,
0.07443398237228394,
0.1551593542098999,
-0.06771042197942734,
0.03355518355965614,
0.002960746642202139,
0.07986282557249069,
0.035385213792324066,
-0.07750127464532852,
0.06320800632238388,
-0.06619958579540253,
-0.17897184193134308,
-0.04751934856176376,
0.10370320826768875,
0.07196514308452606,
0.04142921045422554,
-0.005543692037463188,
0.04753020405769348,
-0.027718426659703255,
-0.09124454855918884,
0.02815883792936802,
0.03111560456454754,
0.03578091785311699,
0.043182842433452606,
-0.07759839296340942,
0.08359028398990631,
-0.04363015294075012,
-0.018203014507889748,
0.12181201577186584,
0.22869013249874115,
-0.10420988500118256,
0.09467804431915283,
0.05580632761120796,
-0.06013962998986244,
-0.16621267795562744,
0.07290465384721756,
0.10497341305017471,
0.014249227941036224,
0.058902595192193985,
-0.21550555527210236,
0.12218447029590607,
0.10170313715934753,
-0.012862592935562134,
0.03962477669119835,
-0.27948272228240967,
-0.11985842883586884,
0.0509418323636055,
0.12582187354564667,
0.08531492948532104,
-0.12539197504520416,
-0.018441326916217804,
-0.01663956232368946,
-0.12818947434425354,
0.07950186729431152,
-0.11470314115285873,
0.1314334124326706,
-0.02410302311182022,
0.10991746932268143,
0.012456512078642845,
-0.027556898072361946,
0.10638365894556046,
0.05118721351027489,
0.09741401672363281,
-0.04305471107363701,
-0.000008731316484045237,
0.06154107302427292,
-0.048429735004901886,
0.0010680722771212459,
-0.06846121698617935,
0.08866497874259949,
-0.13591596484184265,
-0.007494866847991943,
-0.08808243274688721,
0.04183310642838478,
-0.040311627089977264,
-0.06599339842796326,
-0.04126794636249542,
0.05684224143624306,
0.044503677636384964,
-0.033171165734529495,
0.03942287340760231,
-0.024979205802083015,
0.1025385856628418,
0.021746117621660233,
0.08747149258852005,
0.020345717668533325,
-0.054057687520980835,
0.02229420840740204,
-0.012235301546752453,
0.06317684054374695,
-0.17018289864063263,
0.008841226808726788,
0.09870124608278275,
0.06914311647415161,
0.1014670729637146,
0.041274696588516235,
-0.04692921042442322,
0.016548344865441322,
0.028336476534605026,
-0.09527748078107834,
-0.11486166715621948,
0.04050247743725777,
-0.03824298456311226,
-0.14680129289627075,
0.04444032907485962,
0.11772125214338303,
-0.039226844906806946,
-0.031528327614068985,
-0.020542005077004433,
0.0022589031141251326,
-0.021468915045261383,
0.18147405982017517,
0.06149901822209358,
0.05936835706233978,
-0.10303244739770889,
0.12008132040500641,
0.03344893828034401,
-0.02576986514031887,
0.05155928060412407,
0.08269046247005463,
-0.10150977224111557,
-0.005705587100237608,
0.07864762842655182,
0.1269695907831192,
-0.053958430886268616,
0.0004094488103874028,
-0.10247019678354263,
-0.08678876608610153,
0.05881601572036743,
0.1444595754146576,
0.04990042746067047,
-0.016544463112950325,
-0.04659942165017128,
0.045730940997600555,
-0.13993680477142334,
0.07365503162145615,
0.03312542662024498,
0.06352602690458298,
-0.0770215317606926,
0.0662110447883606,
0.00346630928106606,
0.01685480587184429,
-0.014658010564744473,
0.0035929977893829346,
-0.09644494205713272,
-0.00803763885051012,
-0.08587157726287842,
-0.00033140924642793834,
-0.0013761199079453945,
0.018235066905617714,
-0.025014303624629974,
-0.07049666345119476,
-0.044544070959091187,
0.03659383952617645,
-0.08598276972770691,
-0.05110012739896774,
0.007061101961880922,
0.041839923709630966,
-0.1237926110625267,
-0.0041185724548995495,
0.027588894590735435,
-0.09707379341125488,
0.09769897907972336,
0.0729011669754982,
0.02136627770960331,
0.02950066514313221,
-0.11966244131326675,
-0.03341834992170334,
-0.01350855641067028,
-0.009467816911637783,
0.0596398301422596,
-0.09831097722053528,
-0.004740309435874224,
-0.04586959257721901,
0.06370949000120163,
0.012912388890981674,
0.06066577509045601,
-0.13913941383361816,
0.0146340848878026,
-0.06995847076177597,
-0.04427425563335419,
-0.07913347333669662,
0.04037444666028023,
0.09370600432157516,
0.05884822458028793,
0.14073649048805237,
-0.0740254670381546,
0.02494976297020912,
-0.2037368267774582,
-0.036612652242183685,
-0.013375836424529552,
-0.056274935603141785,
-0.1452118307352066,
-0.0472220815718174,
0.08276136964559555,
-0.03966706246137619,
0.0908302590250969,
-0.026962097734212875,
0.07344897091388702,
0.036789391189813614,
-0.04752756655216217,
-0.03181425482034683,
-0.008421488106250763,
0.20440217852592468,
0.07137291133403778,
-0.015155081637203693,
0.10158953070640564,
0.0014825069811195135,
0.03081975318491459,
0.04390967637300491,
0.17034859955310822,
0.2217106968164444,
0.04160204529762268,
0.04970654472708702,
0.06369432806968689,
-0.07726548612117767,
-0.06883611530065536,
0.17471261322498322,
-0.01132223755121231,
0.06603942066431046,
-0.04654908925294876,
0.19613492488861084,
0.117213174700737,
-0.166750967502594,
0.04787040874361992,
-0.04507811367511749,
-0.08249092102050781,
-0.11798622459173203,
-0.010467298328876495,
-0.08321511745452881,
-0.12260279059410095,
0.03707408532500267,
-0.11875887215137482,
0.04556163400411606,
0.10900576412677765,
0.014773466624319553,
0.03445837274193764,
0.1272628754377365,
-0.011769942939281464,
-0.0068089659325778484,
0.068386010825634,
0.004746271297335625,
-0.010744805447757244,
-0.04122781753540039,
-0.07424788177013397,
0.059454042464494705,
0.0003756377554964274,
0.08154323697090149,
-0.04730398207902908,
-0.01605544053018093,
0.03053753264248371,
-0.02989238314330578,
-0.07886174321174622,
0.025502780452370644,
0.045197248458862305,
0.05399606004357338,
0.049977466464042664,
0.04255933314561844,
-0.011339474469423294,
-0.03358699381351471,
0.32413560152053833,
-0.06739386916160583,
-0.10445805639028549,
-0.12450997531414032,
0.2149120718240738,
0.0331357941031456,
-0.027330921962857246,
0.03244313225150108,
-0.08630695194005966,
0.00025232622283510864,
0.16789624094963074,
0.17630748450756073,
-0.06586920469999313,
-0.020305713638663292,
0.00001596123183844611,
-0.01641855016350746,
-0.03244045004248619,
0.12431022524833679,
0.09769898653030396,
-0.014221089892089367,
-0.0607595331966877,
-0.016480768099427223,
-0.015328912064433098,
-0.03392661735415459,
-0.04145227000117302,
0.048388876020908356,
0.02102622762322426,
-0.027838926762342453,
-0.04220286011695862,
0.07704256474971771,
0.004678658675402403,
-0.2545279264450073,
0.06254333257675171,
-0.1551569253206253,
-0.176242396235466,
-0.0508931465446949,
0.029708635061979294,
0.006250095088034868,
0.05643242225050926,
-0.014365315437316895,
0.004054322838783264,
0.08832284808158875,
-0.010075857862830162,
-0.03512272983789444,
-0.12089656293392181,
0.123240165412426,
-0.04961573705077171,
0.1675845831632614,
-0.031716488301754,
0.044017668813467026,
0.11653278023004532,
0.029731474816799164,
-0.1334586888551712,
0.0380709283053875,
0.06222527101635933,
-0.09645560383796692,
0.02058366872370243,
0.1481228470802307,
-0.04734810069203377,
0.09742799401283264,
0.04524178430438042,
-0.10867010802030563,
0.001614108681678772,
-0.06617451459169388,
-0.0344562754034996,
-0.08771397173404694,
-0.009557974524796009,
-0.06453852355480194,
0.1674763560295105,
0.2215500771999359,
-0.037674132734537125,
0.00891915149986744,
-0.09830385446548462,
0.011933460831642151,
0.07027080655097961,
0.0350007601082325,
-0.05018829181790352,
-0.1817372888326645,
0.005034797824919224,
0.057619206607341766,
-0.0014575350796803832,
-0.2510373592376709,
-0.07175316661596298,
0.035851236432790756,
-0.02978796884417534,
-0.03308430686593056,
0.10868161171674728,
0.04795171320438385,
0.05087984353303909,
-0.031095106154680252,
-0.1518801599740982,
-0.03348475694656372,
0.1564728021621704,
-0.1742827147245407,
-0.03587208315730095
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-32-finetuned-squad-seed-2

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
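
A lower-level sketch, assuming only standard `transformers` and `torch` APIs: loading the checkpoint and tokenizer directly instead of through the pipeline helper, then decoding the argmax answer span. The example strings are illustrative and not drawn from the card.

```python
# Hedged sketch: direct loading without the pipeline helper.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

name = "anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-2"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForQuestionAnswering.from_pretrained(name)

question = "What license is listed for the model?"  # illustrative
context = "The record lists an MIT license and the squad dataset."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode that span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```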
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-32-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-32-finetuned-squad-seed-2

This model is a fine-tuned version of roberta-base on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.0879976898431778,
0.09718496352434158,
-0.0026020670775324106,
0.07800676673650742,
0.13880573213100433,
0.03026161901652813,
0.09714936465024948,
0.13715897500514984,
-0.11323046684265137,
0.045300357043743134,
0.09603630751371384,
0.0769207552075386,
0.029322495684027672,
0.14477483928203583,
-0.030198918655514717,
-0.24498310685157776,
-0.007950776256620884,
-0.023030878975987434,
-0.10584819316864014,
0.11339537799358368,
0.10037600994110107,
-0.09803757816553116,
0.06681615859270096,
-0.01961318962275982,
-0.17538896203041077,
0.014800057746469975,
-0.017234131693840027,
-0.05227106064558029,
0.11694195866584778,
-0.0023401190992444754,
0.07855606079101562,
0.009777959436178207,
0.11749623715877533,
-0.19491694867610931,
0.01803014613687992,
0.07097706943750381,
0.044591497629880905,
0.09406158328056335,
0.004609425086528063,
-0.02079479768872261,
0.1205313429236412,
-0.1303708702325821,
0.09738212078809738,
0.032000888139009476,
-0.09775396436452866,
-0.20524345338344574,
-0.09714517742395401,
0.014492022804915905,
0.043263502418994904,
0.08790000528097153,
0.007516517769545317,
0.14867475628852844,
-0.10658612102270126,
0.08013064414262772,
0.2303152233362198,
-0.26820501685142517,
-0.08163132518529892,
0.04616459459066391,
0.06148263439536095,
0.08600687980651855,
-0.12144900858402252,
-0.013125579804182053,
0.010322964750230312,
0.019815785810351372,
0.10057922452688217,
-0.027992460876703262,
-0.07513955235481262,
0.012554613873362541,
-0.11203227937221527,
-0.0008169942884705961,
0.11569591611623764,
0.03776293247938156,
-0.05560990795493126,
-0.0786607414484024,
-0.03901102766394615,
-0.05830463767051697,
-0.03428403660655022,
-0.015307623893022537,
0.03612872585654259,
-0.06112112104892731,
-0.14228230714797974,
-0.04835947975516319,
-0.04971671104431152,
-0.09310474991798401,
0.0001501058432040736,
0.21179446578025818,
0.03883180394768715,
0.022582603618502617,
-0.04905633628368378,
0.10619188845157623,
0.015346970409154892,
-0.12390740215778351,
-0.03364677354693413,
-0.005600940901786089,
-0.09408851712942123,
-0.03460877761244774,
-0.056244105100631714,
0.031039441004395485,
0.03970218822360039,
0.22781065106391907,
-0.027602778747677803,
0.07297168672084808,
0.036956608295440674,
-0.012827049940824509,
-0.02690950594842434,
0.14671404659748077,
-0.02674918621778488,
-0.07688254863023758,
0.009758997708559036,
0.06377376616001129,
0.007185284048318863,
-0.00542818708345294,
-0.06259918957948685,
-0.04525468498468399,
0.06235501915216446,
0.05817786231637001,
-0.051203757524490356,
0.02569413371384144,
-0.010318666696548462,
-0.023572612553834915,
-0.0004664425796363503,
-0.11935849487781525,
0.009405076503753662,
-0.008412257768213749,
-0.07929737120866776,
-0.05436960235238075,
0.013461700640618801,
-0.0113849937915802,
0.009815394878387451,
0.08856320381164551,
-0.07262063771486282,
-0.035857222974300385,
-0.07871554791927338,
-0.07401861250400543,
-0.014751484617590904,
-0.15946358442306519,
0.021387489512562752,
-0.06950995326042175,
-0.15236611664295197,
-0.030925555154681206,
0.05073636397719383,
-0.08008063584566116,
-0.036302536725997925,
-0.0326659120619297,
-0.07966044545173645,
0.018308544531464577,
0.0008415891206823289,
0.21418200433254242,
-0.04880780354142189,
0.0958852544426918,
0.014219719916582108,
0.05456630140542984,
0.004333741497248411,
0.0371411107480526,
-0.07833240926265717,
0.014624608680605888,
-0.1746557056903839,
0.07621229439973831,
-0.08854229748249054,
0.02597525343298912,
-0.14600342512130737,
-0.08717060089111328,
-0.002068551257252693,
-0.020232925191521645,
0.08207853883504868,
0.10487663745880127,
-0.12614819407463074,
-0.022958064451813698,
0.12664388120174408,
-0.05747483670711517,
-0.05615648254752159,
0.06409560143947601,
-0.07385393977165222,
0.08348831534385681,
0.04852854833006859,
0.18753570318222046,
0.0993000715970993,
-0.10726143419742584,
0.02639137953519821,
0.01975540816783905,
0.03734106943011284,
0.0020788749679923058,
0.058214157819747925,
-0.00011981734132859856,
0.020735614001750946,
0.014948696829378605,
-0.08414351940155029,
0.015642240643501282,
-0.09270414710044861,
-0.06285279989242554,
-0.04197200760245323,
-0.0900486409664154,
0.007777315564453602,
0.007671361323446035,
0.02934929169714451,
-0.07933573424816132,
-0.08735983073711395,
0.06393187493085861,
0.14113983511924744,
-0.0455038920044899,
0.011457127518951893,
-0.07996424287557602,
0.002940727863460779,
-0.029768168926239014,
-0.022931333631277084,
-0.19551631808280945,
-0.060353875160217285,
0.02599175088107586,
0.010005318559706211,
0.04494226723909378,
-0.0054365359246730804,
0.07990909367799759,
0.02231544628739357,
-0.0498289093375206,
-0.004374540410935879,
-0.09127986431121826,
-0.010950853116810322,
-0.09077034890651703,
-0.21728156507015228,
-0.05558684468269348,
-0.03916841000318527,
0.15316298604011536,
-0.17136415839195251,
-0.00017458100046496838,
-0.019553065299987793,
0.10987138748168945,
0.041773486882448196,
-0.050404127687215805,
-0.0009647400584071875,
0.028870701789855957,
0.015224057249724865,
-0.09882792085409164,
0.034784819930791855,
0.016655143350362778,
-0.09954223036766052,
-0.024171726778149605,
-0.10214781016111374,
0.0013208715245127678,
0.07322622090578079,
0.07463277876377106,
-0.10641557723283768,
-0.018787004053592682,
-0.06288371235132217,
-0.02871253341436386,
-0.047492027282714844,
0.03483402729034424,
0.1719023436307907,
0.017146922647953033,
0.11241220682859421,
-0.0753568485379219,
-0.08597137033939362,
0.01762547343969345,
0.0076429941691458225,
0.05875176563858986,
0.11289221048355103,
0.07746949046850204,
-0.10160654038190842,
0.05751849710941315,
0.08886027336120605,
-0.04917420446872711,
0.1393367499113083,
-0.04935692250728607,
-0.0794614851474762,
-0.03160126134753227,
0.002597438869997859,
-0.0015617100289091468,
0.15071354806423187,
-0.046488206833601,
0.010751784779131413,
0.03476794809103012,
0.030170563608407974,
0.008104592561721802,
-0.163146510720253,
-0.023921886458992958,
0.020489593967795372,
-0.04939775913953781,
-0.02830849029123783,
0.011698049493134022,
0.013396845199167728,
0.09168322384357452,
0.049655381590127945,
0.0008362553780898452,
0.005928331520408392,
-0.013908937573432922,
-0.049568600952625275,
0.20067188143730164,
-0.0912884995341301,
-0.045658111572265625,
-0.08508148044347763,
0.000022739553969586268,
-0.010199597105383873,
-0.03565608710050583,
0.016497772186994553,
-0.09966515004634857,
-0.022468287497758865,
-0.06522585451602936,
0.0057371798902750015,
-0.04781264439225197,
0.011735216714441776,
0.0019414123380556703,
0.019337235018610954,
0.058107130229473114,
-0.1337350606918335,
0.014093047939240932,
-0.06256978213787079,
-0.11203033477067947,
0.03043888509273529,
0.05196375027298927,
0.08660456538200378,
0.06295111775398254,
-0.02575424313545227,
0.017429977655410767,
-0.045328766107559204,
0.23207491636276245,
-0.08531160652637482,
0.005066482350230217,
0.12454455345869064,
0.022905079647898674,
0.03779856488108635,
0.1009184792637825,
0.02748456969857216,
-0.09943496435880661,
0.04426376149058342,
0.07470934838056564,
-0.04409134387969971,
-0.25348663330078125,
0.008345047943294048,
-0.04302544519305229,
-0.08373288810253143,
0.08861428499221802,
0.0481608621776104,
-0.042091429233551025,
0.06484512239694595,
0.0033299687784165144,
0.007567217107862234,
-0.022015834227204323,
0.08756469190120697,
0.08416679501533508,
0.05511682480573654,
0.10544834285974503,
-0.04116712883114815,
-0.018805095925927162,
0.06580185145139694,
0.028296738862991333,
0.3064804673194885,
-0.04756120964884758,
0.09910792857408524,
0.05285273864865303,
0.1478373408317566,
-0.021612640470266342,
0.03686448931694031,
0.012806090526282787,
-0.004578373860567808,
-0.02900775335729122,
-0.053895749151706696,
-0.025688882917165756,
0.007309796754270792,
-0.0667908564209938,
0.04222796484827995,
-0.0563085712492466,
0.04761603847146034,
0.015791086480021477,
0.29219916462898254,
0.0033740494400262833,
-0.2632700502872467,
-0.09996748715639114,
-0.01116466149687767,
-0.03830595314502716,
-0.049489375203847885,
0.013361051678657532,
0.11993207037448883,
-0.1309528797864914,
0.024777496233582497,
-0.0656491070985794,
0.0841425359249115,
-0.02810586430132389,
-0.006386023946106434,
0.037698324769735336,
0.162503182888031,
-0.020344126969575882,
0.061394039541482925,
-0.22031551599502563,
0.2301151156425476,
0.007983415387570858,
0.1209564357995987,
-0.05355685204267502,
0.0067963614128530025,
0.023923750966787338,
-0.0009035093826241791,
0.0957910567522049,
-0.0021859901025891304,
-0.04720673710107803,
-0.1390782594680786,
-0.053537845611572266,
0.0709245577454567,
0.14061041176319122,
-0.05210753530263901,
0.10205065459012985,
-0.06007935106754303,
0.011017482727766037,
0.03686942905187607,
-0.08091909438371658,
-0.1209041029214859,
-0.10135464370250702,
-0.01982143148779869,
-0.0009746249997988343,
-0.06272122263908386,
-0.06514202058315277,
-0.0667719691991806,
0.027313290163874626,
0.11670198291540146,
-0.0009803872089833021,
-0.03476716950535774,
-0.148079976439476,
0.07420007139444351,
0.15456368029117584,
-0.06852174550294876,
0.03241504728794098,
0.002742880955338478,
0.08052054792642593,
0.035305336117744446,
-0.0777459442615509,
0.06313509494066238,
-0.06622806936502457,
-0.1802700012922287,
-0.04756302013993263,
0.10413354635238647,
0.07153350114822388,
0.04180007800459862,
-0.0055290646851062775,
0.04713885113596916,
-0.026377324014902115,
-0.09106642007827759,
0.02947995252907276,
0.03160828724503517,
0.03559334576129913,
0.04300302639603615,
-0.07740278542041779,
0.08604808896780014,
-0.04322943836450577,
-0.01954813301563263,
0.12332703918218613,
0.23187249898910522,
-0.10472529381513596,
0.09671091288328171,
0.055770792067050934,
-0.06077980995178223,
-0.16668906807899475,
0.07073689997196198,
0.10646755248308182,
0.013581359758973122,
0.0601406991481781,
-0.21627900004386902,
0.12124168127775192,
0.10144096612930298,
-0.014126632362604141,
0.03827391937375069,
-0.28019821643829346,
-0.11993472278118134,
0.04978165775537491,
0.12552706897258759,
0.08675897866487503,
-0.12471437454223633,
-0.019375666975975037,
-0.015058950521051884,
-0.12793657183647156,
0.07883212715387344,
-0.11331522464752197,
0.13132333755493164,
-0.023857850581407547,
0.11053192615509033,
0.012677903287112713,
-0.02681051939725876,
0.10825499147176743,
0.04929499328136444,
0.09569932520389557,
-0.04255438596010208,
0.0007273665978573263,
0.05949590355157852,
-0.049174241721630096,
0.00033361624809913337,
-0.06804513186216354,
0.08936171978712082,
-0.13693249225616455,
-0.007851636037230492,
-0.0879155844449997,
0.04163582623004913,
-0.04101292043924332,
-0.0660385861992836,
-0.041073936969041824,
0.056744061410427094,
0.04478093981742859,
-0.03301415219902992,
0.03901353105902672,
-0.024599438533186913,
0.10139697045087814,
0.02652580291032791,
0.08665421605110168,
0.022069159895181656,
-0.0546964593231678,
0.020932553336024284,
-0.012491190806031227,
0.06358761340379715,
-0.16927917301654816,
0.01031066756695509,
0.0979071632027626,
0.06848698109388351,
0.10163039714097977,
0.041022103279829025,
-0.04759594425559044,
0.017676537856459618,
0.028184285387396812,
-0.09659450501203537,
-0.1158694475889206,
0.03971121087670326,
-0.03700881451368332,
-0.14711986482143402,
0.04209299385547638,
0.1202925518155098,
-0.03873137757182121,
-0.031469110399484634,
-0.020200004801154137,
0.0032812999561429024,
-0.021923542022705078,
0.18007130920886993,
0.06022027134895325,
0.059942714869976044,
-0.1017753928899765,
0.12015897035598755,
0.03355509415268898,
-0.025127004832029343,
0.051615744829177856,
0.08200839161872864,
-0.10054737329483032,
-0.005377392750233412,
0.0783001258969307,
0.12570330500602722,
-0.055644501000642776,
0.0006506245699711144,
-0.10220856219530106,
-0.08708954602479935,
0.05816038325428963,
0.14260706305503845,
0.05018046498298645,
-0.01586730033159256,
-0.046481262892484665,
0.04555581137537956,
-0.13885807991027832,
0.07429569214582443,
0.03384711220860481,
0.06333201378583908,
-0.0778769999742508,
0.06582178920507431,
0.003186830785125494,
0.018433550372719765,
-0.015008926391601562,
0.0025992412120103836,
-0.09625127911567688,
-0.007876234129071236,
-0.08750410377979279,
0.0018082752358168364,
0.000002889187726395903,
0.018240993842482567,
-0.024269452318549156,
-0.07109362632036209,
-0.0445626862347126,
0.03739962726831436,
-0.08587684482336044,
-0.051075227558612823,
0.00626890454441309,
0.042440496385097504,
-0.1239885464310646,
-0.004805035889148712,
0.028872016817331314,
-0.0980328619480133,
0.09902264177799225,
0.07367440313100815,
0.021484164521098137,
0.02896447293460369,
-0.1176103875041008,
-0.03383351117372513,
-0.013256246224045753,
-0.009400594979524612,
0.059060219675302505,
-0.09991634637117386,
-0.005335207097232342,
-0.04568381607532501,
0.0627567246556282,
0.013092542067170143,
0.06382638961076736,
-0.1397523730993271,
0.014895937405526638,
-0.07034609466791153,
-0.045987945050001144,
-0.0787712037563324,
0.03975101187825203,
0.09300912916660309,
0.060494810342788696,
0.14063973724842072,
-0.07527115195989609,
0.02529139630496502,
-0.20339010655879974,
-0.0362199991941452,
-0.013272816315293312,
-0.05414048582315445,
-0.14565162360668182,
-0.047102559357881546,
0.08162381500005722,
-0.03950420022010803,
0.08922974020242691,
-0.027503708377480507,
0.07227322459220886,
0.03723080828785896,
-0.05027088150382042,
-0.030193133279681206,
-0.008581426925957203,
0.2025860846042633,
0.07147324085235596,
-0.014771531336009502,
0.10188695788383484,
-0.00028672016924247146,
0.031309738755226135,
0.043815728276968,
0.1719457358121872,
0.2217269390821457,
0.042508937418460846,
0.049602121114730835,
0.06382904201745987,
-0.07681534439325333,
-0.07096630334854126,
0.17321544885635376,
-0.01089178491383791,
0.06616055965423584,
-0.045703016221523285,
0.1941891461610794,
0.11829915642738342,
-0.16749806702136993,
0.04659229516983032,
-0.045169562101364136,
-0.08240044862031937,
-0.11934815347194672,
-0.009400426410138607,
-0.08393307775259018,
-0.12295840680599213,
0.036845315247774124,
-0.1186303198337555,
0.04587880149483681,
0.10811575502157211,
0.01406293548643589,
0.035389967262744904,
0.12476272135972977,
-0.01108783483505249,
-0.0068974569439888,
0.06771981716156006,
0.005016293376684189,
-0.009937569499015808,
-0.039763811975717545,
-0.07495737075805664,
0.05931016430258751,
0.002270527882501483,
0.08124223351478577,
-0.04603444039821625,
-0.014889060519635677,
0.03000451624393463,
-0.03074970468878746,
-0.07919563353061676,
0.024991922080516815,
0.04514507204294205,
0.05408833920955658,
0.04909124970436096,
0.042819879949092865,
-0.01104029268026352,
-0.03306415304541588,
0.32546067237854004,
-0.06717339903116226,
-0.10371829569339752,
-0.12477662414312363,
0.21704337000846863,
0.031256869435310364,
-0.026414716616272926,
0.0335274413228035,
-0.08612051606178284,
-0.0015450756764039397,
0.16594889760017395,
0.17580223083496094,
-0.06776472926139832,
-0.02006252110004425,
-0.00013401565956883132,
-0.016158225014805794,
-0.0315907746553421,
0.12534701824188232,
0.09690259397029877,
-0.012255815789103508,
-0.06153719499707222,
-0.017179420217871666,
-0.016610579565167427,
-0.033313341438770294,
-0.04167976602911949,
0.049356698989868164,
0.019705049693584442,
-0.026457451283931732,
-0.043540000915527344,
0.07609660923480988,
0.004266852512955666,
-0.25387513637542725,
0.06300891190767288,
-0.154635488986969,
-0.17671740055084229,
-0.04997099190950394,
0.030433326959609985,
0.004294283222407103,
0.056558821350336075,
-0.015417180955410004,
0.003604323137551546,
0.0890708789229393,
-0.010270983912050724,
-0.03539789095520973,
-0.11975807696580887,
0.12405355274677277,
-0.048678092658519745,
0.16895101964473724,
-0.03086242452263832,
0.04449784755706787,
0.1159232035279274,
0.030213698744773865,
-0.13439594209194183,
0.03682178631424904,
0.062452420592308044,
-0.0967227965593338,
0.01967977173626423,
0.149102583527565,
-0.047827765345573425,
0.09945399314165115,
0.046798694878816605,
-0.1084412932395935,
-0.00016229608445428312,
-0.06615227460861206,
-0.0346134752035141,
-0.0870874747633934,
-0.011419783346354961,
-0.06594914197921753,
0.16636091470718384,
0.22075428068637848,
-0.03785784915089607,
0.009194504469633102,
-0.0978061854839325,
0.012829671613872051,
0.0702529177069664,
0.03708045557141304,
-0.04889349639415741,
-0.1816476732492447,
0.004707708489149809,
0.05866874381899834,
-0.001028002006933093,
-0.25234949588775635,
-0.0736408457159996,
0.03657907992601395,
-0.028550341725349426,
-0.03284131735563278,
0.110158272087574,
0.04664801433682442,
0.05024883151054382,
-0.030842892825603485,
-0.15334196388721466,
-0.03407273069024086,
0.15564534068107605,
-0.17356078326702118,
-0.03596390038728714
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-32-finetuned-squad-seed-4
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
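### Usage sketch

The card does not include usage code; the following is a minimal inference sketch (an assumption, not part of the original card) showing how this checkpoint could be loaded for extractive question answering with the standard `transformers` pipeline. The question/context strings are placeholders for illustration.

```python
# Minimal sketch, assuming the standard transformers QA pipeline applies
# to this checkpoint; not taken from the original card.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-4",
)

# Placeholder inputs for illustration only.
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of roberta-base on the squad dataset.",
)
print(result["answer"], result["score"])
```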
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-32-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-32-finetuned-squad-seed-4
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08843725174665451,
0.09612832218408585,
-0.002557982224971056,
0.07896751910448074,
0.14009228348731995,
0.03059105947613716,
0.0975375771522522,
0.13629421591758728,
-0.11348827183246613,
0.0448499396443367,
0.09626322239637375,
0.07697127759456635,
0.028417499735951424,
0.14454975724220276,
-0.030195899307727814,
-0.24469105899333954,
-0.008438709191977978,
-0.022453708574175835,
-0.10520239174365997,
0.11335954070091248,
0.09947342425584793,
-0.09878018498420715,
0.06737486273050308,
-0.019917042925953865,
-0.1764935553073883,
0.015247734263539314,
-0.01791580393910408,
-0.052101247012615204,
0.11714305728673935,
-0.0023203508462756872,
0.0787595734000206,
0.009072591550648212,
0.11729640513658524,
-0.19318322837352753,
0.018296049907803535,
0.0710197389125824,
0.04417738690972328,
0.09370680898427963,
0.004135549068450928,
-0.020998548716306686,
0.11913049221038818,
-0.1308080404996872,
0.0972050130367279,
0.031854577362537384,
-0.09800811111927032,
-0.20598161220550537,
-0.09683314710855484,
0.013105749152600765,
0.042398612946271896,
0.0887787789106369,
0.007181915920227766,
0.14822937548160553,
-0.10666479170322418,
0.08059803396463394,
0.22874294221401215,
-0.2690609097480774,
-0.0820331946015358,
0.046615418046712875,
0.06151149794459343,
0.08633480966091156,
-0.12198691815137863,
-0.012303230352699757,
0.01078332681208849,
0.020636705681681633,
0.10088403522968292,
-0.028347766026854515,
-0.07512400299310684,
0.012970507144927979,
-0.11194560676813126,
0.0007996275089681149,
0.11653966456651688,
0.03816772252321243,
-0.05539863556623459,
-0.07886452972888947,
-0.03822359815239906,
-0.05965018272399902,
-0.034720346331596375,
-0.014342984184622765,
0.03636707365512848,
-0.062158115208148956,
-0.14284741878509521,
-0.04662764444947243,
-0.049267567694187164,
-0.09182600677013397,
0.0001601414114702493,
0.2108396291732788,
0.03881519287824631,
0.022215334698557854,
-0.0487084798514843,
0.10541831701993942,
0.015153669752180576,
-0.12374718487262726,
-0.03255084529519081,
-0.005953247658908367,
-0.0934719517827034,
-0.03469724953174591,
-0.056912437081336975,
0.03261460363864899,
0.04009080305695534,
0.22822310030460358,
-0.027797477319836617,
0.07362259179353714,
0.037524957209825516,
-0.013539697974920273,
-0.026945844292640686,
0.14579476416110992,
-0.028171025216579437,
-0.07859794050455093,
0.01002640463411808,
0.06380592286586761,
0.0066041164100170135,
-0.005417628213763237,
-0.06282034516334534,
-0.04442789778113365,
0.06124483793973923,
0.05773404985666275,
-0.05246745049953461,
0.027543745934963226,
-0.009328571148216724,
-0.02305573597550392,
-0.000448803388280794,
-0.11925201863050461,
0.008984053507447243,
-0.00876997783780098,
-0.07944945991039276,
-0.05338755622506142,
0.012887605465948582,
-0.012223401106894016,
0.00955134816467762,
0.08888442069292068,
-0.0735585018992424,
-0.036521852016448975,
-0.07923220843076706,
-0.07498503476381302,
-0.015271036885678768,
-0.15941251814365387,
0.02249874733388424,
-0.06924127787351608,
-0.15262511372566223,
-0.032060276716947556,
0.05047203227877617,
-0.07988766580820084,
-0.03525787591934204,
-0.03263164311647415,
-0.08054935932159424,
0.01788146235048771,
0.000977297080680728,
0.2150397151708603,
-0.04917115718126297,
0.09477878361940384,
0.015607540495693684,
0.05463027209043503,
0.0037326891906559467,
0.03719688206911087,
-0.07799288630485535,
0.013905011117458344,
-0.17531625926494598,
0.0753750428557396,
-0.08921191096305847,
0.028049379587173462,
-0.14488404989242554,
-0.08825654536485672,
-0.0008974570664577186,
-0.01925935596227646,
0.08168753236532211,
0.10437178611755371,
-0.12567387521266937,
-0.02293856255710125,
0.12583094835281372,
-0.05602763965725899,
-0.05635533481836319,
0.06407690793275833,
-0.07417400181293488,
0.0824408009648323,
0.04904233664274216,
0.18775734305381775,
0.09903539717197418,
-0.10678036510944366,
0.02555631473660469,
0.01900106482207775,
0.037619613111019135,
0.0014018254587426782,
0.05659151077270508,
0.0008825481636449695,
0.020736202597618103,
0.015533569268882275,
-0.08273591101169586,
0.015615255571901798,
-0.09239207953214645,
-0.06240073963999748,
-0.041377052664756775,
-0.09003731608390808,
0.006976606324315071,
0.007854669354856014,
0.02928219735622406,
-0.08046688139438629,
-0.08759407699108124,
0.06306060403585434,
0.1408974975347519,
-0.04598333314061165,
0.010303795337677002,
-0.08019936829805374,
0.0035848652478307486,
-0.029940377920866013,
-0.02250824309885502,
-0.19519907236099243,
-0.06114516779780388,
0.025023095309734344,
0.012186233885586262,
0.045041803270578384,
-0.00507794925943017,
0.08041027188301086,
0.02265462838113308,
-0.05009789019823074,
-0.004541044123470783,
-0.08992861956357956,
-0.010525018908083439,
-0.09065378457307816,
-0.2178008109331131,
-0.055656448006629944,
-0.03900134190917015,
0.15314319729804993,
-0.1719789057970047,
0.00013292963558342308,
-0.01953655295073986,
0.10925038158893585,
0.04128841310739517,
-0.04979992285370827,
-0.0003737152146641165,
0.030244499444961548,
0.016035111621022224,
-0.09882959723472595,
0.03527186065912247,
0.016343751922249794,
-0.09832806140184402,
-0.025544771924614906,
-0.10244280844926834,
0.002884311368688941,
0.0743483453989029,
0.07349520921707153,
-0.10694729536771774,
-0.018342340365052223,
-0.06240974739193916,
-0.028607847169041634,
-0.0456865057349205,
0.034818630665540695,
0.17302580177783966,
0.016440151259303093,
0.11239893734455109,
-0.07474883645772934,
-0.08559882640838623,
0.01777784898877144,
0.008770703338086605,
0.06000197306275368,
0.11289594322443008,
0.07654249668121338,
-0.10000810027122498,
0.05775858461856842,
0.08832062780857086,
-0.04955872893333435,
0.13978229463100433,
-0.04958757758140564,
-0.07881738245487213,
-0.03111894056200981,
0.0014972827630117536,
-0.002046325709670782,
0.151463121175766,
-0.046877890825271606,
0.009710604324936867,
0.03439953550696373,
0.029644623398780823,
0.008361236192286015,
-0.1622411161661148,
-0.02390579879283905,
0.019724814221262932,
-0.04840734228491783,
-0.028828084468841553,
0.012364927679300308,
0.012806539423763752,
0.09128206223249435,
0.049512460827827454,
-0.0004462658835109323,
0.006049284245818853,
-0.013693277724087238,
-0.04861925169825554,
0.20105044543743134,
-0.09163990616798401,
-0.044397760182619095,
-0.08475003391504288,
-0.0021687892731279135,
-0.011261803098022938,
-0.03621317073702812,
0.016242241486907005,
-0.10099012404680252,
-0.022883543744683266,
-0.06485647708177567,
0.004832015372812748,
-0.04789864644408226,
0.011636973358690739,
0.0016674516955390573,
0.01871366612613201,
0.05776585265994072,
-0.1335844248533249,
0.014253654517233372,
-0.06289950013160706,
-0.11200100183486938,
0.030913928523659706,
0.05278034880757332,
0.08729559183120728,
0.062281228601932526,
-0.025837119668722153,
0.017463941127061844,
-0.0452239066362381,
0.2331276535987854,
-0.08536002784967422,
0.005191906820982695,
0.12433163076639175,
0.023738885298371315,
0.03684170916676521,
0.10082361102104187,
0.028495341539382935,
-0.0999467596411705,
0.043923188000917435,
0.07518433034420013,
-0.043607328087091446,
-0.2534596025943756,
0.008883918635547161,
-0.043398331850767136,
-0.08405642956495285,
0.0882520005106926,
0.048082996159791946,
-0.04150573909282684,
0.06545914709568024,
0.004308260511606932,
0.008513625711202621,
-0.0229803379625082,
0.08734691888093948,
0.08606039732694626,
0.05459693819284439,
0.10622789710760117,
-0.0415164977312088,
-0.01916087605059147,
0.06507692486047745,
0.028474675491452217,
0.3073534071445465,
-0.047602057456970215,
0.09892413765192032,
0.05332992970943451,
0.14750023186206818,
-0.021130649372935295,
0.03742561489343643,
0.012270990759134293,
-0.00530243618413806,
-0.02901308611035347,
-0.05367204174399376,
-0.024305468425154686,
0.006171341985464096,
-0.06843850016593933,
0.04201700910925865,
-0.056301236152648926,
0.04622791334986687,
0.01601908542215824,
0.2915027141571045,
0.0027832265477627516,
-0.2655085325241089,
-0.1004113033413887,
-0.011997788213193417,
-0.038013994693756104,
-0.04893215000629425,
0.013696499168872833,
0.11916212737560272,
-0.13037605583667755,
0.02539549395442009,
-0.06552108377218246,
0.08399287611246109,
-0.02821187488734722,
-0.005312229041010141,
0.039527926594018936,
0.16416673362255096,
-0.02125893346965313,
0.06085015833377838,
-0.22103342413902283,
0.22903746366500854,
0.008252746425569057,
0.12162914872169495,
-0.053568121045827866,
0.0065852440893650055,
0.024749787524342537,
0.0004785115597769618,
0.09528280049562454,
-0.0028311561327427626,
-0.04689442738890648,
-0.1390378177165985,
-0.05256718769669533,
0.07197131961584091,
0.14000679552555084,
-0.05083558335900307,
0.10273659229278564,
-0.05919664725661278,
0.010524813085794449,
0.036756619811058044,
-0.08212514966726303,
-0.12122766673564911,
-0.10191305726766586,
-0.020765384659171104,
0.0005425070412456989,
-0.06266197562217712,
-0.06429067254066467,
-0.06720346212387085,
0.02525566890835762,
0.11580958962440491,
0.0008829626603983343,
-0.034804023802280426,
-0.14843320846557617,
0.07343725860118866,
0.15462380647659302,
-0.06760654598474503,
0.03256520256400108,
0.0031240435782819986,
0.0792953222990036,
0.036316417157649994,
-0.07750044018030167,
0.0632682666182518,
-0.0668274536728859,
-0.17905402183532715,
-0.04761544242501259,
0.10323850065469742,
0.07118124514818192,
0.040974367409944534,
-0.005991905461996794,
0.04712127149105072,
-0.02732919342815876,
-0.09169322997331619,
0.029637260362505913,
0.0298408605158329,
0.03667472302913666,
0.04283788427710533,
-0.07837862521409988,
0.08608680218458176,
-0.04268690198659897,
-0.01870475336909294,
0.12189633399248123,
0.2296982705593109,
-0.10418146103620529,
0.09453270584344864,
0.055799778550863266,
-0.060314878821372986,
-0.16608062386512756,
0.07212948054075241,
0.10504552721977234,
0.013904502615332603,
0.0586591511964798,
-0.216910257935524,
0.12247567623853683,
0.10092068463563919,
-0.013051627203822136,
0.04027713090181351,
-0.27745819091796875,
-0.11919115483760834,
0.05004312843084335,
0.126220241189003,
0.08808401972055435,
-0.12520171701908112,
-0.01859399862587452,
-0.015277768485248089,
-0.1276179552078247,
0.07788555324077606,
-0.11578702926635742,
0.13155655562877655,
-0.024351149797439575,
0.11132225394248962,
0.011893579736351967,
-0.026843661442399025,
0.10743898153305054,
0.05028621107339859,
0.096908800303936,
-0.04286254942417145,
0.0017846293048933148,
0.059427037835121155,
-0.048411447554826736,
0.000004148526841163402,
-0.0689903125166893,
0.08886831998825073,
-0.13710083067417145,
-0.007683999370783567,
-0.088404081761837,
0.041575878858566284,
-0.04090281203389168,
-0.06566743552684784,
-0.04043911024928093,
0.05674009397625923,
0.043685432523489,
-0.03323013707995415,
0.037847571074962616,
-0.025288263335824013,
0.10174928605556488,
0.02510548010468483,
0.08717626333236694,
0.020568855106830597,
-0.05517401918768883,
0.022138621658086777,
-0.013059111312031746,
0.06347563862800598,
-0.1689758598804474,
0.008912023156881332,
0.09845297783613205,
0.0677175372838974,
0.10118669271469116,
0.04170701280236244,
-0.04661881923675537,
0.017045531421899796,
0.028923822566866875,
-0.09695276618003845,
-0.11468957364559174,
0.040009550750255585,
-0.04049297422170639,
-0.14656661450862885,
0.04348750039935112,
0.12009115517139435,
-0.038564372807741165,
-0.0312308631837368,
-0.020270485430955887,
0.0025330057833343744,
-0.021976912394165993,
0.1807812750339508,
0.06085314229130745,
0.0594375804066658,
-0.10282335430383682,
0.1197650209069252,
0.03341573849320412,
-0.02541504241526127,
0.051576048135757446,
0.08305861055850983,
-0.10144401341676712,
-0.006221548654139042,
0.07752779871225357,
0.127714604139328,
-0.05515923351049423,
-0.00002753112312348094,
-0.10340999811887741,
-0.08731724321842194,
0.05813044309616089,
0.14216269552707672,
0.05007334426045418,
-0.01712176576256752,
-0.04671395570039749,
0.045084185898303986,
-0.13963595032691956,
0.07396192848682404,
0.0328061506152153,
0.06388155370950699,
-0.07787995785474777,
0.06505487114191055,
0.003059371840208769,
0.017779476940631866,
-0.014921565540134907,
0.0032128063030540943,
-0.09677686542272568,
-0.008062980137765408,
-0.08759880065917969,
0.0009718751534819603,
0.00003504600317683071,
0.018835095688700676,
-0.024580093100667,
-0.07008258253335953,
-0.04519181698560715,
0.03764202445745468,
-0.08600468933582306,
-0.050745777785778046,
0.008168132044374943,
0.04295588284730911,
-0.123142309486866,
-0.004488672595471144,
0.027639469131827354,
-0.09766705334186554,
0.09878253936767578,
0.0734136700630188,
0.021137123927474022,
0.02910442464053631,
-0.116468146443367,
-0.034338485449552536,
-0.013768261298537254,
-0.0100805489346385,
0.05983167514204979,
-0.09899390488862991,
-0.004897209350019693,
-0.0459117516875267,
0.06349288672208786,
0.013229651376605034,
0.06297528743743896,
-0.13916856050491333,
0.015323988161981106,
-0.06949715316295624,
-0.04460657015442848,
-0.07936539500951767,
0.039973411709070206,
0.09379862248897552,
0.05998528376221657,
0.14057420194149017,
-0.07523386180400848,
0.024538176134228706,
-0.20335270464420319,
-0.036607835441827774,
-0.01350303366780281,
-0.055777549743652344,
-0.1452852189540863,
-0.04710948094725609,
0.08225822448730469,
-0.0402316190302372,
0.09109203517436981,
-0.027495183050632477,
0.07232614606618881,
0.036750320345163345,
-0.05005843937397003,
-0.0313422828912735,
-0.008285488933324814,
0.20228512585163116,
0.07084402441978455,
-0.015291010960936546,
0.10168168693780899,
0.0009928509825840592,
0.03073311224579811,
0.044915858656167984,
0.17018988728523254,
0.22119516134262085,
0.043201811611652374,
0.049678437411785126,
0.06418552249670029,
-0.07737093418836594,
-0.06947880238294601,
0.17426703870296478,
-0.0104440962895751,
0.0668618381023407,
-0.04651820287108421,
0.19280341267585754,
0.11739438772201538,
-0.16638301312923431,
0.04674206301569939,
-0.04602187126874924,
-0.08249367028474808,
-0.11822760105133057,
-0.0078369015827775,
-0.08344090729951859,
-0.12345033884048462,
0.036977604031562805,
-0.11916794627904892,
0.045082785189151764,
0.10926307737827301,
0.014616642147302628,
0.03474627807736397,
0.12621942162513733,
-0.009994006715714931,
-0.006219435483217239,
0.06796713918447495,
0.004551923368126154,
-0.010372036136686802,
-0.039686284959316254,
-0.07458533346652985,
0.05864992365241051,
0.0008078760001808405,
0.08058005571365356,
-0.04669652506709099,
-0.015372976660728455,
0.030647173523902893,
-0.030316321179270744,
-0.07881137728691101,
0.024922387674450874,
0.04580298811197281,
0.05391785129904747,
0.050031837075948715,
0.04235148802399635,
-0.01185053400695324,
-0.03317929431796074,
0.32438233494758606,
-0.06710174679756165,
-0.10262488573789597,
-0.12512049078941345,
0.2166324108839035,
0.031025921925902367,
-0.027072584256529808,
0.03239494189620018,
-0.08512371778488159,
-0.0004578018852043897,
0.16722598671913147,
0.17762599885463715,
-0.06806100159883499,
-0.020435085520148277,
0.00008089662878774107,
-0.016255710273981094,
-0.03175836801528931,
0.1257488876581192,
0.09736081212759018,
-0.013577328063547611,
-0.06145930662751198,
-0.017152365297079086,
-0.016158493235707283,
-0.03293439373373985,
-0.041922423988580704,
0.048667363822460175,
0.020782597362995148,
-0.026903266087174416,
-0.042748648673295975,
0.07597681134939194,
0.00399530865252018,
-0.25386741757392883,
0.06363601237535477,
-0.15416626632213593,
-0.17710477113723755,
-0.05093511939048767,
0.029968779534101486,
0.0050774915143847466,
0.056948620826005936,
-0.015447426587343216,
0.003451922442764044,
0.08972670137882233,
-0.010762788355350494,
-0.03443284332752228,
-0.1204977035522461,
0.12398770451545715,
-0.049167606979608536,
0.16779100894927979,
-0.031345538794994354,
0.04435598850250244,
0.11659975349903107,
0.029836444184184074,
-0.134189635515213,
0.03733875975012779,
0.061731625348329544,
-0.09732876718044281,
0.020275874063372612,
0.14814306795597076,
-0.047588083893060684,
0.09864719957113266,
0.0455971397459507,
-0.10737992823123932,
0.0005432441830635071,
-0.06588766723871231,
-0.03443165495991707,
-0.08704662322998047,
-0.011437208391726017,
-0.0658266618847847,
0.1668688803911209,
0.22208112478256226,
-0.03778131306171417,
0.009162978269159794,
-0.09767719358205795,
0.012752228416502476,
0.07081875205039978,
0.03545556217432022,
-0.04971013963222504,
-0.1817074418067932,
0.005020522512495518,
0.05954338610172272,
-0.0015342208789661527,
-0.25324389338493347,
-0.07239488512277603,
0.03636655583977699,
-0.028354478999972343,
-0.03313075751066208,
0.10994939506053925,
0.04757631942629814,
0.05115566775202751,
-0.031214825809001923,
-0.1510341912508011,
-0.03417595475912094,
0.1554444134235382,
-0.17372852563858032,
-0.03655309975147247
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-32-finetuned-squad-seed-6
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
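### Hyperparameter sketch

The hyperparameters listed above can be expressed as a `TrainingArguments` object. This is a hedged reconstruction, not the authors' script: the original training code is not part of the card, and `output_dir` is a placeholder.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# The original training script is not in the card; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-32-finetuned-squad-seed-6",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```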
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-32-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-32-finetuned-squad-seed-6
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08843699842691422,
0.09544404596090317,
-0.002517079468816519,
0.07874304801225662,
0.14037106931209564,
0.030055314302444458,
0.0972435250878334,
0.13649006187915802,
-0.1127064973115921,
0.04514065384864807,
0.09562571346759796,
0.07775098085403442,
0.028897788375616074,
0.14418336749076843,
-0.030119040980935097,
-0.2443312257528305,
-0.007935840636491776,
-0.022191526368260384,
-0.10495684295892715,
0.11337257921695709,
0.09951993823051453,
-0.09898421168327332,
0.066706083714962,
-0.020351119339466095,
-0.1768132746219635,
0.015565712936222553,
-0.01799716241657734,
-0.051708269864320755,
0.11696847528219223,
-0.002744469093158841,
0.07853132486343384,
0.00934427697211504,
0.11710384488105774,
-0.19348639249801636,
0.018300065770745277,
0.07146940380334854,
0.04440363869071007,
0.0937955230474472,
0.00501444423571229,
-0.020589260384440422,
0.12051303684711456,
-0.13019531965255737,
0.09689510613679886,
0.032281890511512756,
-0.097980797290802,
-0.20430348813533783,
-0.09727045148611069,
0.012214625254273415,
0.04282744228839874,
0.08949317783117294,
0.006681571248918772,
0.14918003976345062,
-0.10732632875442505,
0.08062055706977844,
0.2302507907152176,
-0.2675956189632416,
-0.08198506385087967,
0.04749981313943863,
0.0613541305065155,
0.08549152314662933,
-0.12279009073972702,
-0.01342371478676796,
0.010780702345073223,
0.020325900986790657,
0.09993720054626465,
-0.027792232111096382,
-0.07648245245218277,
0.012558589689433575,
-0.11226540803909302,
0.0005034273490309715,
0.1153247207403183,
0.0382721908390522,
-0.05488666892051697,
-0.07840727269649506,
-0.0387040376663208,
-0.058996569365262985,
-0.034482620656490326,
-0.014234081842005253,
0.036476343870162964,
-0.06198347359895706,
-0.14240257441997528,
-0.046361155807971954,
-0.0492406003177166,
-0.09294962882995605,
0.00022432846890296787,
0.21056848764419556,
0.03878689557313919,
0.02211199700832367,
-0.049465250223875046,
0.1053517535328865,
0.015276649966835976,
-0.12398819625377655,
-0.03342283517122269,
-0.0053084599785506725,
-0.0936426967382431,
-0.034772224724292755,
-0.05677175149321556,
0.031203141435980797,
0.03949110954999924,
0.22613084316253662,
-0.02769695594906807,
0.07388971745967865,
0.036937009543180466,
-0.013396348804235458,
-0.027609093114733696,
0.1460065245628357,
-0.027233105152845383,
-0.07648079842329025,
0.00921241007745266,
0.06395317614078522,
0.006183465011417866,
-0.005012911278754473,
-0.06225244328379631,
-0.04465527832508087,
0.061784777790308,
0.05721786245703697,
-0.05217638239264488,
0.027118096128106117,
-0.009823290631175041,
-0.023268692195415497,
0.00010752339585451409,
-0.11907167732715607,
0.008860546164214611,
-0.009180405177175999,
-0.07985153049230576,
-0.05429598316550255,
0.012952777557075024,
-0.012513846158981323,
0.009384037926793098,
0.089134581387043,
-0.07408027350902557,
-0.03640850633382797,
-0.08012411743402481,
-0.07477101683616638,
-0.015032944269478321,
-0.16098171472549438,
0.02222151681780815,
-0.0684180036187172,
-0.15283015370368958,
-0.03214956447482109,
0.05008891969919205,
-0.07990074902772903,
-0.03489059582352638,
-0.03317370265722275,
-0.08096346259117126,
0.0181623212993145,
0.0013049861881881952,
0.2159283459186554,
-0.04883387312293053,
0.09553223103284836,
0.01550988294184208,
0.05450563505291939,
0.004183790180832148,
0.03783616051077843,
-0.07905778288841248,
0.013697922229766846,
-0.17471641302108765,
0.07556967437267303,
-0.08988378196954727,
0.0277653057128191,
-0.1457080990076065,
-0.08759898692369461,
-0.00226980890147388,
-0.019781412556767464,
0.08249904215335846,
0.10505812615156174,
-0.1258888989686966,
-0.022642510011792183,
0.12615276873111725,
-0.056924231350421906,
-0.05623016133904457,
0.06305603682994843,
-0.07392130047082901,
0.08230036497116089,
0.048552706837654114,
0.18798716366291046,
0.09863683581352234,
-0.10618358850479126,
0.024402258917689323,
0.018615858629345894,
0.03813820704817772,
0.000745172263123095,
0.056498195976018906,
0.000753836240619421,
0.021462904289364815,
0.015507608652114868,
-0.08286668360233307,
0.015471586026251316,
-0.09250839054584503,
-0.06171765178442001,
-0.04204120859503746,
-0.09013937413692474,
0.006659332197159529,
0.009045221842825413,
0.0291223656386137,
-0.0799887627363205,
-0.08687818050384521,
0.06316792219877243,
0.14082778990268707,
-0.04535254091024399,
0.010448237881064415,
-0.07965613156557083,
0.0023308000527322292,
-0.030776502564549446,
-0.022528190165758133,
-0.19642138481140137,
-0.061571959406137466,
0.025625668466091156,
0.011719527654349804,
0.04504575580358505,
-0.005125370342284441,
0.08027464896440506,
0.021670952439308167,
-0.05009216442704201,
-0.004611919168382883,
-0.0907294824719429,
-0.011163419112563133,
-0.09104696661233902,
-0.21753984689712524,
-0.05579035356640816,
-0.039553653448820114,
0.15115267038345337,
-0.17135144770145416,
-0.0001709935604594648,
-0.019529933109879494,
0.10951316356658936,
0.041239820420742035,
-0.050270725041627884,
-0.0003836150572169572,
0.029742196202278137,
0.01570993661880493,
-0.09911098331212997,
0.03518086299300194,
0.015689479187130928,
-0.09797324985265732,
-0.02535487525165081,
-0.102598175406456,
0.0012787478044629097,
0.07401705533266068,
0.07488232851028442,
-0.1070210263133049,
-0.0183111522346735,
-0.062480561435222626,
-0.028912069275975227,
-0.046741239726543427,
0.035727955400943756,
0.17247970402240753,
0.01664714328944683,
0.11185760796070099,
-0.07519649714231491,
-0.0856982097029686,
0.018378959968686104,
0.00882041361182928,
0.05947467312216759,
0.11373438686132431,
0.0782490149140358,
-0.10167497396469116,
0.058338746428489685,
0.08889809250831604,
-0.04892827197909355,
0.14018896222114563,
-0.04975683242082596,
-0.07929421961307526,
-0.030019229277968407,
0.002051795832812786,
-0.0019055213779211044,
0.15103739500045776,
-0.046534039080142975,
0.01018158346414566,
0.034279514104127884,
0.030139943584799767,
0.008401293307542801,
-0.16254951059818268,
-0.024022113531827927,
0.019575655460357666,
-0.048356469720602036,
-0.029351545497775078,
0.012328016571700573,
0.012748275883495808,
0.09150976687669754,
0.04907470569014549,
-0.00019140623044222593,
0.005559076555073261,
-0.013735895976424217,
-0.04863825812935829,
0.20195409655570984,
-0.0912800207734108,
-0.043356288224458694,
-0.08380727469921112,
-0.0013586736749857664,
-0.011011007241904736,
-0.03619746118783951,
0.016357416287064552,
-0.10206594318151474,
-0.022792601957917213,
-0.06483937799930573,
0.004932967014610767,
-0.0478455014526844,
0.010898709297180176,
0.00097063829889521,
0.018948940560221672,
0.057174984365701675,
-0.1342637836933136,
0.014284402132034302,
-0.06350106000900269,
-0.11250892281532288,
0.030710510909557343,
0.05213557556271553,
0.08699643611907959,
0.06256251037120819,
-0.026119401678442955,
0.017292218282818794,
-0.04532152786850929,
0.23210473358631134,
-0.08552765846252441,
0.004690759815275669,
0.12460923939943314,
0.024820121005177498,
0.037388451397418976,
0.10104912519454956,
0.02777409739792347,
-0.09998708963394165,
0.0445394404232502,
0.07556159049272537,
-0.044001586735248566,
-0.2545084059238434,
0.008780557662248611,
-0.04358748346567154,
-0.08452807366847992,
0.0884704440832138,
0.048342954367399216,
-0.041801635175943375,
0.06558685004711151,
0.0035003856755793095,
0.007454142905771732,
-0.023231875151395798,
0.08763186633586884,
0.08429073542356491,
0.05537886545062065,
0.10617117583751678,
-0.04154890775680542,
-0.018627207726240158,
0.06435605138540268,
0.02865973673760891,
0.3085970878601074,
-0.047810252755880356,
0.09891177713871002,
0.05303072929382324,
0.14760084450244904,
-0.0214239489287138,
0.038088567554950714,
0.01207063626497984,
-0.005448387004435062,
-0.02900957129895687,
-0.053473275154829025,
-0.024670492857694626,
0.005978001747280359,
-0.06913230568170547,
0.04236733540892601,
-0.05633952096104622,
0.04663464426994324,
0.015148484148085117,
0.2920225262641907,
0.0029255191329866648,
-0.2647104859352112,
-0.09983084350824356,
-0.012291780672967434,
-0.038138147443532944,
-0.04963834583759308,
0.013459058478474617,
0.118961863219738,
-0.1299193799495697,
0.024796949699521065,
-0.06581025570631027,
0.08470289409160614,
-0.026951342821121216,
-0.005908529739826918,
0.03861430287361145,
0.1643192321062088,
-0.021106218919157982,
0.06145806983113289,
-0.22176994383335114,
0.23106476664543152,
0.00815577618777752,
0.12144992500543594,
-0.05382617563009262,
0.006459476891905069,
0.02404589019715786,
-0.0013216349761933088,
0.09555618464946747,
-0.0027185664512217045,
-0.04752637445926666,
-0.13875268399715424,
-0.05185674503445625,
0.07175786793231964,
0.14070288836956024,
-0.05116273835301399,
0.10213980823755264,
-0.05930519849061966,
0.010767667554318905,
0.037485722452402115,
-0.08183418214321136,
-0.12130863219499588,
-0.10193698853254318,
-0.020485106855630875,
-0.00005579256685450673,
-0.06353279203176498,
-0.06419210135936737,
-0.0669684112071991,
0.026638396084308624,
0.1154748946428299,
0.0009834904922172427,
-0.034309271723032,
-0.14829610288143158,
0.0741313025355339,
0.15482376515865326,
-0.06806084513664246,
0.032363418489694595,
0.002711283741518855,
0.0790560245513916,
0.035470906645059586,
-0.07788293063640594,
0.06411757320165634,
-0.06674043834209442,
-0.17941533029079437,
-0.0475924089550972,
0.1028376966714859,
0.07160337269306183,
0.041295524686574936,
-0.006103952415287495,
0.04744233191013336,
-0.027247745543718338,
-0.09147216379642487,
0.029838817194104195,
0.02993389405310154,
0.03648314252495766,
0.043374527245759964,
-0.07792741805315018,
0.08491984009742737,
-0.04333191365003586,
-0.019243545830249786,
0.1214771494269371,
0.23059190809726715,
-0.1040443405508995,
0.09463132917881012,
0.05632487311959267,
-0.060611095279455185,
-0.16664087772369385,
0.07273612171411514,
0.10566117614507675,
0.01342983078211546,
0.059420742094516754,
-0.21681001782417297,
0.12253279983997345,
0.1009102538228035,
-0.013395338319242,
0.03939933329820633,
-0.27812835574150085,
-0.1191755011677742,
0.04990338161587715,
0.12615124881267548,
0.08694697171449661,
-0.12489522993564606,
-0.018459733575582504,
-0.014928067103028297,
-0.1270270198583603,
0.07856699079275131,
-0.11406905949115753,
0.13189882040023804,
-0.024707958102226257,
0.11139342188835144,
0.012036403641104698,
-0.02724057249724865,
0.10631435364484787,
0.05051429197192192,
0.09715785086154938,
-0.04252960905432701,
0.0014249465893954039,
0.05964907258749008,
-0.04854608327150345,
0.00012467970373108983,
-0.0690426304936409,
0.08942167460918427,
-0.13610075414180756,
-0.007325637154281139,
-0.08899924904108047,
0.04195544123649597,
-0.040780209004879,
-0.06560639292001724,
-0.04053560271859169,
0.05699148029088974,
0.04408913850784302,
-0.03346027061343193,
0.03885899484157562,
-0.025111597031354904,
0.10318827629089355,
0.025436609983444214,
0.08747643977403641,
0.02102035842835903,
-0.054310575127601624,
0.021492544561624527,
-0.012271474115550518,
0.06364519894123077,
-0.16982749104499817,
0.00863697100430727,
0.0983639806509018,
0.0685688853263855,
0.10100347548723221,
0.042216137051582336,
-0.04723525792360306,
0.017134495079517365,
0.028101162984967232,
-0.09624050557613373,
-0.11538317054510117,
0.04025213420391083,
-0.036852918565273285,
-0.14717110991477966,
0.04378661885857582,
0.11869388818740845,
-0.03939446434378624,
-0.031226765364408493,
-0.020214976742863655,
0.0028738749679178,
-0.021329781040549278,
0.18143336474895477,
0.06077340990304947,
0.06000159680843353,
-0.10269390046596527,
0.12020363658666611,
0.03314460813999176,
-0.026237212121486664,
0.05158664658665657,
0.08274126052856445,
-0.1008930429816246,
-0.006100249011069536,
0.0788213461637497,
0.12643346190452576,
-0.055094391107559204,
0.001260525779798627,
-0.1024758368730545,
-0.08704819530248642,
0.05837356299161911,
0.14342370629310608,
0.05002042278647423,
-0.01689258962869644,
-0.046514157205820084,
0.04534132778644562,
-0.14013728499412537,
0.0739779844880104,
0.03234977647662163,
0.06407927721738815,
-0.07746369391679764,
0.06367481499910355,
0.0032496508210897446,
0.01798238977789879,
-0.0147391427308321,
0.003507450921460986,
-0.09648805856704712,
-0.008112248033285141,
-0.0854322612285614,
0.0008668474620208144,
0.00021664171072188765,
0.018282825127243996,
-0.025003446266055107,
-0.07043787837028503,
-0.0443945936858654,
0.037705931812524796,
-0.08634165674448013,
-0.05109314247965813,
0.007532937917858362,
0.04261596128344536,
-0.12346361577510834,
-0.00437504705041647,
0.027947288006544113,
-0.0972244068980217,
0.09787047654390335,
0.07299867272377014,
0.02142825908958912,
0.029487086459994316,
-0.11867762356996536,
-0.03405848518013954,
-0.013505886308848858,
-0.009921880438923836,
0.059636496007442474,
-0.09826305508613586,
-0.005177868530154228,
-0.045908425003290176,
0.06363231688737869,
0.01285617146641016,
0.062153901904821396,
-0.13976770639419556,
0.014643983915448189,
-0.07057411223649979,
-0.045248303562402725,
-0.07881765067577362,
0.04032943397760391,
0.09401945024728775,
0.06023179739713669,
0.14040729403495789,
-0.07499255239963531,
0.025117889046669006,
-0.20359420776367188,
-0.036490313708782196,
-0.013330573216080666,
-0.05575188621878624,
-0.14580421149730682,
-0.046528395265340805,
0.08247175067663193,
-0.04006115347146988,
0.08872228860855103,
-0.027074120938777924,
0.07287033647298813,
0.03685986250638962,
-0.04968481883406639,
-0.031470488756895065,
-0.008501662872731686,
0.20229415595531464,
0.07080277800559998,
-0.014766364358365536,
0.10336042195558548,
0.0011553249787539244,
0.03049565851688385,
0.04622720181941986,
0.17185115814208984,
0.22213585674762726,
0.041945334523916245,
0.04991555213928223,
0.0641031563282013,
-0.07764291763305664,
-0.06936671584844589,
0.17417897284030914,
-0.010045601986348629,
0.06656210869550705,
-0.04658569023013115,
0.1933811753988266,
0.11743339896202087,
-0.16617019474506378,
0.047269973903894424,
-0.04635947197675705,
-0.08225101977586746,
-0.11828555911779404,
-0.007877455092966557,
-0.08343925327062607,
-0.12344437092542648,
0.03735871985554695,
-0.11944101005792618,
0.045632198452949524,
0.10983701795339584,
0.014438485726714134,
0.03493388742208481,
0.12696892023086548,
-0.009471521712839603,
-0.006315527483820915,
0.06818989664316177,
0.004395836032927036,
-0.010612421669065952,
-0.03989808261394501,
-0.07440576702356339,
0.05992646887898445,
0.0008099148981273174,
0.08071703463792801,
-0.0465581901371479,
-0.016191665083169937,
0.030332326889038086,
-0.030167950317263603,
-0.07899525761604309,
0.025045614689588547,
0.04594586789608002,
0.0538642555475235,
0.05046011134982109,
0.04225354269146919,
-0.011220023036003113,
-0.033270932734012604,
0.32576239109039307,
-0.06736640632152557,
-0.10223957151174545,
-0.12400991469621658,
0.21754081547260284,
0.03205016255378723,
-0.02692732959985733,
0.03234200179576874,
-0.0857575535774231,
-0.0007866788073442876,
0.16577686369419098,
0.17568454146385193,
-0.0670134499669075,
-0.020192284137010574,
-0.0002320458006579429,
-0.016213703900575638,
-0.03159087523818016,
0.12575672566890717,
0.09768746048212051,
-0.012981262058019638,
-0.061801519244909286,
-0.01694480888545513,
-0.01617581769824028,
-0.03350016474723816,
-0.04094896838068962,
0.04850298538804054,
0.021127140149474144,
-0.027180328965187073,
-0.042916249483823776,
0.07602863013744354,
0.0040382626466453075,
-0.2533597946166992,
0.06295288354158401,
-0.1550413966178894,
-0.1765722781419754,
-0.050888024270534515,
0.029718894511461258,
0.005081530660390854,
0.05705815181136131,
-0.01558011770248413,
0.004096664022654295,
0.08761408925056458,
-0.010657376609742641,
-0.03486694395542145,
-0.12135346978902817,
0.12344875931739807,
-0.04964999109506607,
0.16806338727474213,
-0.0314546637237072,
0.043626196682453156,
0.11644309014081955,
0.030083931982517242,
-0.13479182124137878,
0.037341248244047165,
0.061822857707738876,
-0.09765145927667618,
0.020127970725297928,
0.14894694089889526,
-0.047505397349596024,
0.09886787086725235,
0.045315101742744446,
-0.1091127097606659,
0.0009426118922419846,
-0.06661784648895264,
-0.03369516879320145,
-0.08756396919488907,
-0.00997872930020094,
-0.0657370537519455,
0.16699565947055817,
0.22236165404319763,
-0.03759389743208885,
0.008672795258462429,
-0.0980934351682663,
0.012500466778874397,
0.0699242502450943,
0.03636903688311577,
-0.049476731568574905,
-0.18161405622959137,
0.005215798504650593,
0.05950938165187836,
-0.0018065419280901551,
-0.2532327473163605,
-0.07200147211551666,
0.036526553332805634,
-0.029419779777526855,
-0.03313320875167847,
0.109281025826931,
0.04742247238755226,
0.05070851743221283,
-0.031293079257011414,
-0.15287154912948608,
-0.033907078206539154,
0.1559305340051651,
-0.17384400963783264,
-0.03631192818284035
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-32-finetuned-squad-seed-8
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
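### Few-shot sampling sketch

The model name suggests a few-shot setup with k = 32 training examples drawn under seed 8. The snippet below is one plausible way to draw such a subset from SQuAD with the `datasets` library; the authors' exact sampling procedure is not documented in the card, so treat `k` and `seed` here as assumptions inferred from the name.

```python
# Illustrative few-shot sampling sketch; the actual subset behind
# "k-32 ... seed-8" is an assumption, not documented in the card.
from datasets import load_dataset

k, seed = 32, 8  # assumed from the model name
squad_train = load_dataset("squad", split="train")
few_shot_train = squad_train.shuffle(seed=seed).select(range(k))
print(len(few_shot_train))  # 32 examples
```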
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-32-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-32-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-32-finetuned-squad-seed-8
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
45,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.0891522541642189,
0.09635492414236069,
-0.0025335673708468676,
0.07914844900369644,
0.14034374058246613,
0.03052675910294056,
0.09659172594547272,
0.13647045195102692,
-0.11281874775886536,
0.044926904141902924,
0.09579838067293167,
0.07716978341341019,
0.02869790606200695,
0.1437007337808609,
-0.030467508360743523,
-0.2441672533750534,
-0.008674286305904388,
-0.021920522674918175,
-0.10355369746685028,
0.11295624822378159,
0.0996590405702591,
-0.09915105253458023,
0.0667346715927124,
-0.020519189536571503,
-0.1760922372341156,
0.015310353599488735,
-0.017636949196457863,
-0.051866564899683,
0.11726441234350204,
-0.002170493360608816,
0.07842939347028732,
0.008488263934850693,
0.11734837293624878,
-0.19397997856140137,
0.01811865344643593,
0.0716266930103302,
0.044381603598594666,
0.09388316422700882,
0.0040459646843373775,
-0.0198359414935112,
0.119957335293293,
-0.13057339191436768,
0.09698464721441269,
0.03203308954834938,
-0.0977046862244606,
-0.20464594662189484,
-0.09682386368513107,
0.013295849785208702,
0.04273444414138794,
0.08883483707904816,
0.007544731255620718,
0.1496783047914505,
-0.1062767431139946,
0.08119678497314453,
0.2307872176170349,
-0.26716718077659607,
-0.0813087597489357,
0.04670379310846329,
0.06114928424358368,
0.0866113007068634,
-0.12200406193733215,
-0.01332859043031931,
0.010681800544261932,
0.020068097859621048,
0.10043526440858841,
-0.02839658409357071,
-0.07780596613883972,
0.012535344809293747,
-0.11199549585580826,
0.00032286212081089616,
0.11560633778572083,
0.038561079651117325,
-0.05512417480349541,
-0.07816331833600998,
-0.03912748396396637,
-0.06011858209967613,
-0.03498631715774536,
-0.014311938546597958,
0.03616093471646309,
-0.06182292103767395,
-0.14137452840805054,
-0.046718183904886246,
-0.04873374104499817,
-0.09208126366138458,
0.0007281096186488867,
0.2104937881231308,
0.03907385841012001,
0.021886684000492096,
-0.04876904934644699,
0.10525000840425491,
0.013707062229514122,
-0.12382641434669495,
-0.03309803456068039,
-0.004742263350635767,
-0.09360954165458679,
-0.03497619554400444,
-0.05674366652965546,
0.030266523361206055,
0.03939995914697647,
0.22753402590751648,
-0.027270730584859848,
0.07367720454931259,
0.03727708384394646,
-0.012895875610411167,
-0.02751704305410385,
0.14697860181331635,
-0.02772713266313076,
-0.07789842039346695,
0.010310450568795204,
0.06375758349895477,
0.0068467240780591965,
-0.005617571994662285,
-0.06312886625528336,
-0.04518958181142807,
0.06216916814446449,
0.05696863681077957,
-0.052139267325401306,
0.026649942621588707,
-0.00982277188450098,
-0.023384137079119682,
0.0006217416375875473,
-0.11914736032485962,
0.009108280763030052,
-0.009292714297771454,
-0.07973068952560425,
-0.05388691648840904,
0.013641154393553734,
-0.011720607057213783,
0.009342397563159466,
0.08868535608053207,
-0.07334887981414795,
-0.035935528576374054,
-0.07998209446668625,
-0.07474171370267868,
-0.014813169836997986,
-0.1604934185743332,
0.022638333961367607,
-0.069443479180336,
-0.15310567617416382,
-0.03200313821434975,
0.050074994564056396,
-0.07929354906082153,
-0.03473399952054024,
-0.03257003799080849,
-0.0799664631485939,
0.017852891236543655,
0.001038864254951477,
0.21458737552165985,
-0.04902924224734306,
0.0951705127954483,
0.01539844274520874,
0.054040417075157166,
0.0030328843276947737,
0.03806532174348831,
-0.0783485621213913,
0.013699290342628956,
-0.1749175637960434,
0.07547585666179657,
-0.08942045271396637,
0.028429917991161346,
-0.14454233646392822,
-0.08742933720350266,
-0.001173820230178535,
-0.019292322918772697,
0.08294656872749329,
0.10403982549905777,
-0.1260276436805725,
-0.02246430143713951,
0.12564599514007568,
-0.05618228390812874,
-0.0565192848443985,
0.06452005356550217,
-0.07398709654808044,
0.08190940320491791,
0.04905356094241142,
0.18767258524894714,
0.09973956644535065,
-0.10611404478549957,
0.025652606040239334,
0.019774002954363823,
0.038795728236436844,
-0.0003647196281235665,
0.056337565183639526,
0.0007654586806893349,
0.02183418907225132,
0.015718013048171997,
-0.08199673891067505,
0.016048185527324677,
-0.09249318391084671,
-0.06209666281938553,
-0.0422460101544857,
-0.09020853787660599,
0.005838470067828894,
0.009271930903196335,
0.02914595790207386,
-0.07983338832855225,
-0.08719043433666229,
0.06420981138944626,
0.14092551171779633,
-0.04562729597091675,
0.010717356577515602,
-0.07922698557376862,
0.0029427059926092625,
-0.02960977889597416,
-0.022629667073488235,
-0.19595909118652344,
-0.06024383008480072,
0.025474591180682182,
0.011789349839091301,
0.04468434303998947,
-0.005089337006211281,
0.07942747324705124,
0.02215912751853466,
-0.049583349376916885,
-0.0036728810518980026,
-0.09006515890359879,
-0.010842074640095234,
-0.0919124037027359,
-0.21686170995235443,
-0.05578088015317917,
-0.03904597833752632,
0.15174663066864014,
-0.1723785698413849,
0.00019728411280084401,
-0.020404288545250893,
0.10900583118200302,
0.040939316153526306,
-0.05028533190488815,
-0.000566652393899858,
0.03046913631260395,
0.015459525398910046,
-0.09932616353034973,
0.03519276902079582,
0.016239721328020096,
-0.09871569275856018,
-0.026555456221103668,
-0.10249455273151398,
0.002094142371788621,
0.0735551118850708,
0.07367242127656937,
-0.1070285364985466,
-0.018896128982305527,
-0.06204141303896904,
-0.028836600482463837,
-0.04612740874290466,
0.03484293073415756,
0.17300398647785187,
0.016902199015021324,
0.11322810500860214,
-0.0750158503651619,
-0.0853712186217308,
0.01804281584918499,
0.00877103116363287,
0.060172874480485916,
0.1133817732334137,
0.07802273333072662,
-0.09965483099222183,
0.057672467082738876,
0.08804681152105331,
-0.05016103759407997,
0.13926614820957184,
-0.049505721777677536,
-0.07904095202684402,
-0.030126284807920456,
0.0018585561774671078,
-0.0017519822577014565,
0.15092168748378754,
-0.04793884605169296,
0.009451130405068398,
0.034444548189640045,
0.029684146866202354,
0.00866020005196333,
-0.16263911128044128,
-0.024040576070547104,
0.020273273810744286,
-0.04797865450382233,
-0.028198199346661568,
0.011700689792633057,
0.012526318430900574,
0.09114455431699753,
0.048838309943675995,
-0.0006760179530829191,
0.006003981921821833,
-0.013603627681732178,
-0.04883107542991638,
0.2013292908668518,
-0.09141430258750916,
-0.0443449541926384,
-0.08504753559827805,
-0.001312400447204709,
-0.011032702401280403,
-0.03639485314488411,
0.016632497310638428,
-0.10043306648731232,
-0.02281232364475727,
-0.06523362547159195,
0.00351528637111187,
-0.04760197177529335,
0.01074914075434208,
0.0012117325095459819,
0.01860758289694786,
0.057713545858860016,
-0.13415959477424622,
0.014178360812366009,
-0.06285999715328217,
-0.11178475618362427,
0.030834903940558434,
0.05232331156730652,
0.08744607865810394,
0.06332670152187347,
-0.02661163918673992,
0.016969691962003708,
-0.04486657679080963,
0.23187197744846344,
-0.08510682731866837,
0.004947258625179529,
0.12473440915346146,
0.023806096985936165,
0.037610433995723724,
0.10031718760728836,
0.02820603922009468,
-0.10009551793336868,
0.04425572603940964,
0.07492202520370483,
-0.044307298958301544,
-0.2537969648838043,
0.008757230825722218,
-0.043804850429296494,
-0.08429496735334396,
0.08833523094654083,
0.04835555702447891,
-0.04115518182516098,
0.06571395695209503,
0.00383811560459435,
0.008903831243515015,
-0.024061357602477074,
0.08740727603435516,
0.08474507182836533,
0.05492755398154259,
0.10588885843753815,
-0.04120835289359093,
-0.01843581162393093,
0.06475929170846939,
0.027745947241783142,
0.30676794052124023,
-0.04801204800605774,
0.09975705295801163,
0.052394039928913116,
0.1481948345899582,
-0.021575674414634705,
0.037580061703920364,
0.011681986041367054,
-0.0056900205090641975,
-0.029061011970043182,
-0.05350267142057419,
-0.025744924321770668,
0.006411911454051733,
-0.0696655884385109,
0.042989540845155716,
-0.05632905662059784,
0.04720933735370636,
0.014861894771456718,
0.29175323247909546,
0.002769261598587036,
-0.2645728886127472,
-0.09999296814203262,
-0.012470231391489506,
-0.03834201768040657,
-0.050025824457407,
0.013618942350149155,
0.11953939497470856,
-0.12977182865142822,
0.02409985661506653,
-0.06518956273794174,
0.08495086431503296,
-0.02819736674427986,
-0.0057047288864851,
0.038286615163087845,
0.16469059884548187,
-0.021047575399279594,
0.06148851662874222,
-0.22270509600639343,
0.2298070639371872,
0.008314506150782108,
0.12156696617603302,
-0.053809754550457,
0.006959525868296623,
0.024081319570541382,
0.0002773957676254213,
0.09466026723384857,
-0.0024200675543397665,
-0.04633382707834244,
-0.13998547196388245,
-0.052324824035167694,
0.07157120853662491,
0.1396302431821823,
-0.04974553361535072,
0.10175302624702454,
-0.05961541831493378,
0.010845569893717766,
0.037480488419532776,
-0.08132476359605789,
-0.12129662930965424,
-0.10221341997385025,
-0.02084612473845482,
0.0009916065027937293,
-0.06315382570028305,
-0.06413684040307999,
-0.06662441790103912,
0.024857526645064354,
0.11565839499235153,
0.0015090068336576223,
-0.03438121825456619,
-0.14830425381660461,
0.07474149018526077,
0.15413211286067963,
-0.06828273087739944,
0.03259076923131943,
0.0027496207039803267,
0.07905098795890808,
0.03556985780596733,
-0.07700999081134796,
0.06390876322984695,
-0.06663762778043747,
-0.17900371551513672,
-0.04787307232618332,
0.1023290604352951,
0.07149697840213776,
0.041348669677972794,
-0.0058715431950986385,
0.047066155821084976,
-0.027597302570939064,
-0.09141609817743301,
0.02861250378191471,
0.031008923426270485,
0.035980015993118286,
0.043448008596897125,
-0.0780085027217865,
0.08584117889404297,
-0.04309260472655296,
-0.01950622908771038,
0.12247911095619202,
0.2302624136209488,
-0.10434935986995697,
0.09385168552398682,
0.05647919327020645,
-0.06069497764110565,
-0.1659679263830185,
0.07255282253026962,
0.10522382706403732,
0.013607025146484375,
0.0596124492585659,
-0.21614599227905273,
0.12281906604766846,
0.10207507014274597,
-0.013123484328389168,
0.0396517738699913,
-0.2782701253890991,
-0.11944873631000519,
0.05081342160701752,
0.12616485357284546,
0.08826880156993866,
-0.12551997601985931,
-0.018513957038521767,
-0.015596447512507439,
-0.12835311889648438,
0.07730473577976227,
-0.11327894032001495,
0.13170720636844635,
-0.024668212980031967,
0.1102992445230484,
0.012099015526473522,
-0.027263130992650986,
0.10680606961250305,
0.051079168915748596,
0.09713638573884964,
-0.04292990639805794,
0.0016177017241716385,
0.05895764008164406,
-0.048720549792051315,
0.0005749400006607175,
-0.06878919154405594,
0.08942882716655731,
-0.13792617619037628,
-0.0074110631830990314,
-0.08775833249092102,
0.042021170258522034,
-0.04065361246466637,
-0.06564100831747055,
-0.04059560224413872,
0.056165050715208054,
0.04381983354687691,
-0.03327765315771103,
0.03994234278798103,
-0.024996666237711906,
0.10249394923448563,
0.02606576681137085,
0.08623459190130234,
0.01827789843082428,
-0.05441780388355255,
0.021288102492690086,
-0.012291052378714085,
0.06353625655174255,
-0.16965435445308685,
0.009225737303495407,
0.09871339052915573,
0.06884903460741043,
0.10156495869159698,
0.04113825410604477,
-0.04660402983427048,
0.017347345128655434,
0.02780027687549591,
-0.09582190960645676,
-0.1143624559044838,
0.039788324385881424,
-0.037687718868255615,
-0.1468833088874817,
0.043028801679611206,
0.11901821941137314,
-0.04007524251937866,
-0.030784495174884796,
-0.020496482029557228,
0.001982701476663351,
-0.021475231274962425,
0.1809244155883789,
0.06168878450989723,
0.05968666821718216,
-0.10252939909696579,
0.11980088800191879,
0.03345385566353798,
-0.02497018128633499,
0.051618292927742004,
0.08264031261205673,
-0.1009562686085701,
-0.006233898922801018,
0.07884135097265244,
0.1264987736940384,
-0.055855050683021545,
0.00034331035567447543,
-0.10297306627035141,
-0.08606085181236267,
0.05820880085229874,
0.1426336169242859,
0.05039047449827194,
-0.017103858292102814,
-0.04676156863570213,
0.044942840933799744,
-0.14006802439689636,
0.07376546412706375,
0.032450754195451736,
0.06444092839956284,
-0.07774201035499573,
0.06538006663322449,
0.003673496190458536,
0.017976293340325356,
-0.014610185287892818,
0.0032461166847497225,
-0.09635688364505768,
-0.007857023738324642,
-0.08724769204854965,
0.001100493362173438,
0.0003995899751316756,
0.018629398196935654,
-0.024965135380625725,
-0.0703497901558876,
-0.0442827045917511,
0.03773130103945732,
-0.0858086496591568,
-0.051057081669569016,
0.007876033894717693,
0.04246365651488304,
-0.12298049032688141,
-0.004606145899742842,
0.027640899643301964,
-0.09692440181970596,
0.09815610945224762,
0.07229170203208923,
0.02121231146156788,
0.02922269143164158,
-0.11906526982784271,
-0.03380369395017624,
-0.013533114455640316,
-0.00925578735768795,
0.05983438715338707,
-0.09693552553653717,
-0.004642761778086424,
-0.045739032328128815,
0.06343435496091843,
0.012651098892092705,
0.062193579971790314,
-0.14004269242286682,
0.014702140353620052,
-0.07039693742990494,
-0.045171938836574554,
-0.07920801639556885,
0.04012598842382431,
0.09357749670743942,
0.059720929712057114,
0.14015451073646545,
-0.07485897839069366,
0.024876300245523453,
-0.2035859078168869,
-0.03663022443652153,
-0.013221746310591698,
-0.05552137643098831,
-0.14578424394130707,
-0.04741688445210457,
0.0823112428188324,
-0.03962450847029686,
0.09028556942939758,
-0.026983048766851425,
0.07305796444416046,
0.036569736897945404,
-0.05037030577659607,
-0.031233113259077072,
-0.00897741038352251,
0.20275302231311798,
0.07136518508195877,
-0.014639904722571373,
0.10252013057470322,
0.001400244189426303,
0.031507786363363266,
0.04560723155736923,
0.17001929879188538,
0.2222408503293991,
0.04140239953994751,
0.04980413243174553,
0.06372646987438202,
-0.07730916142463684,
-0.0696038082242012,
0.17369677126407623,
-0.010239259339869022,
0.06717085093259811,
-0.04669395834207535,
0.1937091052532196,
0.1172008067369461,
-0.16605697572231293,
0.04705817252397537,
-0.04571159556508064,
-0.08243800699710846,
-0.11836209148168564,
-0.007239619269967079,
-0.0834728255867958,
-0.1233949214220047,
0.03683166205883026,
-0.1189839318394661,
0.045704130083322525,
0.11034566164016724,
0.014256451278924942,
0.03470384702086449,
0.12607410550117493,
-0.009595613926649094,
-0.006161858327686787,
0.06820549070835114,
0.004468101542443037,
-0.010268563404679298,
-0.04112518951296806,
-0.07513157278299332,
0.05957992002367973,
0.0008965579909272492,
0.08118832856416702,
-0.046809837222099304,
-0.01489614974707365,
0.030717758461833,
-0.0298993531614542,
-0.0792662650346756,
0.02514616958796978,
0.04526205360889435,
0.05386471003293991,
0.049287065863609314,
0.0425403006374836,
-0.011080718599259853,
-0.033423494547605515,
0.32505545020103455,
-0.06687931716442108,
-0.10268661379814148,
-0.12390195578336716,
0.2163950502872467,
0.03237871825695038,
-0.026941834017634392,
0.03223881497979164,
-0.08588948845863342,
0.0001261866418644786,
0.16629044711589813,
0.17494921386241913,
-0.06791245192289352,
-0.020147230476140976,
0.00006306044815573841,
-0.016337884590029716,
-0.0324733592569828,
0.12609735131263733,
0.09789382666349411,
-0.01409501489251852,
-0.0607345774769783,
-0.01725677214562893,
-0.016286123543977737,
-0.03342796489596367,
-0.042124535888433456,
0.04776861146092415,
0.021048763766884804,
-0.026560908183455467,
-0.042501721531152725,
0.0756290927529335,
0.004672402516007423,
-0.25427281856536865,
0.06293732672929764,
-0.15495064854621887,
-0.1767832338809967,
-0.050865527242422104,
0.030000722035765648,
0.0057207075878977776,
0.056314412504434586,
-0.015321260318160057,
0.004520514979958534,
0.08865088224411011,
-0.011045736260712147,
-0.03494136407971382,
-0.12069854140281677,
0.12302341312170029,
-0.048851821571588516,
0.16806823015213013,
-0.03142106160521507,
0.04385971650481224,
0.11618296056985855,
0.030318159610033035,
-0.13394233584403992,
0.03754451498389244,
0.061517804861068726,
-0.09691169857978821,
0.02012532204389572,
0.14769184589385986,
-0.04731994494795799,
0.0986928641796112,
0.04568320885300636,
-0.10861295461654663,
0.00040896059363149107,
-0.06747785955667496,
-0.034101057797670364,
-0.0873488113284111,
-0.011059444397687912,
-0.06539427489042282,
0.16721341013908386,
0.22188355028629303,
-0.03753472492098808,
0.008356836624443531,
-0.09811878949403763,
0.012156248092651367,
0.07031723856925964,
0.03653609752655029,
-0.04974229261279106,
-0.1815183460712433,
0.005707030650228262,
0.059069957584142685,
-0.001754727796651423,
-0.25256580114364624,
-0.07246705144643784,
0.03636093810200691,
-0.029580263420939445,
-0.03283146768808365,
0.10945906490087509,
0.04776095971465111,
0.05088432505726814,
-0.031279291957616806,
-0.15289609134197235,
-0.034413356333971024,
0.1560264378786087,
-0.17379316687583923,
-0.036445170640945435
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-512-finetuned-squad-seed-0
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
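### Usage sketch
The card itself documents no inference code, so the following is a minimal, hypothetical sketch (not part of the original card) of how a checkpoint like this could be queried through the `transformers` question-answering pipeline. Only the model id is taken from the card above; the question and context strings are made up for illustration.
```python
from transformers import pipeline

# Load the fine-tuned checkpoint by its Hub id and run extractive QA.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-0",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of roberta-base on the SQuAD dataset.",
)
print(result["answer"], result["score"])  # best answer span and its confidence
```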
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-512-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-512-finetuned-squad-seed-0
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07879433780908585,
0.09980728477239609,
-0.002623827662318945,
0.07672523707151413,
0.1317494660615921,
0.03377878665924072,
0.10729403793811798,
0.12379603087902069,
-0.12692274153232574,
0.06787534803152084,
0.09202756732702255,
0.08603614568710327,
0.033828288316726685,
0.13209135830402374,
-0.0404028482735157,
-0.23211216926574707,
0.007520322222262621,
-0.01732252910733223,
-0.06680455058813095,
0.10207121819257736,
0.08782333880662918,
-0.09770467877388,
0.08116422593593597,
-0.0027911392971873283,
-0.1832943856716156,
0.027748599648475647,
-0.020646413788199425,
-0.051221173256635666,
0.09803745150566101,
-0.004423639737069607,
0.08332253247499466,
0.007371523883193731,
0.11873217672109604,
-0.19249269366264343,
0.013513962738215923,
0.07337410748004913,
0.032907549291849136,
0.09546847641468048,
0.021193617954850197,
-0.0038126641884446144,
0.16390879452228546,
-0.13466131687164307,
0.09924081712961197,
0.02976052276790142,
-0.0868385061621666,
-0.16854813694953918,
-0.09644417464733124,
0.015267579816281796,
0.03607628494501114,
0.08749455213546753,
0.008739179000258446,
0.1786167025566101,
-0.09216268360614777,
0.08010043948888779,
0.23765021562576294,
-0.2770881950855255,
-0.07868573814630508,
0.042919646948575974,
0.048641204833984375,
0.07730372250080109,
-0.1188054010272026,
-0.015744775533676147,
0.02083089016377926,
0.029324054718017578,
0.09192299097776413,
-0.02516329288482666,
-0.08407571911811829,
-0.007386139128357172,
-0.11886562407016754,
0.003863124642521143,
0.11291361600160599,
0.04420469328761101,
-0.05072584003210068,
-0.049598004668951035,
-0.057830583304166794,
-0.07380286604166031,
-0.034833263605833054,
-0.0343967080116272,
0.04329977557063103,
-0.05698323994874954,
-0.12083972990512848,
-0.03338988125324249,
-0.0484706312417984,
-0.07785088568925858,
-0.02095629833638668,
0.20442500710487366,
0.04840986803174019,
0.03331614285707474,
-0.05275619029998779,
0.08191031217575073,
0.010918387211859226,
-0.12879909574985504,
-0.030197951942682266,
0.002839891705662012,
-0.08061347156763077,
-0.0392596460878849,
-0.054547786712646484,
0.01442230399698019,
0.04274918511509895,
0.21400678157806396,
-0.05250467732548714,
0.08363610506057739,
0.03051254339516163,
-0.019636323675513268,
-0.020884500816464424,
0.12511372566223145,
-0.019704176113009453,
-0.0786224827170372,
0.028430158272385597,
0.058321308344602585,
0.025026703253388405,
0.002592360367998481,
-0.05605711415410042,
-0.03260612115263939,
0.08276247978210449,
0.03244572877883911,
-0.05978008732199669,
0.02584456466138363,
0.0020512891933321953,
-0.019793245941400528,
0.005260919686406851,
-0.11360426992177963,
0.017338775098323822,
-0.0062055522575974464,
-0.07443668693304062,
-0.013836863450706005,
0.00987780187278986,
-0.015551568940281868,
0.010859252884984016,
0.10033504664897919,
-0.08912225067615509,
-0.021869337186217308,
-0.0766756683588028,
-0.07058030366897583,
-0.0008697014418430626,
-0.1473134309053421,
0.012574556283652782,
-0.07344947010278702,
-0.15559570491313934,
-0.033246252685785294,
0.04321664199233055,
-0.07075733691453934,
-0.02766926772892475,
-0.03850860148668289,
-0.07655968517065048,
0.026293406262993813,
0.0031867579091340303,
0.18773652613162994,
-0.052773408591747284,
0.08054985105991364,
0.019989721477031708,
0.051098231226205826,
-0.022170290350914,
0.03118763491511345,
-0.08912631869316101,
0.003759908489882946,
-0.16945092380046844,
0.06159704923629761,
-0.07593252509832382,
0.016923412680625916,
-0.12957170605659485,
-0.08649704605340958,
-0.026972046121954918,
-0.028930148109793663,
0.08366966992616653,
0.10114303231239319,
-0.13954615592956543,
-0.027705607935786247,
0.1124851182103157,
-0.07795953750610352,
-0.05792250856757164,
0.06606210768222809,
-0.06678807735443115,
0.05209674313664436,
0.05709443613886833,
0.18329855799674988,
0.07356511056423187,
-0.11637123674154282,
-0.02235189639031887,
0.0009092492982745171,
0.02850898541510105,
-0.01065665204077959,
0.050931666046381,
0.009816058911383152,
0.021800458431243896,
0.015940604731440544,
-0.03841439634561539,
0.003048479091376066,
-0.09762208163738251,
-0.06254386156797409,
-0.056870248168706894,
-0.08201594650745392,
-0.036465246230363846,
0.01322055421769619,
0.03795396536588669,
-0.08047761768102646,
-0.08357162773609161,
0.08097919821739197,
0.141106978058815,
-0.0424160435795784,
0.02114786207675934,
-0.07321105897426605,
0.019514985382556915,
-0.053263310343027115,
-0.031287189573049545,
-0.20433548092842102,
-0.06518340110778809,
0.03292154520750046,
-0.027204137295484543,
0.05620569735765457,
0.003757856087759137,
0.07271289825439453,
0.03724442794919014,
-0.03734292462468147,
0.005748927593231201,
-0.09001358598470688,
-0.0067503987811505795,
-0.09381762146949768,
-0.22212201356887817,
-0.03833065181970596,
-0.030930940061807632,
0.1260913908481598,
-0.16237589716911316,
-0.008433694019913673,
-0.03273790702223778,
0.12005984038114548,
0.030615629628300667,
-0.06314484775066376,
-0.014689687639474869,
0.03074139729142189,
0.0016639948589727283,
-0.09227408468723297,
0.03196856379508972,
0.016566799953579903,
-0.07244345545768738,
-0.057907748967409134,
-0.11689706146717072,
0.006520294118672609,
0.07859497517347336,
0.06362401694059372,
-0.09763152152299881,
0.005942050833255053,
-0.06494081020355225,
-0.03330518305301666,
-0.057863716036081314,
0.040809113532304764,
0.17768695950508118,
0.006782218813896179,
0.10696861147880554,
-0.07935066521167755,
-0.07359636574983597,
0.021726761013269424,
0.005702963098883629,
0.04273911938071251,
0.09147277474403381,
0.11254560947418213,
-0.12682503461837769,
0.06592100113630295,
0.08226508647203445,
-0.06133834272623062,
0.12644967436790466,
-0.038326431065797806,
-0.07988464832305908,
-0.03474457561969757,
-0.01827928051352501,
-0.013754161074757576,
0.13804559409618378,
-0.05127746984362602,
0.026123331859707832,
0.030377497896552086,
0.03971969708800316,
0.020207475870847702,
-0.15255922079086304,
-0.0030015590600669384,
0.008680708706378937,
-0.04292330518364906,
-0.018911326304078102,
0.022013157606124878,
0.019833920523524284,
0.09784609824419022,
0.03407156467437744,
-0.01553086657077074,
-0.007298098877072334,
-0.003986267372965813,
-0.05243712663650513,
0.1905655860900879,
-0.09067784994840622,
-0.04256126657128334,
-0.07711224257946014,
0.0008725055959075689,
-0.03709989786148071,
-0.04225001484155655,
0.02777845785021782,
-0.08852114528417587,
-0.03883513808250427,
-0.0736054852604866,
-0.00130745442584157,
-0.04698812961578369,
0.026318229734897614,
0.03034505620598793,
0.00437470106408,
0.06111453101038933,
-0.1345565766096115,
0.004517021588981152,
-0.07315249741077423,
-0.10466686636209488,
0.016982246190309525,
0.06381475925445557,
0.09123523533344269,
0.05782271549105644,
-0.029502540826797485,
0.022114498540759087,
-0.032235343009233475,
0.2522353231906891,
-0.05642715469002724,
-0.000861810811329633,
0.1071716696023941,
0.025232696905732155,
0.04558490216732025,
0.09357919543981552,
0.034855011850595474,
-0.10192765295505524,
0.030060861259698868,
0.08539710938930511,
-0.03462650254368782,
-0.23926550149917603,
-0.0051353247836232185,
-0.035622868686914444,
-0.11248134076595306,
0.08310603350400925,
0.05023295059800148,
-0.04179217666387558,
0.06378988921642303,
0.008464885875582695,
0.023678649216890335,
-0.04998210817575455,
0.09338754415512085,
0.09893392771482468,
0.07422308623790741,
0.10222014039754868,
-0.04762214049696922,
-0.019931064918637276,
0.06974361091852188,
-0.00394422048702836,
0.2926509976387024,
-0.02624116651713848,
0.06917637586593628,
0.052475910633802414,
0.13773445785045624,
-0.021396063268184662,
0.03682830184698105,
0.006471645087003708,
-0.004264747258275747,
-0.025456715375185013,
-0.0575895793735981,
-0.028868773952126503,
-0.0017335653537884355,
-0.07729405909776688,
0.05584156885743141,
-0.061627499759197235,
0.06297466158866882,
0.020757228136062622,
0.2609296441078186,
-0.0024645496159791946,
-0.2802160084247589,
-0.07895255088806152,
-0.022112354636192322,
-0.03831176832318306,
-0.044777628034353256,
0.01248670183122158,
0.09796424210071564,
-0.10453325510025024,
0.05251685157418251,
-0.05775068327784538,
0.08156322687864304,
-0.02772108092904091,
-0.004698731936514378,
0.030273187905550003,
0.18268486857414246,
-0.01672898605465889,
0.04881025105714798,
-0.2010529637336731,
0.21659567952156067,
0.01866634003818035,
0.13302195072174072,
-0.05024099722504616,
0.008155963383615017,
0.024021951481699944,
-0.00047029025154188275,
0.0752108097076416,
-0.005644663702696562,
-0.07546845078468323,
-0.12641757726669312,
-0.07481250166893005,
0.08020173013210297,
0.13950061798095703,
-0.016293711960315704,
0.10208471119403839,
-0.049480509012937546,
0.019038725644350052,
0.04005826264619827,
-0.069692462682724,
-0.1572706401348114,
-0.09662441909313202,
-0.01617906428873539,
0.03669186681509018,
-0.09251373261213303,
-0.046830132603645325,
-0.07556012272834778,
-0.009491296485066414,
0.11550373584032059,
0.022562798112630844,
-0.01924065314233303,
-0.13754694163799286,
0.08750737458467484,
0.1490672379732132,
-0.07298769801855087,
0.02395295351743698,
-0.007616058457642794,
0.06455215811729431,
0.04249058663845062,
-0.09471812844276428,
0.0477592833340168,
-0.05784673988819122,
-0.1609412133693695,
-0.047333598136901855,
0.09048806130886078,
0.0719648003578186,
0.04023420438170433,
-0.005031404551118612,
0.04983774572610855,
-0.02116408199071884,
-0.09926898032426834,
0.013057602569460869,
0.03723709657788277,
0.0506359338760376,
0.0366332121193409,
-0.08240114897489548,
0.05836832523345947,
-0.033036842942237854,
-0.0051370663568377495,
0.11346657574176788,
0.24332644045352936,
-0.08906049281358719,
0.08664223551750183,
0.057009294629096985,
-0.06767415255308151,
-0.14318405091762543,
0.06377311050891876,
0.10420748591423035,
-0.0004922719672322273,
0.056358836591243744,
-0.19537389278411865,
0.13973121345043182,
0.11462467908859253,
-0.012445586733520031,
0.03912093862891197,
-0.2734343707561493,
-0.11835777014493942,
0.058375295251607895,
0.13134953379631042,
0.12024206668138504,
-0.13337644934654236,
-0.013260651379823685,
-0.017195427790284157,
-0.1259465515613556,
0.1075652539730072,
-0.11160645633935928,
0.13433758914470673,
-0.034072145819664,
0.10996437817811966,
0.005183063447475433,
-0.03039836511015892,
0.10771309584379196,
0.04876434803009033,
0.09765283763408661,
-0.04167017713189125,
0.011533831246197224,
0.06037867069244385,
-0.04877980798482895,
0.01276681013405323,
-0.07241897284984589,
0.08308129757642746,
-0.12152323871850967,
-0.0067305779084563255,
-0.07865557819604874,
0.05125760659575462,
-0.036738160997629166,
-0.0530417263507843,
-0.053528327494859695,
0.03676949068903923,
0.05598711594939232,
-0.036740582436323166,
0.05547759309411049,
-0.0010786900529637933,
0.09271547943353653,
0.02294703759253025,
0.06912648677825928,
-0.00011821782391052693,
-0.04891705885529518,
0.020548202097415924,
-0.009456126019358635,
0.06135552003979683,
-0.13885727524757385,
0.005532585084438324,
0.10664188116788864,
0.05224642902612686,
0.09721759706735611,
0.04350114241242409,
-0.04739580675959587,
0.012435094453394413,
0.03782200440764427,
-0.11334742605686188,
-0.10089432448148727,
0.048444461077451706,
-0.03846260532736778,
-0.13853059709072113,
0.04656246304512024,
0.11395107209682465,
-0.04893195629119873,
-0.022845447063446045,
-0.017932897433638573,
0.006506636738777161,
-0.021148795261979103,
0.18524061143398285,
0.041907407343387604,
0.04095498472452164,
-0.10175976157188416,
0.13061589002609253,
0.02894287370145321,
-0.02305946871638298,
0.059064220637083054,
0.08510088920593262,
-0.09460129588842392,
0.002496575703844428,
0.09477715939283371,
0.17589613795280457,
-0.07143654674291611,
-0.014445466920733452,
-0.10404547303915024,
-0.06999840587377548,
0.06213189661502838,
0.16080529987812042,
0.05556904524564743,
-0.01776663027703762,
-0.05013415589928627,
0.04149400070309639,
-0.1413133591413498,
0.06177595257759094,
0.031910061836242676,
0.06997643411159515,
-0.08696930855512619,
0.05824492126703262,
0.007808128837496042,
0.0034256845247000456,
-0.01692246086895466,
0.014078708365559578,
-0.0933699756860733,
-0.028925715014338493,
-0.07488219439983368,
0.010793047025799751,
-0.012998297810554504,
0.015815652906894684,
-0.01019431371241808,
-0.06827152520418167,
-0.07013878226280212,
0.036562301218509674,
-0.07735287398099899,
-0.052389971911907196,
0.012260285206139088,
0.04247358441352844,
-0.13352009654045105,
0.006199282128363848,
0.01604403741657734,
-0.08933701366186142,
0.0852738618850708,
0.08957023173570633,
0.026681305840611458,
0.0333307608962059,
-0.13085800409317017,
-0.032771527767181396,
0.013567083515226841,
0.0019377616699784994,
0.06517449021339417,
-0.09578784555196762,
-0.0040917894802987576,
-0.021362101659178734,
0.0757846087217331,
0.01082452479749918,
0.08274079859256744,
-0.13150793313980103,
0.009229307994246483,
-0.08288402110338211,
-0.04340264946222305,
-0.06587862968444824,
0.016090355813503265,
0.10188813507556915,
0.05309765040874481,
0.16273976862430573,
-0.07713973522186279,
0.01873123086988926,
-0.20830263197422028,
-0.02753509022295475,
-0.0056768255308270454,
-0.053165074437856674,
-0.13609974086284637,
-0.039861585944890976,
0.0771130844950676,
-0.03873619809746742,
0.09919097274541855,
-0.0213454682379961,
0.061247535049915314,
0.03958698734641075,
-0.03269030898809433,
-0.0634763091802597,
-0.028966302052140236,
0.19766154885292053,
0.07779984176158905,
-0.015795545652508736,
0.10753617435693741,
-0.00503700552508235,
0.05232769995927811,
0.030486857518553734,
0.20488175749778748,
0.20640447735786438,
0.005054190754890442,
0.07026126980781555,
0.060268837958574295,
-0.08186807483434677,
-0.06852486729621887,
0.1783214807510376,
-0.026941414922475815,
0.07263711094856262,
-0.028973698616027832,
0.18799535930156708,
0.11345047503709793,
-0.1507856696844101,
0.031861696392297745,
-0.03190622478723526,
-0.07651782035827637,
-0.1405555009841919,
0.0004828931705560535,
-0.09620188176631927,
-0.11852090060710907,
0.04478222504258156,
-0.11984162777662277,
0.056392718106508255,
0.08030953258275986,
0.012895838357508183,
0.03404269739985466,
0.12643331289291382,
-0.02647947520017624,
0.0042160311713814735,
0.040620286017656326,
0.007633236702531576,
-0.029473889619112015,
-0.04212844371795654,
-0.0773097574710846,
0.049991633743047714,
0.005495171528309584,
0.08794856071472168,
-0.04450983181595802,
-0.009854072704911232,
0.041791245341300964,
-0.02829502895474434,
-0.07695047557353973,
0.02482285350561142,
0.03623690456151962,
0.05631566792726517,
0.04662115126848221,
0.04424610733985901,
-0.005585740320384502,
-0.032777972519397736,
0.2807771563529968,
-0.058605656027793884,
-0.09558548033237457,
-0.1149626150727272,
0.20687483251094818,
0.04095669835805893,
-0.02945122867822647,
0.03799726814031601,
-0.08364863693714142,
-0.012224256061017513,
0.15570156276226044,
0.1558440774679184,
-0.061337608844041824,
-0.024484924972057343,
-0.01229771040380001,
-0.016634738072752953,
-0.03912048041820526,
0.11436444520950317,
0.094927117228508,
0.0014529029140248895,
-0.053512342274188995,
-0.02682086080312729,
-0.035579342395067215,
-0.015271738171577454,
-0.04058969393372536,
0.024958442896604538,
0.014077103696763515,
-0.022923802956938744,
-0.035101860761642456,
0.06298161298036575,
-0.0009172607096843421,
-0.24246859550476074,
0.06275534629821777,
-0.14478261768817902,
-0.16854965686798096,
-0.025291619822382927,
0.04944189637899399,
-0.011083442717790604,
0.05039128661155701,
-0.022899337112903595,
-0.0045812735334038734,
0.08083367347717285,
-0.019977983087301254,
-0.05655217170715332,
-0.12530164420604706,
0.11137498915195465,
-0.0582270473241806,
0.18059417605400085,
-0.017136452719569206,
0.0678299069404602,
0.11803020536899567,
0.04360607638955116,
-0.13948124647140503,
0.04660340026021004,
0.04718143120408058,
-0.11380642652511597,
0.01792064867913723,
0.14436352252960205,
-0.04664007946848869,
0.08801448345184326,
0.043908823281526566,
-0.09647171944379807,
-0.010072331875562668,
-0.04806714132428169,
-0.02615230157971382,
-0.07028744369745255,
-0.011855301447212696,
-0.0673305094242096,
0.17009970545768738,
0.19612711668014526,
-0.025107616558670998,
0.013502119109034538,
-0.09387847781181335,
0.029296236112713814,
0.06853202730417252,
0.03764726221561432,
-0.051251377910375595,
-0.2082255482673645,
0.0190826915204525,
0.04939199239015579,
-0.0030189692042768,
-0.23189041018486023,
-0.07801882177591324,
0.04012114927172661,
-0.0340297557413578,
-0.05607854574918747,
0.09632018208503723,
0.035237498581409454,
0.04773775115609169,
-0.036842916160821915,
-0.14961813390254974,
-0.037304218858480453,
0.15479357540607452,
-0.17862802743911743,
-0.04967663809657097
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-512-finetuned-squad-seed-10
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
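### Usage sketch
As with the other checkpoints in this family, the card gives no inference example; this is an illustrative sketch, assuming the standard `transformers` auto classes, of decoding an answer span directly from the start/end logits. Only the model id comes from the card; the question and context are hypothetical.
```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_id = "anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-10"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What dataset was used for fine-tuning?"
context = "The model was fine-tuned on the SQuAD dataset."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Pick the highest-scoring start and end positions and decode that span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```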
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-512-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-512-finetuned-squad-seed-10
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07933445274829865,
0.09717181324958801,
-0.002555482555180788,
0.07669408619403839,
0.1320524364709854,
0.034033019095659256,
0.1076090857386589,
0.12205290794372559,
-0.12560544908046722,
0.06692679971456528,
0.09094814956188202,
0.08723532408475876,
0.03404821828007698,
0.13252535462379456,
-0.039763759821653366,
-0.23295623064041138,
0.007395423017442226,
-0.017015304416418076,
-0.06562792509794235,
0.10202254354953766,
0.08813626319169998,
-0.0982276052236557,
0.08056842535734177,
-0.0028356038965284824,
-0.18361595273017883,
0.027879994362592697,
-0.020297924056649208,
-0.050026122480630875,
0.09841710329055786,
-0.006037972867488861,
0.08309127390384674,
0.006940846331417561,
0.12078189849853516,
-0.19071201980113983,
0.013560520485043526,
0.07296141237020493,
0.03305552527308464,
0.0950247198343277,
0.02087114192545414,
-0.0027817650698125362,
0.16329583525657654,
-0.13511742651462555,
0.09957187622785568,
0.029104337096214294,
-0.08692698180675507,
-0.16755633056163788,
-0.09654717147350311,
0.014700109139084816,
0.03766995668411255,
0.08655332773923874,
0.008399292826652527,
0.17863290011882782,
-0.09257879108190536,
0.08025328814983368,
0.2383178174495697,
-0.27816465497016907,
-0.07867498695850372,
0.04477845877408981,
0.05043615400791168,
0.07826948910951614,
-0.11845079809427261,
-0.016498848795890808,
0.021314436569809914,
0.029099322855472565,
0.0929226353764534,
-0.025637686252593994,
-0.08410061150789261,
-0.007718271110206842,
-0.1202811673283577,
0.0056611341424286366,
0.11188112199306488,
0.04448475316166878,
-0.05038008838891983,
-0.049495697021484375,
-0.05857827514410019,
-0.07374747842550278,
-0.0362405888736248,
-0.03529137745499611,
0.043778594583272934,
-0.057206813246011734,
-0.1195121705532074,
-0.03304363042116165,
-0.04751328006386757,
-0.07832470536231995,
-0.020776143297553062,
0.2037287801504135,
0.0487167127430439,
0.03299146145582199,
-0.05253361538052559,
0.08110696077346802,
0.007979664020240307,
-0.1288110613822937,
-0.028889091685414314,
0.003095407271757722,
-0.08184248954057693,
-0.04017644748091698,
-0.05529552698135376,
0.014490733854472637,
0.04199136793613434,
0.21381530165672302,
-0.04997221380472183,
0.08384884893894196,
0.03215381130576134,
-0.019902734085917473,
-0.02083519659936428,
0.12546902894973755,
-0.020091451704502106,
-0.0804111436009407,
0.028287576511502266,
0.05867692083120346,
0.02510198950767517,
0.003533089067786932,
-0.056864187121391296,
-0.032699085772037506,
0.08309539407491684,
0.03211944177746773,
-0.06167025491595268,
0.02615104801952839,
0.0024015873204916716,
-0.019523432478308678,
0.00647336570546031,
-0.1141822561621666,
0.0179299283772707,
-0.00673631438985467,
-0.07522204518318176,
-0.013612615875899792,
0.008291169069707394,
-0.01464842353016138,
0.011262908577919006,
0.09947781264781952,
-0.08928197622299194,
-0.021305209025740623,
-0.07739925384521484,
-0.0716724842786789,
-0.000989283318631351,
-0.14983625710010529,
0.013178586959838867,
-0.07231223583221436,
-0.15782487392425537,
-0.0337585024535656,
0.04252946004271507,
-0.07029999047517776,
-0.027727343142032623,
-0.03950941190123558,
-0.07671099156141281,
0.02495248056948185,
0.003942498005926609,
0.1893187165260315,
-0.05222557485103607,
0.07924427837133408,
0.01990765519440174,
0.05247456952929497,
-0.022915421053767204,
0.03137240558862686,
-0.08832371234893799,
0.003740564687177539,
-0.17037229239940643,
0.06131923943758011,
-0.07613352686166763,
0.018894188106060028,
-0.12888924777507782,
-0.08614428341388702,
-0.027352893725037575,
-0.02856840007007122,
0.08364555984735489,
0.10078464448451996,
-0.1422545313835144,
-0.026579901576042175,
0.11302343755960464,
-0.07765264064073563,
-0.05712040141224861,
0.06464582681655884,
-0.06711573898792267,
0.05287950113415718,
0.05850121006369591,
0.18335701525211334,
0.07361669838428497,
-0.1159132719039917,
-0.021739931777119637,
0.0010698819532990456,
0.028285415843129158,
-0.012430802918970585,
0.050382211804389954,
0.010601495392620564,
0.022800887003540993,
0.015977395698428154,
-0.03726094588637352,
0.002395468298345804,
-0.09737494587898254,
-0.06239175796508789,
-0.05714348331093788,
-0.08294074982404709,
-0.036580927670001984,
0.013262576423585415,
0.038158517330884933,
-0.08090968430042267,
-0.08280112594366074,
0.08245440572500229,
0.14091847836971283,
-0.04225960001349449,
0.020341195166110992,
-0.07283958047628403,
0.018931547179818153,
-0.05365799367427826,
-0.031471673399209976,
-0.2046850025653839,
-0.06424102932214737,
0.03327856957912445,
-0.028052547946572304,
0.056206442415714264,
0.00612990278750658,
0.07325519621372223,
0.0370330810546875,
-0.03665661811828613,
0.006090518552809954,
-0.08918783068656921,
-0.0065843937918543816,
-0.09562662988901138,
-0.2216818928718567,
-0.038819100707769394,
-0.031070156022906303,
0.12542970478534698,
-0.1627795547246933,
-0.00814042892307043,
-0.035153232514858246,
0.11971773952245712,
0.02976207435131073,
-0.0627562552690506,
-0.014939527958631516,
0.03029577247798443,
0.0024832545313984156,
-0.09247755259275436,
0.032200586050748825,
0.01537315733730793,
-0.07165797054767609,
-0.05943355709314346,
-0.11782710999250412,
0.004835099913179874,
0.0780402198433876,
0.06412051618099213,
-0.09733454138040543,
0.006849692668765783,
-0.0645696148276329,
-0.032551154494285583,
-0.05674581602215767,
0.040599275380373,
0.17644312977790833,
0.0061026024632155895,
0.10655549168586731,
-0.07957293093204498,
-0.07436495274305344,
0.021544838324189186,
0.006469871383160353,
0.04416794329881668,
0.09136516600847244,
0.1131809800863266,
-0.12641668319702148,
0.06519544124603271,
0.08429541438817978,
-0.06139301136136055,
0.12466322630643845,
-0.03849530965089798,
-0.08013797551393509,
-0.034592945128679276,
-0.01951625756919384,
-0.014705426059663296,
0.13771681487560272,
-0.05194360390305519,
0.02467016875743866,
0.029704546555876732,
0.03926439955830574,
0.0198979414999485,
-0.1533777266740799,
-0.002668749075382948,
0.008134950883686543,
-0.04213320463895798,
-0.019713839516043663,
0.02166571095585823,
0.018967490643262863,
0.09787939488887787,
0.033166371285915375,
-0.01674545556306839,
-0.008042927831411362,
-0.004184408579021692,
-0.05181088298559189,
0.19085487723350525,
-0.09026286005973816,
-0.04015441611409187,
-0.07516606897115707,
0.0002697127347346395,
-0.03601944446563721,
-0.04232126101851463,
0.02686838060617447,
-0.08857215940952301,
-0.038902487605810165,
-0.07345954328775406,
-0.0018084089970216155,
-0.04637855663895607,
0.026304036378860474,
0.03150872886180878,
0.004204572178423405,
0.06017811968922615,
-0.13472628593444824,
0.004470542073249817,
-0.07436921447515488,
-0.10490935295820236,
0.01638616994023323,
0.06384669989347458,
0.09069542586803436,
0.05945833772420883,
-0.029502876102924347,
0.022029509767889977,
-0.031877532601356506,
0.2529120445251465,
-0.05630415678024292,
0.000884476350620389,
0.10722572356462479,
0.02537626214325428,
0.045529283583164215,
0.09469977766275406,
0.03484285622835159,
-0.10235022753477097,
0.0297203678637743,
0.08476036787033081,
-0.03407617285847664,
-0.23958005011081696,
-0.005214412230998278,
-0.03522220253944397,
-0.11468237638473511,
0.08282267302274704,
0.04978203400969505,
-0.04151090979576111,
0.06483758985996246,
0.009198971092700958,
0.024670874699950218,
-0.049975305795669556,
0.0925358459353447,
0.0976531133055687,
0.07360949367284775,
0.10271115601062775,
-0.04803673177957535,
-0.02079959586262703,
0.06854137033224106,
-0.004209449980407953,
0.2934677302837372,
-0.024735579267144203,
0.06808032095432281,
0.05219617486000061,
0.1371055692434311,
-0.021697549149394035,
0.0387069471180439,
0.006946483161300421,
-0.005225811619311571,
-0.025755075737833977,
-0.05739859491586685,
-0.027042187750339508,
-0.001751488191075623,
-0.07775139808654785,
0.05568438395857811,
-0.0608825758099556,
0.062325406819581985,
0.020314089953899384,
0.259279727935791,
-0.000915368611458689,
-0.28062310814857483,
-0.07770348340272903,
-0.02180400677025318,
-0.039282333105802536,
-0.044177938252687454,
0.012260306626558304,
0.09633196890354156,
-0.1039028912782669,
0.0533888079226017,
-0.05751137435436249,
0.08186596632003784,
-0.02671836130321026,
-0.004041646607220173,
0.029512958601117134,
0.18410207331180573,
-0.016447603702545166,
0.048860643059015274,
-0.20156285166740417,
0.2159920036792755,
0.018718654289841652,
0.1337003856897354,
-0.050864193588495255,
0.007666934281587601,
0.024118542671203613,
-0.0006173931760713458,
0.07522786408662796,
-0.005610577762126923,
-0.07615859061479568,
-0.12642869353294373,
-0.07453001290559769,
0.08019617944955826,
0.14015662670135498,
-0.014933794736862183,
0.1023487001657486,
-0.04883376136422157,
0.01818881183862686,
0.0405416302382946,
-0.07073414325714111,
-0.158425971865654,
-0.09564007073640823,
-0.017004741355776787,
0.03571714088320732,
-0.09408950805664062,
-0.04614151641726494,
-0.07557304203510284,
-0.011941283941268921,
0.11374363303184509,
0.024761583656072617,
-0.019238853827118874,
-0.13703718781471252,
0.08783876150846481,
0.1496034562587738,
-0.07273755967617035,
0.024809954687952995,
-0.007039198186248541,
0.06472133845090866,
0.04262446239590645,
-0.09459099918603897,
0.04809372499585152,
-0.057863298803567886,
-0.16029900312423706,
-0.046791575849056244,
0.09132395684719086,
0.0725187286734581,
0.04034711793065071,
-0.0038156176451593637,
0.0499868281185627,
-0.021564632654190063,
-0.09945008158683777,
0.012389587238430977,
0.03714529424905777,
0.04963015019893646,
0.03676246851682663,
-0.08330727368593216,
0.05712242051959038,
-0.03296319395303726,
-0.0035168041940778494,
0.11302795261144638,
0.2410326451063156,
-0.08902087807655334,
0.08592004328966141,
0.0564422644674778,
-0.06792628765106201,
-0.1430722326040268,
0.06493611633777618,
0.10434069484472275,
-0.00036506980541162193,
0.05693846568465233,
-0.19383957982063293,
0.14124630391597748,
0.11449596285820007,
-0.011995586566627026,
0.038754258304834366,
-0.27402207255363464,
-0.11830242723226547,
0.05884717032313347,
0.13195666670799255,
0.12066163867712021,
-0.13295486569404602,
-0.01286946889013052,
-0.0183388814330101,
-0.12592346966266632,
0.10761137306690216,
-0.11268850415945053,
0.13397301733493805,
-0.03385629877448082,
0.10941329598426819,
0.0049584535881876945,
-0.03047078661620617,
0.10641605406999588,
0.050905827432870865,
0.09826160967350006,
-0.04186297208070755,
0.010527891106903553,
0.06192167103290558,
-0.04810597375035286,
0.01341486070305109,
-0.07161305844783783,
0.08257478475570679,
-0.12006881088018417,
-0.006682395935058594,
-0.07815221697092056,
0.05052211135625839,
-0.036444492638111115,
-0.05262402445077896,
-0.05324598029255867,
0.0364518016576767,
0.05535631999373436,
-0.03676995262503624,
0.0538199320435524,
-0.00006696231139358133,
0.09218768775463104,
0.019574441015720367,
0.06989140808582306,
-0.0009153424762189388,
-0.04765734449028969,
0.021016953513026237,
-0.009088875725865364,
0.06037665903568268,
-0.1396758109331131,
0.005068385973572731,
0.10675226151943207,
0.052311062812805176,
0.09688353538513184,
0.04356672614812851,
-0.047230735421180725,
0.011821059510111809,
0.03745002672076225,
-0.11179962754249573,
-0.1025950163602829,
0.04941808432340622,
-0.03954728692770004,
-0.13849228620529175,
0.0484231673181057,
0.11237392574548721,
-0.04901662841439247,
-0.02346939407289028,
-0.018860332667827606,
0.006098902318626642,
-0.021208126097917557,
0.18664291501045227,
0.04283995181322098,
0.040828365832567215,
-0.10250218957662582,
0.1298077404499054,
0.028720969334244728,
-0.0215433482080698,
0.05800468474626541,
0.08595003187656403,
-0.09545373916625977,
0.0024173071142286062,
0.09641129523515701,
0.17722278833389282,
-0.06975776702165604,
-0.014039596542716026,
-0.1040370762348175,
-0.07078003883361816,
0.062266986817121506,
0.16187553107738495,
0.05633826553821564,
-0.01915687508881092,
-0.04992823675274849,
0.0417841337621212,
-0.14102420210838318,
0.06120257452130318,
0.03203526511788368,
0.07027318328619003,
-0.08661777526140213,
0.05883438512682915,
0.00789385661482811,
0.0038743787445127964,
-0.01708054170012474,
0.015272348187863827,
-0.09318777173757553,
-0.029551731422543526,
-0.07484227418899536,
0.009228142909705639,
-0.013547472655773163,
0.015703976154327393,
-0.010493765585124493,
-0.06830748170614243,
-0.0699184462428093,
0.03602324798703194,
-0.07754313945770264,
-0.05250396579504013,
0.011928089894354343,
0.041282348334789276,
-0.13289031386375427,
0.006077023688703775,
0.015268443152308464,
-0.08851922303438187,
0.08471371978521347,
0.08869683742523193,
0.02735387347638607,
0.03447338193655014,
-0.13175082206726074,
-0.03270933777093887,
0.014171434566378593,
0.0023774155415594578,
0.0658276379108429,
-0.0946708470582962,
-0.004154940135776997,
-0.021044643595814705,
0.07781074941158295,
0.010168257169425488,
0.08037368208169937,
-0.13052593171596527,
0.009513596072793007,
-0.08406370133161545,
-0.04346638172864914,
-0.06608937680721283,
0.015765121206641197,
0.10108833014965057,
0.052379902452230453,
0.1633099615573883,
-0.07599272578954697,
0.018690694123506546,
-0.2090710699558258,
-0.02786681428551674,
-0.0062124053947627544,
-0.053792547434568405,
-0.1358591765165329,
-0.040719173848629,
0.07760701328516006,
-0.03868928551673889,
0.10076594352722168,
-0.020596439018845558,
0.06219274550676346,
0.03891953080892563,
-0.028679950162768364,
-0.06350807845592499,
-0.028403859585523605,
0.19767926633358002,
0.07778263837099075,
-0.015819871798157692,
0.10682736337184906,
-0.004024678375571966,
0.05257108062505722,
0.02844390645623207,
0.20452792942523956,
0.20622849464416504,
0.003927762154489756,
0.07008476555347443,
0.0610373318195343,
-0.081707663834095,
-0.06703770905733109,
0.1800987273454666,
-0.028089160099625587,
0.07126680016517639,
-0.028992576524615288,
0.19055956602096558,
0.11199913918972015,
-0.15060488879680634,
0.03217744827270508,
-0.03163950517773628,
-0.07719893008470535,
-0.1400279700756073,
-0.00003246593041694723,
-0.09677910059690475,
-0.11819568276405334,
0.044692639261484146,
-0.11959689855575562,
0.055977094918489456,
0.08144115656614304,
0.012729410082101822,
0.033554792404174805,
0.1270459145307541,
-0.0266116876155138,
0.005007921252399683,
0.04050256311893463,
0.007429863326251507,
-0.029077183455228806,
-0.04190110042691231,
-0.076472207903862,
0.05006149038672447,
0.0049925558269023895,
0.08769430965185165,
-0.045916348695755005,
-0.01121510099619627,
0.04125478118658066,
-0.027634238824248314,
-0.07667393982410431,
0.025093629956245422,
0.0359552800655365,
0.05580293759703636,
0.045293137431144714,
0.04482175409793854,
-0.005873269401490688,
-0.03312015160918236,
0.2792045474052429,
-0.05852183699607849,
-0.09723641723394394,
-0.11406589299440384,
0.20558835566043854,
0.041309621185064316,
-0.029230251908302307,
0.03828871622681618,
-0.08341061323881149,
-0.010958656668663025,
0.15670554339885712,
0.15601760149002075,
-0.061655253171920776,
-0.024737555533647537,
-0.012449762783944607,
-0.01715129427611828,
-0.03968222066760063,
0.1148698702454567,
0.09543552249670029,
-0.00037061076727695763,
-0.053289905190467834,
-0.02655279077589512,
-0.03497457504272461,
-0.015773560851812363,
-0.0413997620344162,
0.02369437925517559,
0.015319230034947395,
-0.022915709763765335,
-0.03344276174902916,
0.064054474234581,
0.0004335845878813416,
-0.2422257512807846,
0.06087831035256386,
-0.14399217069149017,
-0.1688079833984375,
-0.025694184005260468,
0.04982120916247368,
-0.009578980505466461,
0.05035591125488281,
-0.023196810856461525,
-0.0042572845704853535,
0.08044683933258057,
-0.02000950649380684,
-0.056663963943719864,
-0.1256350874900818,
0.1118563860654831,
-0.060680121183395386,
0.17943356931209564,
-0.017097052186727524,
0.06879795342683792,
0.1177888959646225,
0.04248913750052452,
-0.13869883120059967,
0.047382060438394547,
0.046740297228097916,
-0.11311333626508713,
0.01909768022596836,
0.143819659948349,
-0.0461382158100605,
0.08452415466308594,
0.043377384543418884,
-0.09678639471530914,
-0.009414342232048512,
-0.04722646623849869,
-0.02647445909678936,
-0.07067012041807175,
-0.010659589432179928,
-0.06730809807777405,
0.1704106330871582,
0.19612157344818115,
-0.02513812854886055,
0.014188419096171856,
-0.09424353390932083,
0.028373003005981445,
0.06844151020050049,
0.03790729120373726,
-0.05177849158644676,
-0.208597794175148,
0.018687358126044273,
0.04827835410833359,
-0.0029817000031471252,
-0.2298031747341156,
-0.07720308750867844,
0.038524169474840164,
-0.03510774299502373,
-0.05626419186592102,
0.0953635647892952,
0.03642263263463974,
0.04787259176373482,
-0.036658406257629395,
-0.15045693516731262,
-0.03751304745674133,
0.15541107952594757,
-0.1791541874408722,
-0.04921049252152443
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-512-finetuned-squad-seed-2

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

Per the model name, this checkpoint is roberta-base fine-tuned for extractive question answering on a few-shot sample of the SQuAD training set with k = 512 examples; the "seed-2" suffix most likely identifies the sampled subset, as distinct from the training seed of 42 listed below.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
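
## How to use

A minimal usage sketch, not part of the auto-generated card: it loads the checkpoint from the Hugging Face Hub under the repository id recorded for this row, and the question/context strings are illustrative only.

```python
from transformers import pipeline

# Extractive question answering with the fine-tuned checkpoint.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-2",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of roberta-base on the squad dataset.",
)
print(result["answer"], round(result["score"], 3))
```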
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-512-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-512-finetuned-squad-seed-2
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07843504846096039,
0.09890100359916687,
-0.0026633245870471,
0.07685951888561249,
0.13119302690029144,
0.03437059372663498,
0.10744237899780273,
0.12305132299661636,
-0.12638461589813232,
0.06727360934019089,
0.09142734855413437,
0.08615692704916,
0.03369143232703209,
0.13191913068294525,
-0.03994287922978401,
-0.23312850296497345,
0.0072624776512384415,
-0.01839403621852398,
-0.06756372004747391,
0.10207972675561905,
0.0887027233839035,
-0.0971149355173111,
0.08061031997203827,
-0.0025683955755084753,
-0.18398691713809967,
0.027914930135011673,
-0.019479453563690186,
-0.04977231100201607,
0.09815342724323273,
-0.005142203997820616,
0.08325762301683426,
0.007983389310538769,
0.12045516073703766,
-0.19179628789424896,
0.013248596340417862,
0.07217386364936829,
0.03325579687952995,
0.09531685709953308,
0.021460017189383507,
-0.0027569865342229605,
0.16335812211036682,
-0.13415537774562836,
0.09896192699670792,
0.029261520132422447,
-0.08669300377368927,
-0.16807596385478973,
-0.09680072218179703,
0.015607587061822414,
0.03790850564837456,
0.08573353290557861,
0.009177629835903645,
0.17766538262367249,
-0.09229450672864914,
0.07989971339702606,
0.2374604493379593,
-0.2788220942020416,
-0.07894299924373627,
0.04332556203007698,
0.04930426925420761,
0.07874476909637451,
-0.1175745502114296,
-0.016053233295679092,
0.021112589165568352,
0.02916557900607586,
0.09180645644664764,
-0.025700930505990982,
-0.0840536430478096,
-0.007876994088292122,
-0.12003455311059952,
0.003764895023778081,
0.11227104812860489,
0.04401936009526253,
-0.05139582231640816,
-0.04816562309861183,
-0.05896327644586563,
-0.07165016978979111,
-0.034962452948093414,
-0.03581530600786209,
0.04372667893767357,
-0.05632169544696808,
-0.11931981891393661,
-0.03493525832891464,
-0.04886778071522713,
-0.07885079830884933,
-0.02103501930832863,
0.205043762922287,
0.04880216717720032,
0.03430313989520073,
-0.05232696607708931,
0.08242066949605942,
0.01013255026191473,
-0.12848982214927673,
-0.029550911858677864,
0.0033050784841179848,
-0.08136119693517685,
-0.03966118022799492,
-0.05479780212044716,
0.015254572033882141,
0.04235237091779709,
0.21331696212291718,
-0.052453070878982544,
0.08333688229322433,
0.03101765550673008,
-0.019101357087492943,
-0.020554672926664352,
0.125057190656662,
-0.018517939373850822,
-0.07776456326246262,
0.028358306735754013,
0.05828174576163292,
0.02553926780819893,
0.0033789444714784622,
-0.056685514748096466,
-0.03314739465713501,
0.08398009836673737,
0.0332644023001194,
-0.06096423789858818,
0.024225573986768723,
0.0010724900057539344,
-0.019788172096014023,
0.00502522150054574,
-0.11420871317386627,
0.017908673733472824,
-0.006164638791233301,
-0.07392662018537521,
-0.014786796644330025,
0.009088322520256042,
-0.014568036422133446,
0.011717542074620724,
0.09864307940006256,
-0.08799812197685242,
-0.02069598250091076,
-0.075926274061203,
-0.06973610073328018,
-0.0009187632822431624,
-0.14700917899608612,
0.012577036395668983,
-0.0730871707201004,
-0.15576690435409546,
-0.03235829249024391,
0.043284256011247635,
-0.07166814059019089,
-0.030051138252019882,
-0.03840911388397217,
-0.0757145881652832,
0.026142138987779617,
0.003144602756947279,
0.18890531361103058,
-0.052287667989730835,
0.08027426153421402,
0.0200066976249218,
0.05211939290165901,
-0.022395003587007523,
0.030480561777949333,
-0.08762484043836594,
0.004669526591897011,
-0.17045728862285614,
0.06181645765900612,
-0.07530390471220016,
0.01570662297308445,
-0.1300586760044098,
-0.0855921283364296,
-0.026268528774380684,
-0.029562532901763916,
0.08284252136945724,
0.1010073870420456,
-0.14035587012767792,
-0.0265034232288599,
0.11223814636468887,
-0.07929487526416779,
-0.05691417679190636,
0.06579922884702682,
-0.06690983474254608,
0.05434451624751091,
0.05665484443306923,
0.18320228159427643,
0.07401149719953537,
-0.11631535738706589,
-0.020060652866959572,
0.0024792158510535955,
0.02797732688486576,
-0.010668647475540638,
0.051938582211732864,
0.009759878739714622,
0.020506946370005608,
0.01570962555706501,
-0.03936801478266716,
0.002300615655258298,
-0.09798278659582138,
-0.06316373497247696,
-0.05707431584596634,
-0.08236473798751831,
-0.03543790802359581,
0.011885087937116623,
0.038411945104599,
-0.08014208823442459,
-0.08230318874120712,
0.08154401183128357,
0.1414262056350708,
-0.0417555570602417,
0.02152329310774803,
-0.07319742441177368,
0.01897464133799076,
-0.05424506589770317,
-0.031884659081697464,
-0.20443175733089447,
-0.06503146141767502,
0.033895332366228104,
-0.02811766415834427,
0.056184135377407074,
0.0060962834395468235,
0.07250922918319702,
0.03729439899325371,
-0.03699240833520889,
0.005737438797950745,
-0.08987122774124146,
-0.007138238754123449,
-0.09494351595640182,
-0.22241786122322083,
-0.03850850462913513,
-0.03110843151807785,
0.1262320876121521,
-0.16220565140247345,
-0.009168439544737339,
-0.03394562378525734,
0.11967819184064865,
0.03010842390358448,
-0.06252318620681763,
-0.015581009909510612,
0.02938765101134777,
0.001859569689258933,
-0.09170448035001755,
0.032151270657777786,
0.016631433740258217,
-0.07231792062520981,
-0.057221364229917526,
-0.11594654619693756,
0.0060200137086212635,
0.07673411071300507,
0.0639808177947998,
-0.09748511761426926,
0.005972332786768675,
-0.06495080143213272,
-0.03331686928868294,
-0.057985976338386536,
0.040841348469257355,
0.17700667679309845,
0.0059300572611391544,
0.10679072141647339,
-0.07970461994409561,
-0.073811374604702,
0.021461986005306244,
0.004637867212295532,
0.04291953891515732,
0.09119736403226852,
0.11297180503606796,
-0.12807697057724,
0.0646556094288826,
0.0838632583618164,
-0.06125027313828468,
0.12444476783275604,
-0.038493573665618896,
-0.07987849414348602,
-0.03592808172106743,
-0.017203085124492645,
-0.01416446641087532,
0.13735397160053253,
-0.05086372047662735,
0.02654190920293331,
0.02990121766924858,
0.03998835012316704,
0.019755663350224495,
-0.1539456844329834,
-0.0028324101585894823,
0.008523697033524513,
-0.043723709881305695,
-0.017883647233247757,
0.021158695220947266,
0.019966602325439453,
0.09830024838447571,
0.03390316665172577,
-0.016331521794199944,
-0.006765948608517647,
-0.004160432610660791,
-0.0529610849916935,
0.19017070531845093,
-0.09011979401111603,
-0.04228759557008743,
-0.07696134597063065,
0.0010051093995571136,
-0.03608006611466408,
-0.04211708530783653,
0.027483640238642693,
-0.08757512271404266,
-0.038306765258312225,
-0.0736498311161995,
-0.0007546435226686299,
-0.047283973544836044,
0.026876479387283325,
0.03185448423027992,
0.00446734856814146,
0.06229161471128464,
-0.13409273326396942,
0.004538349341601133,
-0.07380776107311249,
-0.10586827248334885,
0.017368046566843987,
0.06416201591491699,
0.09027687460184097,
0.05838054046034813,
-0.028805511072278023,
0.021872421726584435,
-0.03142615407705307,
0.25316014885902405,
-0.05563948675990105,
0.000004804436684935354,
0.10752246528863907,
0.02400369383394718,
0.046146050095558167,
0.09447448700666428,
0.03418325260281563,
-0.1021641194820404,
0.029934074729681015,
0.084447480738163,
-0.03446643054485321,
-0.239525705575943,
-0.005482450593262911,
-0.034588009119033813,
-0.11367825418710709,
0.08280111104249954,
0.050059106200933456,
-0.044685572385787964,
0.06372988224029541,
0.009502140805125237,
0.023524010553956032,
-0.049668002873659134,
0.09262307733297348,
0.09855670481920242,
0.07371119409799576,
0.10225388407707214,
-0.04734158515930176,
-0.02071092277765274,
0.07003672420978546,
-0.0032168086618185043,
0.2925523817539215,
-0.025226864963769913,
0.06738969683647156,
0.052461571991443634,
0.13841557502746582,
-0.021848833188414574,
0.03692513704299927,
0.007227722089737654,
-0.004354756325483322,
-0.02599889412522316,
-0.05738767236471176,
-0.02816225215792656,
-0.0007191959884949028,
-0.07656188309192657,
0.05550816282629967,
-0.06178414449095726,
0.06466628611087799,
0.020008036866784096,
0.2610921263694763,
-0.0007509373244829476,
-0.27866223454475403,
-0.07763733714818954,
-0.021164264529943466,
-0.03915294632315636,
-0.04401116445660591,
0.012264860793948174,
0.09839022904634476,
-0.10487576574087143,
0.0521928071975708,
-0.05729285627603531,
0.08109380304813385,
-0.027451634407043457,
-0.0049447668716311455,
0.02884751558303833,
0.18216979503631592,
-0.015446996316313744,
0.04941864311695099,
-0.19965411722660065,
0.21629741787910461,
0.018546627834439278,
0.13222230970859528,
-0.049505773931741714,
0.008230984210968018,
0.023368125781416893,
-0.0005848869332112372,
0.0758422389626503,
-0.004708610009402037,
-0.07692215591669083,
-0.126641184091568,
-0.07589340209960938,
0.07996679842472076,
0.14034993946552277,
-0.016360299661755562,
0.1020040363073349,
-0.049504879862070084,
0.01892189122736454,
0.03993893414735794,
-0.06947846710681915,
-0.15779082477092743,
-0.09520190209150314,
-0.016486983746290207,
0.03446927294135094,
-0.0943768247961998,
-0.047300104051828384,
-0.07564441114664078,
-0.007947489619255066,
0.11590294539928436,
0.023509327322244644,
-0.019481966271996498,
-0.1368068903684616,
0.08774664998054504,
0.14903944730758667,
-0.07359965890645981,
0.023641429841518402,
-0.007296026684343815,
0.06522727757692337,
0.042518001049757004,
-0.09482810646295547,
0.04799015820026398,
-0.0576951690018177,
-0.16175520420074463,
-0.046948447823524475,
0.0919746607542038,
0.07218698412179947,
0.04063795879483223,
-0.0037916514556854963,
0.04944407939910889,
-0.020321637392044067,
-0.0991666316986084,
0.013710536062717438,
0.03744226694107056,
0.04956431686878204,
0.03657462075352669,
-0.08293786644935608,
0.05963362008333206,
-0.03270604461431503,
-0.004887801129370928,
0.11451434344053268,
0.244334876537323,
-0.08943255245685577,
0.0879000797867775,
0.05636667087674141,
-0.06852345168590546,
-0.1433490812778473,
0.06273835897445679,
0.10601575672626495,
-0.0011164301540702581,
0.0581718385219574,
-0.1945929229259491,
0.14024248719215393,
0.11426464468240738,
-0.013197780586779118,
0.03784053772687912,
-0.27501970529556274,
-0.11837039887905121,
0.05774380639195442,
0.13161976635456085,
0.12225557118654251,
-0.1324080377817154,
-0.013800877146422863,
-0.01679726503789425,
-0.1259150505065918,
0.10681567341089249,
-0.11078514903783798,
0.13400261104106903,
-0.03386034816503525,
0.1101006343960762,
0.005207358859479427,
-0.02981402538716793,
0.10826802998781204,
0.04920048266649246,
0.09670481830835342,
-0.041359271854162216,
0.011102111078798771,
0.0599336214363575,
-0.048951953649520874,
0.01277061365544796,
-0.07135146856307983,
0.08326645940542221,
-0.12117306888103485,
-0.007021556608378887,
-0.07808162271976471,
0.050334446132183075,
-0.037295132875442505,
-0.05258772894740105,
-0.05315800756216049,
0.03626789525151253,
0.05566177889704704,
-0.03666007146239281,
0.05354103073477745,
0.00045690577826462686,
0.09095916152000427,
0.02424999698996544,
0.06880020350217819,
0.0009389863116666675,
-0.04870684817433357,
0.019546519964933395,
-0.009281067177653313,
0.060752175748348236,
-0.13877132534980774,
0.0064065540209412575,
0.1060328334569931,
0.05169794708490372,
0.09721477329730988,
0.04325014352798462,
-0.04792813956737518,
0.01286146230995655,
0.03736213222146034,
-0.11307737976312637,
-0.10321841388940811,
0.04869246855378151,
-0.03855516389012337,
-0.13883866369724274,
0.0462513342499733,
0.1148795485496521,
-0.048405442386865616,
-0.02341221459209919,
-0.018612122163176537,
0.007151058409363031,
-0.021488917991518974,
0.18538828194141388,
0.04159640893340111,
0.04145272448658943,
-0.1013261005282402,
0.1299784928560257,
0.02878180705010891,
-0.021023957058787346,
0.05811084434390068,
0.08520490676164627,
-0.09449413418769836,
0.002871688222512603,
0.0960283875465393,
0.17601794004440308,
-0.0714530497789383,
-0.013766765594482422,
-0.10384421795606613,
-0.07093688100576401,
0.0617079883813858,
0.1602444350719452,
0.05661633238196373,
-0.01822250708937645,
-0.04979856684803963,
0.04148462414741516,
-0.14003409445285797,
0.06188807263970375,
0.03277143836021423,
0.07019725441932678,
-0.0875573605298996,
0.0584552176296711,
0.007515620440244675,
0.005372277926653624,
-0.017307206988334656,
0.014173328876495361,
-0.09311060607433319,
-0.029453791677951813,
-0.07635967433452606,
0.011359103955328465,
-0.012332498095929623,
0.015659144148230553,
-0.009746940806508064,
-0.06888014078140259,
-0.06984706223011017,
0.03687255084514618,
-0.07745515555143356,
-0.05252053588628769,
0.011232517659664154,
0.041890569031238556,
-0.13309097290039062,
0.005380010232329369,
0.016483934596180916,
-0.08951064199209213,
0.08599353581666946,
0.08953014016151428,
0.027352506294846535,
0.03390682488679886,
-0.12978634238243103,
-0.03292255103588104,
0.014478967525064945,
0.002323180204257369,
0.06522875279188156,
-0.09649614989757538,
-0.004805820528417826,
-0.020874470472335815,
0.07692601531744003,
0.010364766232669353,
0.08356329053640366,
-0.13134872913360596,
0.009699663147330284,
-0.08430476486682892,
-0.045165129005908966,
-0.06579307466745377,
0.015210739336907864,
0.10050856322050095,
0.053829409182071686,
0.16328968107700348,
-0.07721708714962006,
0.019032318145036697,
-0.2087598294019699,
-0.02760697901248932,
-0.006077596452087164,
-0.05171367526054382,
-0.13632093369960785,
-0.040465474128723145,
0.07651441544294357,
-0.03862191364169121,
0.09914472699165344,
-0.02122485637664795,
0.06086914241313934,
0.03922772780060768,
-0.031539950519800186,
-0.062007881700992584,
-0.028612473979592323,
0.19607242941856384,
0.07786458730697632,
-0.015485957264900208,
0.10720865428447723,
-0.005645796190947294,
0.053134892135858536,
0.028450744226574898,
0.2058940827846527,
0.20624665915966034,
0.004588594660162926,
0.07014358788728714,
0.060953497886657715,
-0.08127540349960327,
-0.06918129324913025,
0.17852236330509186,
-0.027690034359693527,
0.07144784182310104,
-0.02802751027047634,
0.1886226385831833,
0.11310483515262604,
-0.15135331451892853,
0.03095255047082901,
-0.0316663421690464,
-0.07728557288646698,
-0.14129170775413513,
0.0007647775928489864,
-0.09748990833759308,
-0.11843287199735641,
0.04456670582294464,
-0.11956297606229782,
0.056450486183166504,
0.08039819449186325,
0.012070740573108196,
0.03438745066523552,
0.12471211701631546,
-0.02586541324853897,
0.004756101872771978,
0.039829473942518234,
0.007748124189674854,
-0.028339512646198273,
-0.0403270497918129,
-0.07710489630699158,
0.04991622269153595,
0.006820371840149164,
0.08744753152132034,
-0.044686637818813324,
-0.010008960962295532,
0.04077243432402611,
-0.02846873365342617,
-0.07699595391750336,
0.024670254439115524,
0.035826824605464935,
0.05599663406610489,
0.04435334727168083,
0.045089226216077805,
-0.005504407919943333,
-0.03270070627331734,
0.28057336807250977,
-0.05828377231955528,
-0.09642425179481506,
-0.11433204263448715,
0.20775221288204193,
0.03971951827406883,
-0.02836880460381508,
0.039389412850141525,
-0.08325253427028656,
-0.012681832537055016,
0.15493841469287872,
0.15535444021224976,
-0.0632348582148552,
-0.024565663188695908,
-0.012665444053709507,
-0.016869859769940376,
-0.03873591497540474,
0.1157582625746727,
0.09465215355157852,
0.0014761878410354257,
-0.05401862785220146,
-0.027126427739858627,
-0.036249272525310516,
-0.01512169186025858,
-0.041599780321121216,
0.024760711938142776,
0.014079096727073193,
-0.0216097142547369,
-0.03491721674799919,
0.06308808922767639,
0.0004393417329993099,
-0.2416013926267624,
0.06149608641862869,
-0.143407940864563,
-0.16928395628929138,
-0.024699587374925613,
0.050508711487054825,
-0.011457542888820171,
0.050421230494976044,
-0.02406732179224491,
-0.004679592791944742,
0.08093830943107605,
-0.02014489471912384,
-0.05705616995692253,
-0.12459968030452728,
0.11257316917181015,
-0.05995332449674606,
0.1806900054216385,
-0.01622704230248928,
0.06938843429088593,
0.11724782735109329,
0.043024662882089615,
-0.13967493176460266,
0.046175938099622726,
0.04693753272294998,
-0.11334069073200226,
0.018274830654263496,
0.1448100507259369,
-0.04642368108034134,
0.0865587517619133,
0.04486139863729477,
-0.096351757645607,
-0.011143670417368412,
-0.0470186285674572,
-0.02645209990441799,
-0.07004475593566895,
-0.012444966472685337,
-0.06850504875183105,
0.16941942274570465,
0.19544732570648193,
-0.025408929213881493,
0.014543074183166027,
-0.09373408555984497,
0.029200170189142227,
0.06844940036535263,
0.039492588490247726,
-0.050600506365299225,
-0.20869337022304535,
0.0184231735765934,
0.04918507859110832,
-0.002513215411454439,
-0.23104093968868256,
-0.07901611924171448,
0.039162080734968185,
-0.034003447741270065,
-0.056080613285303116,
0.09682940691709518,
0.03484969213604927,
0.04728572443127632,
-0.03634178638458252,
-0.1514756977558136,
-0.038053855299949646,
0.15477991104125977,
-0.17861749231815338,
-0.04937344044446945
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-512-finetuned-squad-seed-4

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

Per the model name, this checkpoint is roberta-base fine-tuned for extractive question answering on a few-shot sample of the SQuAD training set with k = 512 examples; the "seed-4" suffix most likely identifies the sampled subset, as distinct from the training seed of 42 listed below.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
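
## How to use

A minimal sketch, not part of the auto-generated card, showing a manual forward pass with the repository id recorded for this row; the question/context strings are illustrative only.

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What base model was fine-tuned?"
context = "This model is a fine-tuned version of roberta-base on the squad dataset."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the highest-scoring start/end token positions and decode the span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```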
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-512-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-512-finetuned-squad-seed-4
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07873482257127762,
0.09782089293003082,
-0.0026264761108905077,
0.07774025201797485,
0.13241396844387054,
0.03475649282336235,
0.10780248790979385,
0.12241142243146896,
-0.12641656398773193,
0.06687899678945541,
0.09151739627122879,
0.086184561252594,
0.03300252929329872,
0.13154271245002747,
-0.03982881084084511,
-0.2331240028142929,
0.006844268646091223,
-0.017806554213166237,
-0.06708002835512161,
0.10212523490190506,
0.08783932775259018,
-0.09768413007259369,
0.0811188668012619,
-0.002850441262125969,
-0.184867262840271,
0.028262292966246605,
-0.020128142088651657,
-0.04955274984240532,
0.098368339240551,
-0.0050194342620670795,
0.08361640572547913,
0.007233727257698774,
0.12026362121105194,
-0.1904817372560501,
0.01353044155985117,
0.07244978100061417,
0.03280908241868019,
0.0949053168296814,
0.020981954410672188,
-0.003156806342303753,
0.1621718853712082,
-0.13455404341220856,
0.09890559315681458,
0.029191235080361366,
-0.08696358650922775,
-0.16876661777496338,
-0.09629052132368088,
0.014300910755991936,
0.03693058341741562,
0.08668088912963867,
0.008844215422868729,
0.17712481319904327,
-0.0924520492553711,
0.08027343451976776,
0.23598486185073853,
-0.2794466018676758,
-0.07929588854312897,
0.043819326907396317,
0.04937197268009186,
0.07897918671369553,
-0.11809725314378738,
-0.015279512852430344,
0.021326303482055664,
0.0299964789301157,
0.09206975996494293,
-0.026003604754805565,
-0.08410212397575378,
-0.007439407054334879,
-0.11984618008136749,
0.005198419559746981,
0.11287254840135574,
0.04420178383588791,
-0.051015447825193405,
-0.04842434078454971,
-0.05825342610478401,
-0.07296594977378845,
-0.035363078117370605,
-0.0348438024520874,
0.043813012540340424,
-0.057179488241672516,
-0.11967424303293228,
-0.03336937725543976,
-0.048444170504808426,
-0.07778201252222061,
-0.021168744191527367,
0.20415239036083221,
0.04859711602330208,
0.034001097083091736,
-0.0520966574549675,
0.08167040348052979,
0.009795279242098331,
-0.1283378303050995,
-0.02857455238699913,
0.002771932166069746,
-0.08092629909515381,
-0.039575766772031784,
-0.05541021004319191,
0.016408884897828102,
0.042579732835292816,
0.2135729044675827,
-0.05249536409974098,
0.08398742973804474,
0.03157924488186836,
-0.019695810973644257,
-0.020523836836218834,
0.12408895045518875,
-0.019930876791477203,
-0.07925660163164139,
0.028405344113707542,
0.05829701945185661,
0.024915413931012154,
0.003143715439364314,
-0.056922852993011475,
-0.03228924423456192,
0.08271708339452744,
0.03282419592142105,
-0.06191194802522659,
0.0261724591255188,
0.0019868265371769667,
-0.019407257437705994,
0.004910332150757313,
-0.11400865018367767,
0.017614778131246567,
-0.006407279055565596,
-0.07405083626508713,
-0.013893074356019497,
0.008650301024317741,
-0.015381267294287682,
0.011400469578802586,
0.09919710457324982,
-0.08887358754873276,
-0.021322811022400856,
-0.07648257166147232,
-0.07073504477739334,
-0.001447812421247363,
-0.14696387946605682,
0.01353232841938734,
-0.07286833971738815,
-0.15605340898036957,
-0.033561889082193375,
0.04311765357851982,
-0.07146070897579193,
-0.028709014877676964,
-0.03841780498623848,
-0.07655005156993866,
0.02581026218831539,
0.0033112061209976673,
0.18945838510990143,
-0.05265795812010765,
0.07926607131958008,
0.02088683657348156,
0.05207445099949837,
-0.022921260446310043,
0.030634989961981773,
-0.0874309241771698,
0.004006264731287956,
-0.1709524244070053,
0.06109117344021797,
-0.07601076364517212,
0.017526859417557716,
-0.12894105911254883,
-0.08665986359119415,
-0.02540159784257412,
-0.02842782437801361,
0.0824853777885437,
0.10039431601762772,
-0.14015308022499084,
-0.026562556624412537,
0.11157221347093582,
-0.0780261754989624,
-0.05706212669610977,
0.06571750342845917,
-0.06720544397830963,
0.053318094462156296,
0.05702309310436249,
0.18344034254550934,
0.07401005178689957,
-0.115885429084301,
-0.020904503762722015,
0.0016071118880063295,
0.028290798887610435,
-0.011163728311657906,
0.05041416734457016,
0.010685504414141178,
0.020668910816311836,
0.01622285321354866,
-0.03814737871289253,
0.002279196633026004,
-0.09754915535449982,
-0.06270509213209152,
-0.05649220570921898,
-0.08222420513629913,
-0.03613380342721939,
0.0123783890157938,
0.03822452202439308,
-0.0810292661190033,
-0.08272337913513184,
0.08086361736059189,
0.1410733312368393,
-0.04221557453274727,
0.020427824929356575,
-0.07345689833164215,
0.019537609070539474,
-0.0541207492351532,
-0.031362585723400116,
-0.20416423678398132,
-0.06593798100948334,
0.032898545265197754,
-0.02613009698688984,
0.05635808780789375,
0.006289295386523008,
0.07306372374296188,
0.03750799596309662,
-0.037264756858348846,
0.005513923242688179,
-0.08866517245769501,
-0.0066338940523564816,
-0.09467236697673798,
-0.2228609174489975,
-0.038727790117263794,
-0.03093266487121582,
0.1262674480676651,
-0.16270741820335388,
-0.008807818405330181,
-0.03378321975469589,
0.11921107023954391,
0.029764413833618164,
-0.06198900192975998,
-0.014915965497493744,
0.03084491938352585,
0.002605454297736287,
-0.09170173108577728,
0.03269345313310623,
0.01631380245089531,
-0.07118470966815948,
-0.05849820375442505,
-0.11645971983671188,
0.007307701278477907,
0.0778423473238945,
0.06281027942895889,
-0.09795203059911728,
0.006295208353549242,
-0.06448552012443542,
-0.033145975321531296,
-0.05626412481069565,
0.040760587900877,
0.17801780998706818,
0.005427549593150616,
0.10669206827878952,
-0.07902304083108902,
-0.07343411445617676,
0.02162831462919712,
0.005743047688156366,
0.04402799531817436,
0.09118591994047165,
0.11228293925523758,
-0.12649081647396088,
0.06502880901098251,
0.08329130709171295,
-0.0615975558757782,
0.12517786026000977,
-0.03861379623413086,
-0.07928227633237839,
-0.035398930311203,
-0.018316222354769707,
-0.014496383257210255,
0.13801269233226776,
-0.05134419724345207,
0.025510616600513458,
0.029567379504442215,
0.039568617939949036,
0.02004869468510151,
-0.15297994017601013,
-0.0028036052826792,
0.00782637856900692,
-0.04271465912461281,
-0.01859356090426445,
0.021808341145515442,
0.01946067251265049,
0.09795348346233368,
0.03381161764264107,
-0.01732656918466091,
-0.006765891797840595,
-0.004080073442310095,
-0.05206676945090294,
0.19049857556819916,
-0.09060999006032944,
-0.041235972195863724,
-0.07672170549631119,
-0.0008274169522337615,
-0.03716427460312843,
-0.04253796860575676,
0.027186106890439987,
-0.08879038691520691,
-0.038673631846904755,
-0.07332520186901093,
-0.0013618969824165106,
-0.047195982187986374,
0.026680832728743553,
0.031220747157931328,
0.003965915646404028,
0.0617644339799881,
-0.1339528113603592,
0.004653644748032093,
-0.07401919364929199,
-0.10585208237171173,
0.017811406403779984,
0.06477472931146622,
0.09099813550710678,
0.05802586302161217,
-0.02880106307566166,
0.021927425637841225,
-0.03129861131310463,
0.25397026538848877,
-0.05583648756146431,
0.0002262004854856059,
0.10730265080928802,
0.024733806028962135,
0.04521781578660011,
0.09455809742212296,
0.035152144730091095,
-0.10254044830799103,
0.029624033719301224,
0.0849759578704834,
-0.034087274223566055,
-0.23946866393089294,
-0.005088073201477528,
-0.0350879542529583,
-0.11376360803842545,
0.08239661902189255,
0.05001061037182808,
-0.044080235064029694,
0.06428474932909012,
0.010312782600522041,
0.024229364469647408,
-0.050379008054733276,
0.09235288947820663,
0.1002068743109703,
0.0732589140534401,
0.10293780267238617,
-0.04766405373811722,
-0.02098073996603489,
0.06938303261995316,
-0.003085086354985833,
0.2932308614253998,
-0.025446390733122826,
0.06722290068864822,
0.052934639155864716,
0.13785098493099213,
-0.021266916766762733,
0.03749147057533264,
0.006619683932512999,
-0.004992810543626547,
-0.025964627042412758,
-0.05714935064315796,
-0.027166660875082016,
-0.0016963524976745248,
-0.07799804210662842,
0.0552431158721447,
-0.06175189092755318,
0.06323470175266266,
0.020399849861860275,
0.26037806272506714,
-0.0013921012869104743,
-0.28069621324539185,
-0.0781308263540268,
-0.02183404564857483,
-0.03867755085229874,
-0.04366932064294815,
0.012580404058098793,
0.09775137901306152,
-0.10439721494913101,
0.05284293368458748,
-0.05722611024975777,
0.08099043369293213,
-0.027490966022014618,
-0.003918907605111599,
0.030811434611678123,
0.18363916873931885,
-0.016394130885601044,
0.04895945265889168,
-0.20032748579978943,
0.21529355645179749,
0.0188146885484457,
0.13277682662010193,
-0.04966491833329201,
0.008043463341891766,
0.024200761690735817,
0.0007765482296235859,
0.07547596096992493,
-0.005307202693074942,
-0.07671121507883072,
-0.12673506140708923,
-0.07477737963199615,
0.0808839425444603,
0.13990692794322968,
-0.015311356633901596,
0.10264253616333008,
-0.048823822289705276,
0.018449721857905388,
0.039819635450839996,
-0.07063476741313934,
-0.1578332781791687,
-0.09587255120277405,
-0.01729041337966919,
0.03587379306554794,
-0.09405545145273209,
-0.04645292088389397,
-0.07592914253473282,
-0.009619170799851418,
0.11517959088087082,
0.024939807131886482,
-0.019558116793632507,
-0.13724949955940247,
0.08679850399494171,
0.1490546017885208,
-0.07274259626865387,
0.023839017376303673,
-0.006971451919525862,
0.06419119983911514,
0.0434243269264698,
-0.0946681797504425,
0.048217978328466415,
-0.05832492560148239,
-0.16052842140197754,
-0.04707708954811096,
0.09104350209236145,
0.07174529880285263,
0.0399310477077961,
-0.004316185601055622,
0.04949509724974632,
-0.021127188578248024,
-0.09977038949728012,
0.013832884840667248,
0.03596392646431923,
0.05033223703503609,
0.03654661774635315,
-0.08388865739107132,
0.05948837101459503,
-0.03208121284842491,
-0.004033774137496948,
0.11312926560640335,
0.24200080335140228,
-0.08901724219322205,
0.08588665723800659,
0.05643597990274429,
-0.06806609034538269,
-0.1430623084306717,
0.06406920403242111,
0.10451296716928482,
-0.0007803107728250325,
0.056671347469091415,
-0.19521492719650269,
0.14115522801876068,
0.11378078907728195,
-0.012188704684376717,
0.03962305933237076,
-0.2725788354873657,
-0.11765673756599426,
0.05796046182513237,
0.13222770392894745,
0.1232791319489479,
-0.13301768898963928,
-0.013135701417922974,
-0.016996046528220177,
-0.1256953477859497,
0.10626289993524551,
-0.1131768450140953,
0.13419212400913239,
-0.03425001725554466,
0.11065685003995895,
0.004536670166999102,
-0.029839864000678062,
0.1074104830622673,
0.05000993609428406,
0.09780263900756836,
-0.0418139211833477,
0.012041812762618065,
0.059946246445178986,
-0.048216890543699265,
0.012450549751520157,
-0.0721258670091629,
0.08274751156568527,
-0.12139631062746048,
-0.006861014757305384,
-0.0786423310637474,
0.050348229706287384,
-0.037146709859371185,
-0.05245867371559143,
-0.052557170391082764,
0.03642342984676361,
0.054696932435035706,
-0.036899447441101074,
0.052554868161678314,
-0.00045800674706697464,
0.09154252707958221,
0.023016633465886116,
0.0694224089384079,
-0.0005295440787449479,
-0.049115173518657684,
0.02080133743584156,
-0.0098204230889678,
0.06074654683470726,
-0.13843348622322083,
0.00515756756067276,
0.10662467777729034,
0.05103600025177002,
0.09679888933897018,
0.0440036915242672,
-0.047073472291231155,
0.01223226822912693,
0.03817399591207504,
-0.11339245736598969,
-0.10221899300813675,
0.04892556741833687,
-0.04156644269824028,
-0.13820797204971313,
0.04747443646192551,
0.1146462932229042,
-0.048206932842731476,
-0.023105314001441002,
-0.018539372831583023,
0.006463191006332636,
-0.021449286490678787,
0.1858399510383606,
0.042124100029468536,
0.040938008576631546,
-0.1022830531001091,
0.12964998185634613,
0.028671301901340485,
-0.021516254171729088,
0.05811215192079544,
0.08627171069383621,
-0.09534169733524323,
0.001967897405847907,
0.09526511281728745,
0.1776612251996994,
-0.07091882079839706,
-0.014428790658712387,
-0.10488762706518173,
-0.07129190862178802,
0.06160749867558479,
0.15986722707748413,
0.05638158321380615,
-0.019365999847650528,
-0.04997281730175018,
0.0410744771361351,
-0.14074675738811493,
0.061558134853839874,
0.03167468309402466,
0.07061393558979034,
-0.0875358134508133,
0.057712677866220474,
0.007389797829091549,
0.004634913522750139,
-0.017255274578928947,
0.014738397672772408,
-0.09370989352464676,
-0.029535632580518723,
-0.07624918967485428,
0.010440660640597343,
-0.012330141849815845,
0.016206840053200722,
-0.010055542923510075,
-0.06792432814836502,
-0.0704350620508194,
0.036948300898075104,
-0.07758297771215439,
-0.05218379572033882,
0.012898639775812626,
0.04238160327076912,
-0.13245666027069092,
0.005683474708348513,
0.015421036630868912,
-0.08914287388324738,
0.08570525795221329,
0.08931338042020798,
0.02703920751810074,
0.03413126617670059,
-0.1289956122636795,
-0.033418331295251846,
0.013894788920879364,
0.0017381019424647093,
0.06597800552845001,
-0.09544810652732849,
-0.004283459857106209,
-0.021172530949115753,
0.0773324966430664,
0.010559255257248878,
0.08269274234771729,
-0.1307293027639389,
0.010316651314496994,
-0.08345943689346313,
-0.0437752865254879,
-0.06639467924833298,
0.01554729975759983,
0.10129866749048233,
0.05337173864245415,
0.16325834393501282,
-0.07715630531311035,
0.018478186801075935,
-0.20875081419944763,
-0.027953919023275375,
-0.006198251619935036,
-0.053224191069602966,
-0.13579061627388,
-0.040493838489055634,
0.07711341977119446,
-0.03927598521113396,
0.10082028061151505,
-0.02109995298087597,
0.06097607687115669,
0.03881311044096947,
-0.03125034272670746,
-0.06308245658874512,
-0.02831214852631092,
0.1956324279308319,
0.0773269385099411,
-0.016040628775954247,
0.10698146373033524,
-0.004436241928488016,
0.05236751213669777,
0.029609082266688347,
0.20434701442718506,
0.20567552745342255,
0.0055219633504748344,
0.0700974091887474,
0.0614674873650074,
-0.08172169327735901,
-0.06783640384674072,
0.17942963540554047,
-0.027271592989563942,
0.0720047652721405,
-0.029036615043878555,
0.187421053647995,
0.11226259917020798,
-0.1503552943468094,
0.03130552917718887,
-0.032590679824352264,
-0.07730259746313095,
-0.1402568519115448,
0.001965827075764537,
-0.096903957426548,
-0.11894138902425766,
0.04459299519658089,
-0.12000393867492676,
0.055686432868242264,
0.08146528899669647,
0.012583981268107891,
0.03389742597937584,
0.12619929015636444,
-0.025132181122899055,
0.005474540404975414,
0.04004991054534912,
0.007408421952277422,
-0.0287534948438406,
-0.04037950187921524,
-0.07663407176733017,
0.04926460608839989,
0.005367759149521589,
0.08676958829164505,
-0.04512209817767143,
-0.010464781895279884,
0.04135526716709137,
-0.028145359829068184,
-0.07664594799280167,
0.02452409267425537,
0.0363956019282341,
0.05559337139129639,
0.04540259763598442,
0.044670309871435165,
-0.0063382904045283794,
-0.03275997191667557,
0.2796759009361267,
-0.0582742840051651,
-0.0954233855009079,
-0.11468193680047989,
0.20733949542045593,
0.03946898877620697,
-0.028877612203359604,
0.03823421150445938,
-0.08243624120950699,
-0.01169853936880827,
0.1560978889465332,
0.15721173584461212,
-0.06347447633743286,
-0.024788007140159607,
-0.01223737746477127,
-0.01686134934425354,
-0.03891654312610626,
0.11615355312824249,
0.09503060579299927,
0.0007161173853091896,
-0.05393558368086815,
-0.026983140036463737,
-0.03570152819156647,
-0.014885486103594303,
-0.04169439896941185,
0.024253306910395622,
0.01509858388453722,
-0.022050879895687103,
-0.03419715166091919,
0.06302865594625473,
-0.0002840989618562162,
-0.24160777032375336,
0.06211064010858536,
-0.1432521492242813,
-0.16951650381088257,
-0.02575008012354374,
0.05006488785147667,
-0.010702831670641899,
0.050943970680236816,
-0.024031126871705055,
-0.004904939793050289,
0.0815589502453804,
-0.02051246538758278,
-0.05611731857061386,
-0.12535220384597778,
0.11236067861318588,
-0.0603054016828537,
0.17971031367778778,
-0.016741931438446045,
0.06897002458572388,
0.11788447201251984,
0.04270818457007408,
-0.1393698751926422,
0.04671763256192207,
0.0463746003806591,
-0.11391070485115051,
0.018690137192606926,
0.14389772713184357,
-0.046282801777124405,
0.08594831079244614,
0.04381914436817169,
-0.09553846716880798,
-0.010262113064527512,
-0.04694288969039917,
-0.026397906243801117,
-0.07000892609357834,
-0.012392244301736355,
-0.06827151030302048,
0.1698199063539505,
0.19680964946746826,
-0.025223590433597565,
0.01442290935665369,
-0.09371763467788696,
0.02909708023071289,
0.06891906261444092,
0.038173213601112366,
-0.051376912742853165,
-0.20850539207458496,
0.018775740638375282,
0.05005907639861107,
-0.0028798473067581654,
-0.23188146948814392,
-0.07790304720401764,
0.03904306888580322,
-0.03380247950553894,
-0.05629006400704384,
0.09662376344203949,
0.03579806536436081,
0.04816724359989166,
-0.036789532750844955,
-0.14940766990184784,
-0.03801804408431053,
0.15464620292186737,
-0.17866529524326324,
-0.04997328668832779
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-512-finetuned-squad-seed-6

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

Per the model name, this checkpoint is roberta-base fine-tuned for extractive question answering on a few-shot sample of the SQuAD training set with k = 512 examples; the "seed-6" suffix most likely identifies the sampled subset, as distinct from the training seed of 42 listed below.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
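
## Reproducing the configuration

A sketch, not part of the auto-generated card, that restates the hyperparameters above as `TrainingArguments`. The output directory is a placeholder, and the Adam betas=(0.9,0.999) and epsilon=1e-08 listed above are the `Trainer` defaults, so they need no explicit arguments here.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-512-finetuned-squad-seed-6",  # placeholder
    learning_rate=3e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```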
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-512-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-512-finetuned-squad-seed-6
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.07868917286396027,
0.09754304587841034,
-0.0025933629367500544,
0.07748042792081833,
0.13259045779705048,
0.03426005691289902,
0.10760094225406647,
0.12263980507850647,
-0.12588730454444885,
0.06726237386465073,
0.09106683731079102,
0.08672957122325897,
0.03346852585673332,
0.13131965696811676,
-0.03968023508787155,
-0.23271837830543518,
0.00737078906968236,
-0.0176044013351202,
-0.06672320514917374,
0.1019921600818634,
0.0878804475069046,
-0.097921222448349,
0.08049743622541428,
-0.003174285404384136,
-0.18498846888542175,
0.028523076325654984,
-0.020400075241923332,
-0.04933782294392586,
0.09816444665193558,
-0.005436845123767853,
0.08320365846157074,
0.00744987465441227,
0.12005347013473511,
-0.19089408218860626,
0.01348885428160429,
0.0728008970618248,
0.032943245023489,
0.09506922960281372,
0.021904151886701584,
-0.002712701680138707,
0.16339434683322906,
-0.13406439125537872,
0.0986769050359726,
0.02959357760846615,
-0.08687444031238556,
-0.1672084927558899,
-0.09679137170314789,
0.01376092154532671,
0.037465453147888184,
0.08727391809225082,
0.008370881900191307,
0.17788726091384888,
-0.09303287416696548,
0.08030164241790771,
0.23744741082191467,
-0.278245210647583,
-0.07922153174877167,
0.04434783384203911,
0.04927738010883331,
0.0781230479478836,
-0.11888056248426437,
-0.016370700672268867,
0.021433066576719284,
0.02967027574777603,
0.09130322933197021,
-0.025391509756445885,
-0.08520299196243286,
-0.007808334659785032,
-0.12017025053501129,
0.004909573122859001,
0.11184751242399216,
0.044351331889629364,
-0.05061747506260872,
-0.048307426273822784,
-0.058574676513671875,
-0.0723673403263092,
-0.03503967449069023,
-0.03473290055990219,
0.043937452137470245,
-0.057036809623241425,
-0.11925089359283447,
-0.0332636833190918,
-0.048397473990917206,
-0.07864765077829361,
-0.02076353318989277,
0.20376300811767578,
0.04869403690099716,
0.0338481143116951,
-0.05260353162884712,
0.0815919041633606,
0.010113674215972424,
-0.12850306928157806,
-0.029412662610411644,
0.003433198668062687,
-0.08096630126237869,
-0.039687320590019226,
-0.055195700377225876,
0.01544828712940216,
0.04208239167928696,
0.21192091703414917,
-0.05230101943016052,
0.08421025425195694,
0.030914820730686188,
-0.01966397650539875,
-0.021137377247214317,
0.1241784542798996,
-0.01901301182806492,
-0.07745871692895889,
0.027753638103604317,
0.058473628014326096,
0.02461470663547516,
0.003533125389367342,
-0.056373145431280136,
-0.03242878243327141,
0.08331210166215897,
0.032494716346263885,
-0.06159188598394394,
0.025636838749051094,
0.001564681762829423,
-0.01960335485637188,
0.0055756596848368645,
-0.11389417201280594,
0.01754917949438095,
-0.006805673241615295,
-0.07432303577661514,
-0.014680810272693634,
0.008768977597355843,
-0.015619576908648014,
0.011142771691083908,
0.09924966096878052,
-0.08936591446399689,
-0.021283693611621857,
-0.07718529552221298,
-0.07043113559484482,
-0.001074237166903913,
-0.1484222710132599,
0.01319466345012188,
-0.07206892967224121,
-0.15623889863491058,
-0.03358762338757515,
0.042664170265197754,
-0.07149387151002884,
-0.028545431792736053,
-0.03892609104514122,
-0.0770048052072525,
0.02606690116226673,
0.003622043412178755,
0.19012700021266937,
-0.05231308937072754,
0.07989371567964554,
0.02083522081375122,
0.051952578127384186,
-0.022238556295633316,
0.031189551576972008,
-0.08825011551380157,
0.0038665623869746923,
-0.17028047144412994,
0.0612395741045475,
-0.07665938138961792,
0.017385095357894897,
-0.12964323163032532,
-0.08608449250459671,
-0.026543250307440758,
-0.02902502566576004,
0.08316056430339813,
0.10111743956804276,
-0.14006638526916504,
-0.026337917894124985,
0.11192864179611206,
-0.07878861576318741,
-0.05701916292309761,
0.06485973298549652,
-0.06693214923143387,
0.05302014946937561,
0.05655815824866295,
0.18343810737133026,
0.0734810084104538,
-0.1154358834028244,
-0.02203274331986904,
0.0011271685361862183,
0.028866611421108246,
-0.011749351397156715,
0.05041992664337158,
0.010485761798918247,
0.0214998722076416,
0.016175655648112297,
-0.03821214661002159,
0.0021638015750795603,
-0.09761704504489899,
-0.062106624245643616,
-0.05708301067352295,
-0.08231509476900101,
-0.03643856570124626,
0.013328145258128643,
0.038088828325271606,
-0.08069710433483124,
-0.08220764994621277,
0.08100711554288864,
0.14104090631008148,
-0.04172702133655548,
0.020637718960642815,
-0.0728931874036789,
0.018472250550985336,
-0.0549640953540802,
-0.031428005546331406,
-0.20507940649986267,
-0.06641972064971924,
0.033501651138067245,
-0.026429904624819756,
0.05624435096979141,
0.006087490823119879,
0.07287837564945221,
0.03685547411441803,
-0.03723815456032753,
0.00539372069761157,
-0.08951204270124435,
-0.007212374359369278,
-0.0949716717004776,
-0.22261923551559448,
-0.038946326822042465,
-0.0314544141292572,
0.12451981753110886,
-0.16230477392673492,
-0.008984488435089588,
-0.03377082943916321,
0.11960174888372421,
0.02974623814225197,
-0.06252336502075195,
-0.014910857193171978,
0.030209893360733986,
0.0022508481051772833,
-0.09203321486711502,
0.032571882009506226,
0.015725424513220787,
-0.07083262503147125,
-0.05839976668357849,
-0.11668307334184647,
0.005898990202695131,
0.07755468785762787,
0.06415314972400665,
-0.09797797352075577,
0.00642332062125206,
-0.06461725383996964,
-0.03351682052016258,
-0.05736643448472023,
0.04140012338757515,
0.17733712494373322,
0.005668249446898699,
0.1062847226858139,
-0.07944432646036148,
-0.07368334382772446,
0.022200072184205055,
0.005823894403874874,
0.04341283068060875,
0.09180708974599838,
0.11371342092752457,
-0.12788240611553192,
0.06565972417593002,
0.08386532217264175,
-0.06094538792967796,
0.12551039457321167,
-0.03877574950456619,
-0.07980413734912872,
-0.03439365699887276,
-0.01774478331208229,
-0.014368824660778046,
0.13761864602565765,
-0.051026858389377594,
0.02585071325302124,
0.02947334572672844,
0.03994099795818329,
0.020079180598258972,
-0.15325555205345154,
-0.0028774484526365995,
0.007672016974538565,
-0.042704708874225616,
-0.019007716327905655,
0.02171219326555729,
0.019398974254727364,
0.09806504845619202,
0.03349227085709572,
-0.01706514321267605,
-0.007152476813644171,
-0.004084675572812557,
-0.05216800421476364,
0.19118310511112213,
-0.09034180641174316,
-0.040323734283447266,
-0.07594654709100723,
-0.00010636400111252442,
-0.03696673363447189,
-0.042538005858659744,
0.02732793055474758,
-0.08974337577819824,
-0.038673486560583115,
-0.07329132407903671,
-0.001374785671941936,
-0.047312475740909576,
0.02595360577106476,
0.03077993169426918,
0.004215900786221027,
0.061232924461364746,
-0.13456954061985016,
0.0047080558724701405,
-0.074397012591362,
-0.10619012266397476,
0.017504394054412842,
0.06427010893821716,
0.0906580314040184,
0.05810993164777756,
-0.02896999381482601,
0.02179918810725212,
-0.03144167736172676,
0.2530093193054199,
-0.0559818372130394,
-0.00024598557502031326,
0.10746579617261887,
0.02558991126716137,
0.04580123350024223,
0.09462326765060425,
0.034461576491594315,
-0.10261823982000351,
0.030157634988427162,
0.0853499099612236,
-0.03435564786195755,
-0.2404426634311676,
-0.005087607074528933,
-0.03526775911450386,
-0.11404377222061157,
0.08275561034679413,
0.050262700766325,
-0.04422168433666229,
0.06434626877307892,
0.009389790706336498,
0.023290231823921204,
-0.05050419643521309,
0.09280621260404587,
0.09870944172143936,
0.0738927349448204,
0.10287295281887054,
-0.04764312878251076,
-0.020465446636080742,
0.06875087320804596,
-0.002991778776049614,
0.29438212513923645,
-0.025433940812945366,
0.06748626381158829,
0.05257757380604744,
0.13803084194660187,
-0.021725621074438095,
0.038073621690273285,
0.006500616669654846,
-0.005026575177907944,
-0.025875980034470558,
-0.05702947825193405,
-0.027374159544706345,
-0.0017371908761560917,
-0.07852527499198914,
0.05551835149526596,
-0.06181475892663002,
0.0635613352060318,
0.019562507048249245,
0.2609805464744568,
-0.001195582328364253,
-0.27995774149894714,
-0.07760334014892578,
-0.022029230371117592,
-0.03902502357959747,
-0.044270556420087814,
0.01235125306993723,
0.09751810133457184,
-0.10406723618507385,
0.05230169743299484,
-0.05755292996764183,
0.08163976669311523,
-0.026410870254039764,
-0.004576227627694607,
0.029885631054639816,
0.18372410535812378,
-0.016210146248340607,
0.04948147013783455,
-0.20078128576278687,
0.21701264381408691,
0.01871444657444954,
0.1326628178358078,
-0.049953438341617584,
0.008082916960120201,
0.023529769852757454,
-0.0010977439815178514,
0.07578808814287186,
-0.005212975200265646,
-0.07728924602270126,
-0.1263163685798645,
-0.0742838904261589,
0.08062773197889328,
0.1405564844608307,
-0.015663158148527145,
0.10212850570678711,
-0.04891839250922203,
0.018676089122891426,
0.040522169321775436,
-0.07031430304050446,
-0.15807479619979858,
-0.09601038694381714,
-0.01689932867884636,
0.03528421372175217,
-0.09482582658529282,
-0.04639079421758652,
-0.07583881169557571,
-0.008641080930829048,
0.11504659801721573,
0.0250173918902874,
-0.019204065203666687,
-0.1371205598115921,
0.08739922940731049,
0.14929215610027313,
-0.07304993271827698,
0.023830680176615715,
-0.007266001310199499,
0.064121313393116,
0.042612843215465546,
-0.09487196803092957,
0.04883824288845062,
-0.058332815766334534,
-0.16103169322013855,
-0.047002825886011124,
0.09086941182613373,
0.07211248576641083,
0.04019041731953621,
-0.00444234861060977,
0.049878329038619995,
-0.021082783117890358,
-0.09958291798830032,
0.014079281128942966,
0.03617946058511734,
0.050413694232702255,
0.03680377081036568,
-0.08350019156932831,
0.05861927196383476,
-0.03272897005081177,
-0.004578863736242056,
0.11286890506744385,
0.24299553036689758,
-0.0888775959610939,
0.08617500960826874,
0.056785427033901215,
-0.06834877282381058,
-0.1433205008506775,
0.06440453976392746,
0.10508133471012115,
-0.0011638945434242487,
0.05737420916557312,
-0.1952097862958908,
0.1412513107061386,
0.11377473920583725,
-0.012464255094528198,
0.038817841559648514,
-0.2729892134666443,
-0.11767926067113876,
0.05787098780274391,
0.13204289972782135,
0.12189431488513947,
-0.13269473612308502,
-0.01305528823286295,
-0.016667041927576065,
-0.125209242105484,
0.10697963088750839,
-0.11162508279085159,
0.1344815343618393,
-0.03457900881767273,
0.110673688352108,
0.00465795211493969,
-0.03022269532084465,
0.10645964741706848,
0.05020420625805855,
0.09785738587379456,
-0.04144483804702759,
0.01178855262696743,
0.06017787754535675,
-0.04835169389843941,
0.012640239670872688,
-0.0722382441163063,
0.08314747363328934,
-0.12059742212295532,
-0.006537002976983786,
-0.07910554111003876,
0.05062953010201454,
-0.037102553993463516,
-0.052328284829854965,
-0.05259961262345314,
0.03662003576755524,
0.05521288514137268,
-0.037095390260219574,
0.05360127240419388,
-0.00006589962868019938,
0.0928225964307785,
0.023350916802883148,
0.06970629841089249,
-0.00003229038338758983,
-0.04832262173295021,
0.020031463354825974,
-0.009165174327790737,
0.06083165854215622,
-0.1391855925321579,
0.004946357104927301,
0.10647857934236526,
0.051813915371894836,
0.09671130776405334,
0.04442662373185158,
-0.047687847167253494,
0.012259544804692268,
0.03738509491086006,
-0.11282513290643692,
-0.10294437408447266,
0.0490824393928051,
-0.03839116171002388,
-0.13887682557106018,
0.04764533415436745,
0.11338245123624802,
-0.049016792327165604,
-0.023215675726532936,
-0.01846463233232498,
0.00685887411236763,
-0.0209072045981884,
0.18648099899291992,
0.04210299625992775,
0.0414767749607563,
-0.10208434611558914,
0.1300916075706482,
0.028429226949810982,
-0.022122662514448166,
0.058241069316864014,
0.08598046004772186,
-0.09481163322925568,
0.0021567370276898146,
0.09631908684968948,
0.17644433677196503,
-0.07069822400808334,
-0.013279767706990242,
-0.1041104719042778,
-0.07095327973365784,
0.06193055212497711,
0.16074210405349731,
0.0562746524810791,
-0.019141066819429398,
-0.04978993535041809,
0.041456375271081924,
-0.1411615014076233,
0.06169811263680458,
0.03140180557966232,
0.07075180858373642,
-0.08718724548816681,
0.05651845037937164,
0.007623282261192799,
0.004852850455790758,
-0.01704302802681923,
0.015026979148387909,
-0.09340064227581024,
-0.029662955552339554,
-0.07433020323514938,
0.010337399318814278,
-0.012386651709675789,
0.01569271832704544,
-0.010460075922310352,
-0.06817274540662766,
-0.06963763386011124,
0.037053413689136505,
-0.07779163122177124,
-0.05265199393033981,
0.01234403531998396,
0.04211386665701866,
-0.13276784121990204,
0.005845266859978437,
0.01572694443166256,
-0.088922418653965,
0.08494327962398529,
0.08889181911945343,
0.027356106787919998,
0.034348659217357635,
-0.1305960714817047,
-0.03309754654765129,
0.014168376103043556,
0.001814161310903728,
0.06572054326534271,
-0.09505598247051239,
-0.004597218241542578,
-0.021214211359620094,
0.07739280164241791,
0.010189765132963657,
0.08197163045406342,
-0.1312842220067978,
0.009463238529860973,
-0.08443580567836761,
-0.044400252401828766,
-0.06583037227392197,
0.015825068578124046,
0.10143467783927917,
0.05349178984761238,
0.16312116384506226,
-0.07688865065574646,
0.019074777141213417,
-0.20890986919403076,
-0.027794642373919487,
-0.006099554244428873,
-0.05316944792866707,
-0.13614153861999512,
-0.03986174613237381,
0.07727496325969696,
-0.03908836841583252,
0.09870848804712296,
-0.021010195836424828,
0.06148236617445946,
0.038939476013183594,
-0.030856173485517502,
-0.0632474347949028,
-0.028485862538218498,
0.19576983153820038,
0.07720818370580673,
-0.015604819171130657,
0.1084083765745163,
-0.004430860746651888,
0.05228039249777794,
0.03058028593659401,
0.20582878589630127,
0.206467866897583,
0.0043068439699709415,
0.07035724073648453,
0.06139468401670456,
-0.08215869218111038,
-0.067807637155056,
0.17949004471302032,
-0.026702940464019775,
0.07188098877668381,
-0.029064668342471123,
0.18786399066448212,
0.11242078244686127,
-0.15024271607398987,
0.03169197589159012,
-0.032750025391578674,
-0.07700539380311966,
-0.14029768109321594,
0.002028742106631398,
-0.0968288779258728,
-0.11889887601137161,
0.045034363865852356,
-0.1203475072979927,
0.05615689978003502,
0.08175304532051086,
0.012498748488724232,
0.034006617963314056,
0.12680892646312714,
-0.0245736725628376,
0.005300790537148714,
0.040243279188871384,
0.007352979388087988,
-0.028861520811915398,
-0.04060766100883484,
-0.07650398463010788,
0.05038931965827942,
0.005410167388617992,
0.08697016537189484,
-0.04509243741631508,
-0.011106003075838089,
0.04108230024576187,
-0.027964599430561066,
-0.07682483643293381,
0.024687865749001503,
0.0365147665143013,
0.05561455339193344,
0.04597138240933418,
0.04451317340135574,
-0.005712391342967749,
-0.0328991673886776,
0.2806883752346039,
-0.058535922318696976,
-0.0949360653758049,
-0.11384100466966629,
0.20773723721504211,
0.04042337089776993,
-0.028888408094644547,
0.03821701928973198,
-0.08305978029966354,
-0.012011348269879818,
0.15480726957321167,
0.15537501871585846,
-0.06233556196093559,
-0.02457376755774021,
-0.012482554651796818,
-0.016850387677550316,
-0.038672830909490585,
0.11597785353660583,
0.0952877327799797,
0.0010954298777505755,
-0.05417391285300255,
-0.0269228033721447,
-0.035707391798496246,
-0.015446861274540424,
-0.040938593447208405,
0.024154674261808395,
0.015312429517507553,
-0.02233124151825905,
-0.03425855562090874,
0.06299925595521927,
-0.00014233437832444906,
-0.24108818173408508,
0.06143892556428909,
-0.14398959279060364,
-0.16909681260585785,
-0.025613248348236084,
0.04982205107808113,
-0.010751007124781609,
0.050951987504959106,
-0.02397707663476467,
-0.004362963140010834,
0.08003117889165878,
-0.02039811573922634,
-0.056532807648181915,
-0.12603959441184998,
0.11184203624725342,
-0.06067555770277977,
0.1800091713666916,
-0.016803037375211716,
0.06832306832075119,
0.11778191477060318,
0.04296485707163811,
-0.14004026353359222,
0.04649275541305542,
0.046561092138290405,
-0.11409783363342285,
0.018492717295885086,
0.14469853043556213,
-0.04619462415575981,
0.0860675498843193,
0.04347851872444153,
-0.0971231684088707,
-0.010051649995148182,
-0.04751291498541832,
-0.025744736194610596,
-0.07048777490854263,
-0.010991684161126614,
-0.06821534037590027,
0.16996276378631592,
0.19700507819652557,
-0.025205383077263832,
0.013822871260344982,
-0.09397270530462265,
0.028917549178004265,
0.06813377141952515,
0.03897114843130112,
-0.05104900151491165,
-0.20858344435691833,
0.018937652930617332,
0.04990173503756523,
-0.0030470890924334526,
-0.23174047470092773,
-0.07763858139514923,
0.039303701370954514,
-0.034752316772937775,
-0.056444790214300156,
0.09599867463111877,
0.03566616401076317,
0.04766756296157837,
-0.036812055855989456,
-0.15095533430576324,
-0.03789721801877022,
0.1550082266330719,
-0.17878127098083496,
-0.049615852534770966
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-512-finetuned-squad-seed-8
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
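## How to use

The card itself does not document usage, so the following is a minimal sketch rather than an official example: it assumes the checkpoint is available on the Hub under the model id shown in this card and loads it with the standard `transformers` question-answering pipeline. The question and context strings are illustrative placeholders.

```python
# Hedged usage sketch (not part of the auto-generated card).
# The model id is taken from this card; the inputs below are made up.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-8",
)

# SQuAD-style extractive QA: the answer is a span copied out of the context.
result = qa(
    question="What was the model fine-tuned on?",
    context="This model is a fine-tuned version of roberta-base on the squad dataset.",
)
print(result["answer"], result["score"])
```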
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-512-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-512-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-512-finetuned-squad-seed-8
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.0792461410164833,
0.0981430634856224,
-0.00260428199544549,
0.07796822488307953,
0.13280458748340607,
0.03473405912518501,
0.1068539172410965,
0.12244701385498047,
-0.12598086893558502,
0.06695561110973358,
0.09114725887775421,
0.0863460898399353,
0.03331783413887024,
0.13076475262641907,
-0.040054403245449066,
-0.23259328305721283,
0.006673464551568031,
-0.017484026029706,
-0.0654190257191658,
0.10158059746026993,
0.08791418373584747,
-0.09811598062515259,
0.08048839122056961,
-0.00357448379509151,
-0.18469485640525818,
0.028343435376882553,
-0.019980942830443382,
-0.04936714097857475,
0.09844478219747543,
-0.004898997023701668,
0.08322984725236893,
0.006665900349617004,
0.12035240978002548,
-0.19122977554798126,
0.013339986093342304,
0.07302526384592056,
0.03297562524676323,
0.09507022798061371,
0.020986665040254593,
-0.0019977744668722153,
0.16288384795188904,
-0.13446268439292908,
0.09865448623895645,
0.029353037476539612,
-0.08663630485534668,
-0.16748948395252228,
-0.09629636257886887,
0.014530923217535019,
0.03731626644730568,
0.08665062487125397,
0.009216981939971447,
0.17840763926506042,
-0.09205236285924911,
0.08089747279882431,
0.2378382533788681,
-0.27771034836769104,
-0.07860492169857025,
0.04386258125305176,
0.04911349341273308,
0.07930848747491837,
-0.11806266754865646,
-0.01625174470245838,
0.021254802122712135,
0.029442669823765755,
0.09172271192073822,
-0.02611580863595009,
-0.086864173412323,
-0.007883221842348576,
-0.1199621707201004,
0.004850609228014946,
0.11213931441307068,
0.044659897685050964,
-0.0507526732981205,
-0.047670938074588776,
-0.05901633948087692,
-0.07295148819684982,
-0.035518065094947815,
-0.0347701795399189,
0.04365503415465355,
-0.05688050016760826,
-0.11825789511203766,
-0.03352128341794014,
-0.04789421707391739,
-0.0778212696313858,
-0.020435329526662827,
0.20370304584503174,
0.048939842730760574,
0.03371789678931236,
-0.05189947783946991,
0.08134178817272186,
0.008481545373797417,
-0.12832501530647278,
-0.02896704524755478,
0.003912587184458971,
-0.08113322407007217,
-0.0399085208773613,
-0.05527792498469353,
0.014450127258896828,
0.04197251796722412,
0.21281351149082184,
-0.05198188126087189,
0.08405577391386032,
0.03113621287047863,
-0.019083021208643913,
-0.02109433338046074,
0.12498102337121964,
-0.019522404298186302,
-0.07860689610242844,
0.02876758761703968,
0.05818251520395279,
0.02516748197376728,
0.002976930234581232,
-0.05740135908126831,
-0.03281407058238983,
0.08341485261917114,
0.03218033164739609,
-0.0616471953690052,
0.025171682238578796,
0.0015172011917456985,
-0.019705673679709435,
0.005824941676110029,
-0.11391081660985947,
0.01777116395533085,
-0.0068246154114604,
-0.0742105096578598,
-0.014253702946007252,
0.009318685159087181,
-0.014927496202290058,
0.011117419227957726,
0.09870043396949768,
-0.08858808875083923,
-0.020699074491858482,
-0.07718422263860703,
-0.07064016908407211,
-0.0009695022599771619,
-0.14792563021183014,
0.013720816932618618,
-0.07302185893058777,
-0.15651428699493408,
-0.03345981612801552,
0.04277912527322769,
-0.07100586593151093,
-0.028382712975144386,
-0.0383199006319046,
-0.07615429162979126,
0.025606533512473106,
0.003432035678997636,
0.18915899097919464,
-0.05249478667974472,
0.07947996258735657,
0.020706528797745705,
0.051579173654317856,
-0.02343381755053997,
0.03141032159328461,
-0.08753585815429688,
0.0037728159222751856,
-0.1706235557794571,
0.061293892562389374,
-0.07617031037807465,
0.01797735132277012,
-0.12862280011177063,
-0.08589351922273636,
-0.02538827620446682,
-0.028589220717549324,
0.08358566462993622,
0.10016198456287384,
-0.14034980535507202,
-0.026042331010103226,
0.11132398247718811,
-0.07809486985206604,
-0.05706649273633957,
0.06620785593986511,
-0.06707248091697693,
0.05269978195428848,
0.057068996131420135,
0.1833069622516632,
0.07438716292381287,
-0.11513476073741913,
-0.020739682018756866,
0.0023112583439797163,
0.029464613646268845,
-0.012763142585754395,
0.05017197132110596,
0.010675378143787384,
0.021565275266766548,
0.016358723863959312,
-0.037431202828884125,
0.00273846834897995,
-0.09769098460674286,
-0.06239742413163185,
-0.057311393320560455,
-0.08234462887048721,
-0.03722352162003517,
0.01363102626055479,
0.03810661658644676,
-0.08049304783344269,
-0.08237021416425705,
0.08207498490810394,
0.14102457463741302,
-0.04193561151623726,
0.020817700773477554,
-0.07236094772815704,
0.018945693969726562,
-0.053876619786024094,
-0.03146306425333023,
-0.20478950440883636,
-0.06516537815332413,
0.03345838189125061,
-0.026378609240055084,
0.05594320595264435,
0.006331843789666891,
0.07199464738368988,
0.03729385510087013,
-0.036761265248060226,
0.00633959798142314,
-0.08885589241981506,
-0.006869616452604532,
-0.09588407725095749,
-0.22209520637989044,
-0.038912367075681686,
-0.031030263751745224,
0.12489176541566849,
-0.16338546574115753,
-0.00869300402700901,
-0.03471677377820015,
0.11905331909656525,
0.029284048825502396,
-0.062411896884441376,
-0.015160150825977325,
0.03091157041490078,
0.0021041675936430693,
-0.09224096685647964,
0.03261281177401543,
0.0162903293967247,
-0.07155740261077881,
-0.059494517743587494,
-0.1165703609585762,
0.00637639919295907,
0.07705666869878769,
0.06272592395544052,
-0.098062664270401,
0.005941334646195173,
-0.06411022692918777,
-0.03343755751848221,
-0.05675594508647919,
0.04068395867943764,
0.17798341810703278,
0.005822973791509867,
0.10753683000802994,
-0.07919810712337494,
-0.07322165369987488,
0.021963413804769516,
0.005805595777928829,
0.04426724836230278,
0.09158816933631897,
0.11355317384004593,
-0.1261293590068817,
0.06501112133264542,
0.08308517932891846,
-0.062102753669023514,
0.1245456337928772,
-0.038547489792108536,
-0.07944998890161514,
-0.03457055613398552,
-0.017931686714291573,
-0.014365222305059433,
0.13740180432796478,
-0.052212394773960114,
0.025187328457832336,
0.029568970203399658,
0.03954729810357094,
0.020248854532837868,
-0.15336373448371887,
-0.002871960401535034,
0.008288775570690632,
-0.04228599742054939,
-0.017857788130640984,
0.020969098433852196,
0.01915641315281391,
0.09776103496551514,
0.03326759487390518,
-0.017512084916234016,
-0.006683177314698696,
-0.004011610522866249,
-0.052303850650787354,
0.19078074395656586,
-0.09041278809309006,
-0.041081905364990234,
-0.07691603899002075,
-0.00008396415796596557,
-0.036974526941776276,
-0.04277842119336128,
0.02756589464843273,
-0.08831338584423065,
-0.03866874426603317,
-0.07368795573711395,
-0.00274142948910594,
-0.047210291028022766,
0.025726592168211937,
0.03088383376598358,
0.0038274743128567934,
0.06162528693675995,
-0.1344091296195984,
0.004619887564331293,
-0.07391533255577087,
-0.10567972809076309,
0.01776599884033203,
0.06442471593618393,
0.09097354859113693,
0.05883313715457916,
-0.029499413445591927,
0.021436244249343872,
-0.031026145443320274,
0.2527567148208618,
-0.05546370893716812,
0.00007079379429342225,
0.10751276463270187,
0.024642804637551308,
0.04600942134857178,
0.09406816959381104,
0.03489786386489868,
-0.10264062881469727,
0.029911432415246964,
0.08473750203847885,
-0.03471359238028526,
-0.23992112278938293,
-0.005091956816613674,
-0.035479072481393814,
-0.11400317400693893,
0.08239761739969254,
0.05022722855210304,
-0.043841127306222916,
0.06446048617362976,
0.009874294511973858,
0.02466760389506817,
-0.051373425871133804,
0.09248681366443634,
0.09918157756328583,
0.07347329705953598,
0.10260564088821411,
-0.04734627529978752,
-0.020262336358428,
0.069071464240551,
-0.0037948773242533207,
0.2930160164833069,
-0.02560492604970932,
0.0679447129368782,
0.05199875682592392,
0.1384715586900711,
-0.021896887570619583,
0.037561528384685516,
0.006169262807816267,
-0.005263802595436573,
-0.026065263897180557,
-0.05701975151896477,
-0.028225494548678398,
-0.0013152625178918242,
-0.0792284682393074,
0.056167878210544586,
-0.061779092997312546,
0.06417742371559143,
0.019266773015260696,
0.26082098484039307,
-0.0013879216276109219,
-0.2797555923461914,
-0.07771173864603043,
-0.022201597690582275,
-0.039179131388664246,
-0.044555895030498505,
0.012418235652148724,
0.09830204397439957,
-0.10389140248298645,
0.0515696182847023,
-0.05688522011041641,
0.08187387883663177,
-0.02742195315659046,
-0.004241445567458868,
0.029432697221636772,
0.18429115414619446,
-0.016106069087982178,
0.049585532397031784,
-0.20184163749217987,
0.21597206592559814,
0.018944889307022095,
0.13278767466545105,
-0.049861956387758255,
0.008479208685457706,
0.023552481085062027,
0.000540623557753861,
0.07495222240686417,
-0.004856682848185301,
-0.07638560980558395,
-0.12754108011722565,
-0.07451100647449493,
0.08056104183197021,
0.13960762321949005,
-0.014157162979245186,
0.10172008723020554,
-0.049176111817359924,
0.018739016726613045,
0.04057290405035019,
-0.06972623616456985,
-0.15807761251926422,
-0.0961543545126915,
-0.017386052757501602,
0.03617667779326439,
-0.09466646611690521,
-0.04623245447874069,
-0.07545457035303116,
-0.010166238062083721,
0.11528678238391876,
0.025599785149097443,
-0.01920417509973049,
-0.13706360757350922,
0.0882001668214798,
0.148540660738945,
-0.07327580451965332,
0.023941144347190857,
-0.007353347260504961,
0.06388535350561142,
0.042736295610666275,
-0.094136543571949,
0.04885251820087433,
-0.05819970369338989,
-0.16047485172748566,
-0.047284577041864395,
0.09013182669878006,
0.07207947224378586,
0.04030404984951019,
-0.004175737500190735,
0.049597494304180145,
-0.021428681910037994,
-0.09957528859376907,
0.012940441258251667,
0.03702443465590477,
0.049990780651569366,
0.037201616913080215,
-0.08376704901456833,
0.05937126278877258,
-0.03245977684855461,
-0.004851137287914753,
0.11389163136482239,
0.2426322102546692,
-0.08922199159860611,
0.08524972945451736,
0.057104598730802536,
-0.06852317601442337,
-0.14281192421913147,
0.06457576155662537,
0.1046760082244873,
-0.0010663785506039858,
0.05759837105870247,
-0.1946476399898529,
0.1415678858757019,
0.11483118683099747,
-0.012160257436335087,
0.03887609764933586,
-0.2730678617954254,
-0.11783973127603531,
0.058561213314533234,
0.13212460279464722,
0.12344632297754288,
-0.13334910571575165,
-0.013005988672375679,
-0.01728757657110691,
-0.1261400282382965,
0.1057802066206932,
-0.11110067367553711,
0.13431192934513092,
-0.034411992877721786,
0.10961828380823135,
0.004694555886089802,
-0.030247481539845467,
0.1067972332239151,
0.05082762986421585,
0.09795462340116501,
-0.041822806000709534,
0.012019405141472816,
0.05960352346301079,
-0.048512816429138184,
0.012933761812746525,
-0.07196033746004105,
0.0832558423280716,
-0.12220636755228043,
-0.006554919760674238,
-0.0780123919248581,
0.0507366769015789,
-0.0369255430996418,
-0.05233568698167801,
-0.052579011768102646,
0.035870883613824844,
0.0549972727894783,
-0.03689105436205864,
0.05439002811908722,
-0.00001708369563857559,
0.09226372092962265,
0.024081770330667496,
0.06852834671735764,
-0.002529972931370139,
-0.04827617108821869,
0.019986670464277267,
-0.00917284656316042,
0.06075596436858177,
-0.1391836702823639,
0.005461892578750849,
0.10682759433984756,
0.05195685848593712,
0.0972973182797432,
0.0434926375746727,
-0.047024138271808624,
0.012487035244703293,
0.03693735972046852,
-0.11236508935689926,
-0.10206993669271469,
0.04878958687186241,
-0.03910676762461662,
-0.13851729035377502,
0.04689858853816986,
0.11361989378929138,
-0.04969139024615288,
-0.022743916139006615,
-0.01876061223447323,
0.006004588212817907,
-0.021060260012745857,
0.18592138588428497,
0.04297516494989395,
0.04117932170629501,
-0.10198847204446793,
0.12968195974826813,
0.028776347637176514,
-0.0210026316344738,
0.05828919634222984,
0.08605565130710602,
-0.09490713477134705,
0.0019027612870559096,
0.09652639925479889,
0.17660987377166748,
-0.07141928374767303,
-0.014084828086197376,
-0.10451401770114899,
-0.07009252160787582,
0.061620671302080154,
0.16012941300868988,
0.056624624878168106,
-0.019487231969833374,
-0.049980029463768005,
0.041011545807123184,
-0.14119775593280792,
0.0615282766520977,
0.03131585195660591,
0.07111919671297073,
-0.08731598407030106,
0.0581827387213707,
0.008097441866993904,
0.004981767386198044,
-0.01695409044623375,
0.014814537949860096,
-0.09342501312494278,
-0.029408637434244156,
-0.07607832551002502,
0.010579902678728104,
-0.011994755826890469,
0.015993693843483925,
-0.010486924089491367,
-0.06809396296739578,
-0.06950001418590546,
0.03711354359984398,
-0.0773746445775032,
-0.05252854898571968,
0.01268517505377531,
0.042016398161649704,
-0.1321491003036499,
0.005497490055859089,
0.015307172201573849,
-0.08848338574171066,
0.08516770601272583,
0.08821527659893036,
0.027067426592111588,
0.03410058468580246,
-0.1312243491411209,
-0.032867174595594406,
0.014158181846141815,
0.0025179756339639425,
0.0659046620130539,
-0.09362222999334335,
-0.004087068140506744,
-0.02100924775004387,
0.07735014706850052,
0.009927931241691113,
0.08208368718624115,
-0.13146287202835083,
0.00960554089397192,
-0.08448424190282822,
-0.0444285087287426,
-0.06617109477519989,
0.015662845224142075,
0.10091574490070343,
0.05304059386253357,
0.16282396018505096,
-0.07677149772644043,
0.018895631656050682,
-0.20892132818698883,
-0.02802233025431633,
-0.005978250410407782,
-0.05300552770495415,
-0.13622739911079407,
-0.040864914655685425,
0.0771549791097641,
-0.03875334560871124,
0.10020321607589722,
-0.020709160715341568,
0.061762500554323196,
0.03867734223604202,
-0.03144500404596329,
-0.06306944787502289,
-0.028914805501699448,
0.19594159722328186,
0.07780808210372925,
-0.015421910211443901,
0.1077033057808876,
-0.004139773081988096,
0.053223516792058945,
0.030147500336170197,
0.20410820841789246,
0.20669801533222198,
0.003786899149417877,
0.07014550268650055,
0.0611160509288311,
-0.08183027058839798,
-0.06791575253009796,
0.1791197955608368,
-0.026887184008955956,
0.07252002507448196,
-0.029196497052907944,
0.1881898045539856,
0.11201924085617065,
-0.15009698271751404,
0.03153156861662865,
-0.03223475068807602,
-0.07724573463201523,
-0.14039158821105957,
0.0028290499467402697,
-0.09694938361644745,
-0.11879254877567291,
0.04446310177445412,
-0.11987441778182983,
0.05635147541761398,
0.08252722024917603,
0.012222562916576862,
0.03382690250873566,
0.12585657835006714,
-0.024425046518445015,
0.00551246851682663,
0.04024942219257355,
0.007247021421790123,
-0.02854788862168789,
-0.041703276336193085,
-0.07719128578901291,
0.05014114826917648,
0.005587213672697544,
0.0873907208442688,
-0.0452410951256752,
-0.009788135066628456,
0.04146827384829521,
-0.02783995307981968,
-0.07711190730333328,
0.024778403341770172,
0.035933803766965866,
0.055520474910736084,
0.04474176466464996,
0.04481130838394165,
-0.005675151944160461,
-0.03297426179051399,
0.27998265624046326,
-0.05808081105351448,
-0.09535212814807892,
-0.11352428793907166,
0.20701099932193756,
0.04060971364378929,
-0.028850089758634567,
0.038090094923973083,
-0.0831223800778389,
-0.011145330034196377,
0.1552988439798355,
0.15465447306632996,
-0.06335152685642242,
-0.024579407647252083,
-0.012223010882735252,
-0.017002133652567863,
-0.03952965885400772,
0.11646997928619385,
0.09555333107709885,
-0.00020974539802409708,
-0.05327143147587776,
-0.027134541422128677,
-0.03585589677095413,
-0.015378582291305065,
-0.04216255620121956,
0.02331061102449894,
0.01536552980542183,
-0.02173473872244358,
-0.03379238024353981,
0.06261900812387466,
0.0004424665530677885,
-0.24200507998466492,
0.06144201010465622,
-0.1438988894224167,
-0.16922685503959656,
-0.025712212547659874,
0.050026047974824905,
-0.010120484977960587,
0.05023922771215439,
-0.02396509051322937,
-0.0039061291608959436,
0.0808020755648613,
-0.020825551822781563,
-0.056601185351610184,
-0.125524640083313,
0.11146532744169235,
-0.0601494126021862,
0.1797783225774765,
-0.016769254580140114,
0.06862442940473557,
0.1175115704536438,
0.043152932077646255,
-0.13919338583946228,
0.046769071370363235,
0.04620291665196419,
-0.11355922371149063,
0.018479904159903526,
0.14350540935993195,
-0.04617798328399658,
0.08589667826890945,
0.04386648163199425,
-0.09664846956729889,
-0.010517296381294727,
-0.04837201535701752,
-0.026181410998106003,
-0.07021753489971161,
-0.012160218320786953,
-0.06795293837785721,
0.1701480746269226,
0.19667664170265198,
-0.02503673918545246,
0.01358733419328928,
-0.09416400641202927,
0.02842153050005436,
0.06845083832740784,
0.03921227157115936,
-0.05142676830291748,
-0.20849142968654633,
0.01949566975235939,
0.04960241541266441,
-0.0029054046608507633,
-0.23112310469150543,
-0.07800581306219101,
0.039052702486515045,
-0.03488864749670029,
-0.056164007633924484,
0.09612057358026505,
0.03597351163625717,
0.04786616563796997,
-0.03689930960536003,
-0.15134403109550476,
-0.03835725039243698,
0.15511398017406464,
-0.17865094542503357,
-0.04986125975847244
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-64-finetuned-squad-seed-0
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
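The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The snippet below is a hedged reconstruction of that configuration, not the author's actual training script: `output_dir` is a placeholder the card does not specify, and the surrounding `Trainer` setup (model, datasets, data collator) is omitted.

```python
# Sketch of the training configuration implied by the card's hyperparameters.
# output_dir is an assumed placeholder; the card does not name one.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-64-finetuned-squad-seed-0",
    learning_rate=3e-5,              # learning_rate: 3e-05
    per_device_train_batch_size=24,  # train_batch_size: 24
    per_device_eval_batch_size=24,   # eval_batch_size: 24
    seed=42,                         # seed: 42
    lr_scheduler_type="linear",      # lr_scheduler_type: linear
    warmup_ratio=0.1,                # lr_scheduler_warmup_ratio: 0.1
    max_steps=200,                   # training_steps: 200
)

# Trainer's default optimizer already uses betas=(0.9, 0.999) and
# epsilon=1e-08, matching the card, so no extra arguments are needed.
```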
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-64-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-64-finetuned-squad-seed-0
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08084115386009216,
0.08616447448730469,
-0.002476227469742298,
0.08399706333875656,
0.13642668724060059,
0.03282758221030235,
0.10022404044866562,
0.14015138149261475,
-0.11443416774272919,
0.04338802769780159,
0.089536152780056,
0.08236527442932129,
0.03114420548081398,
0.13653434813022614,
-0.03401406481862068,
-0.2532704472541809,
-0.006501068361103535,
-0.018913516774773598,
-0.09601575881242752,
0.10926149785518646,
0.09763915836811066,
-0.09995775669813156,
0.0725686177611351,
-0.014351662248373032,
-0.17909985780715942,
0.017583757638931274,
-0.012875339016318321,
-0.051831942051649094,
0.11816087365150452,
-0.008841908536851406,
0.0758061483502388,
0.009081326425075531,
0.12135124206542969,
-0.19019092619419098,
0.016227224841713905,
0.07620275765657425,
0.04593677073717117,
0.09753146022558212,
0.009006928652524948,
-0.012054920196533203,
0.13175977766513824,
-0.12770499289035797,
0.09950663894414902,
0.03218337893486023,
-0.09483525902032852,
-0.20983459055423737,
-0.09510024636983871,
0.0073756868951022625,
0.04296860843896866,
0.08425543457269669,
0.010181857272982597,
0.15970908105373383,
-0.09759171307086945,
0.0829043909907341,
0.22873015701770782,
-0.27398768067359924,
-0.07961772382259369,
0.0528373122215271,
0.0585443414747715,
0.08105386793613434,
-0.12720859050750732,
-0.009017369709908962,
0.006168450228869915,
0.02537553943693638,
0.10387679934501648,
-0.031737785786390305,
-0.08614738285541534,
0.0031171466689556837,
-0.1039346233010292,
0.006455323193222284,
0.11068582534790039,
0.03301984444260597,
-0.05265773460268974,
-0.07192424684762955,
-0.04338924586772919,
-0.05325326323509216,
-0.0343564972281456,
-0.018957026302814484,
0.038872428238391876,
-0.05834370106458664,
-0.14045870304107666,
-0.04877428337931633,
-0.04929640516638756,
-0.09016003459692001,
-0.007046690676361322,
0.2190982848405838,
0.03552686423063278,
0.03183601796627045,
-0.05092252790927887,
0.1029437929391861,
0.012103002518415451,
-0.12737539410591125,
-0.029950469732284546,
-0.005049725528806448,
-0.09119275212287903,
-0.03521941974759102,
-0.06004536896944046,
0.022007185965776443,
0.03552988916635513,
0.2169666737318039,
-0.04318215325474739,
0.07787945121526718,
0.03143937885761261,
-0.017209848389029503,
-0.02745962329208851,
0.13849952816963196,
-0.021623359993100166,
-0.07441864907741547,
0.012632962316274643,
0.0623677596449852,
0.01034509390592575,
-0.005420204252004623,
-0.0631774440407753,
-0.03968669846653938,
0.0646999254822731,
0.04672379791736603,
-0.06033266708254814,
0.030782198533415794,
-0.0070068128407001495,
-0.023515379056334496,
0.00007402092160191387,
-0.11488847434520721,
0.016246503219008446,
-0.008893469348549843,
-0.08151089400053024,
-0.046020299196243286,
0.008294877596199512,
-0.01165608037263155,
0.011575466953217983,
0.09654475003480911,
-0.07343742251396179,
-0.02480306103825569,
-0.08061236888170242,
-0.07493964582681656,
-0.01697329804301262,
-0.15560026466846466,
0.01920250430703163,
-0.06285751610994339,
-0.16151869297027588,
-0.030204808339476585,
0.05606450140476227,
-0.0819678083062172,
-0.026509558781981468,
-0.031840018928050995,
-0.07830621302127838,
0.021825341507792473,
0.0021339082159101963,
0.21670520305633545,
-0.04970600828528404,
0.08948573470115662,
0.01000701729208231,
0.056988075375556946,
-0.008970508351922035,
0.03652453050017357,
-0.08724682778120041,
0.00931687280535698,
-0.17540256679058075,
0.07654180377721786,
-0.0799344852566719,
0.016374951228499413,
-0.13972783088684082,
-0.08610105514526367,
-0.011262398213148117,
-0.02234620787203312,
0.08048854768276215,
0.10383967310190201,
-0.1418968290090561,
-0.022438060492277145,
0.11640813201665878,
-0.06470146775245667,
-0.05480789765715599,
0.05759914964437485,
-0.07625260204076767,
0.08467533439397812,
0.0537777803838253,
0.19254103302955627,
0.08617880940437317,
-0.10607343167066574,
0.013285485096275806,
0.013749913312494755,
0.034878455102443695,
0.003682096488773823,
0.05078426003456116,
0.00555915804579854,
0.028757596388459206,
0.016425199806690216,
-0.07326535880565643,
0.007785596884787083,
-0.0913204774260521,
-0.059359967708587646,
-0.04885760694742203,
-0.08381124585866928,
-0.006147865671664476,
0.013555503450334072,
0.03550088778138161,
-0.07991458475589752,
-0.08505357801914215,
0.07522962987422943,
0.13666830956935883,
-0.04785412922501564,
0.01751592382788658,
-0.07626473158597946,
-0.0019208170706406236,
-0.030965564772486687,
-0.023628100752830505,
-0.20323368906974792,
-0.05671992897987366,
0.03240683302283287,
-0.004785329103469849,
0.04522141441702843,
0.0002636863791849464,
0.08485507220029831,
0.027441365644335747,
-0.054785002022981644,
-0.0007473428850062191,
-0.08636406809091568,
-0.007877225987613201,
-0.09323114901781082,
-0.22030633687973022,
-0.05305168405175209,
-0.038890521973371506,
0.1377800554037094,
-0.1660013049840927,
-0.003824705956503749,
-0.017287110909819603,
0.11655783653259277,
0.043085768818855286,
-0.052220866084098816,
-0.004530041012912989,
0.029080532491207123,
0.01224367506802082,
-0.09811905771493912,
0.03855174779891968,
0.016096225008368492,
-0.09465532004833221,
-0.028909100219607353,
-0.10367792844772339,
-0.007596904411911964,
0.07103970646858215,
0.06981217861175537,
-0.1019257977604866,
-0.015468548983335495,
-0.06264512240886688,
-0.028534842655062675,
-0.05354989320039749,
0.037918124347925186,
0.18259544670581818,
0.01799379102885723,
0.11034667491912842,
-0.0744267925620079,
-0.08341147750616074,
0.01843085326254368,
0.009278431534767151,
0.06118616461753845,
0.1015729233622551,
0.07475258409976959,
-0.11113785952329636,
0.058791402727365494,
0.09109384566545486,
-0.05292591452598572,
0.1388387829065323,
-0.04643642157316208,
-0.07313232868909836,
-0.03193718194961548,
-0.008620340377092361,
-0.006133228074759245,
0.15033526718616486,
-0.0380704402923584,
0.01748069003224373,
0.03456132858991623,
0.03604875132441521,
0.00458131916821003,
-0.16119737923145294,
-0.014870439656078815,
0.013986659236252308,
-0.04798637330532074,
-0.022634319961071014,
0.015575080178678036,
0.016346585005521774,
0.09503716975450516,
0.041804179549217224,
-0.0027889625634998083,
0.007402033545076847,
-0.009600020945072174,
-0.04200899228453636,
0.20035135746002197,
-0.09157732129096985,
-0.043709613382816315,
-0.07696856558322906,
-0.0019758332055062056,
-0.023605622351169586,
-0.03973177447915077,
0.015350996516644955,
-0.09486784785985947,
-0.02615063451230526,
-0.07101210206747055,
0.0003111977130174637,
-0.042357299476861954,
0.015485693700611591,
0.00393249373883009,
0.014217848889529705,
0.05925252288579941,
-0.13259705901145935,
0.010508791543543339,
-0.06667965650558472,
-0.11129722744226456,
0.02898699790239334,
0.06051040440797806,
0.07965436577796936,
0.05566830933094025,
-0.032047245651483536,
0.01938813365995884,
-0.04188850149512291,
0.2306634932756424,
-0.07839635759592056,
0.011546026915311813,
0.12339777499437332,
0.027124417945742607,
0.03963276743888855,
0.10302294790744781,
0.033517852425575256,
-0.1002371683716774,
0.03861343115568161,
0.07726898789405823,
-0.04148337244987488,
-0.24351409077644348,
0.008109711110591888,
-0.03928208351135254,
-0.09535064548254013,
0.08956167101860046,
0.050752345472574234,
-0.043456919491291046,
0.0638023242354393,
0.00950943399220705,
0.01166667602956295,
-0.024619482457637787,
0.08882264792919159,
0.09197286516427994,
0.06601795554161072,
0.1080707535147667,
-0.03790943697094917,
-0.017017530277371407,
0.06485531479120255,
0.029139289632439613,
0.3034255802631378,
-0.04989905282855034,
0.08755222707986832,
0.04800379276275635,
0.13980388641357422,
-0.021546747535467148,
0.045261211693286896,
0.006713456008583307,
-0.005709828808903694,
-0.02947533130645752,
-0.054163482040166855,
-0.020969076082110405,
0.002381043042987585,
-0.07772788405418396,
0.04483845829963684,
-0.050266630947589874,
0.04834870994091034,
0.019604947417974472,
0.29075464606285095,
0.00047242347500286996,
-0.264508455991745,
-0.09528553485870361,
-0.01583917625248432,
-0.03698524460196495,
-0.052870236337184906,
0.010401119478046894,
0.12209783494472504,
-0.12283743172883987,
0.03777682036161423,
-0.07137677073478699,
0.08056817203760147,
-0.029171835631132126,
-0.002231539459899068,
0.0432942733168602,
0.17230811715126038,
-0.02139287255704403,
0.05523421987891197,
-0.2215903103351593,
0.22673439979553223,
0.01533450186252594,
0.1284988671541214,
-0.059157468378543854,
0.009328781627118587,
0.027096020057797432,
0.003971974365413189,
0.08789008110761642,
-0.00454065902158618,
-0.064003586769104,
-0.13570177555084229,
-0.05251627787947655,
0.07314502447843552,
0.14279019832611084,
-0.04298451170325279,
0.09691376984119415,
-0.05874554440379143,
0.014618953689932823,
0.036872319877147675,
-0.0886656790971756,
-0.1312158703804016,
-0.09919583797454834,
-0.02213364839553833,
0.015885552391409874,
-0.06975755840539932,
-0.05723537504673004,
-0.06891410797834396,
0.03274811431765556,
0.10670396685600281,
0.011310014873743057,
-0.03153330460190773,
-0.1481870412826538,
0.0799371674656868,
0.156485915184021,
-0.0661366879940033,
0.025840871036052704,
-0.004442022647708654,
0.07549452781677246,
0.039190370589494705,
-0.08315448462963104,
0.06036750599741936,
-0.06504710018634796,
-0.17883886396884918,
-0.048386845737695694,
0.09439641237258911,
0.06868281960487366,
0.04158569127321243,
-0.0035338771995157003,
0.05313766747713089,
-0.027057472616434097,
-0.09316831827163696,
0.02507340908050537,
0.02126806601881981,
0.03700964152812958,
0.03683391213417053,
-0.08523931354284286,
0.07616269588470459,
-0.036597270518541336,
-0.014465726912021637,
0.11349684745073318,
0.2309494912624359,
-0.1015014573931694,
0.09846341609954834,
0.06089917942881584,
-0.06099194288253784,
-0.16092248260974884,
0.07101133465766907,
0.1036067008972168,
0.008267841301858425,
0.06910276412963867,
-0.21271675825119019,
0.12863409519195557,
0.10412941873073578,
-0.0163552388548851,
0.042553823441267014,
-0.27037501335144043,
-0.11988995969295502,
0.04370603710412979,
0.12803801894187927,
0.09692436456680298,
-0.1259366124868393,
-0.014217507094144821,
-0.015508796088397503,
-0.11217353492975235,
0.09523402154445648,
-0.11497320234775543,
0.13653458654880524,
-0.02957729622721672,
0.11153525859117508,
0.010832605883479118,
-0.025841010734438896,
0.10339198261499405,
0.044618796557188034,
0.10228780657052994,
-0.04185744374990463,
0.006348795257508755,
0.06520810723304749,
-0.04777757078409195,
0.0021859980188310146,
-0.07789603620767593,
0.08763130754232407,
-0.13193006813526154,
-0.003974078223109245,
-0.09054741263389587,
0.04669325053691864,
-0.038161568343639374,
-0.06888160109519958,
-0.04182346165180206,
0.05637555569410324,
0.047605618834495544,
-0.03614635020494461,
0.04592180997133255,
-0.01888391561806202,
0.10297545790672302,
0.038794420659542084,
0.08574382960796356,
0.015156712383031845,
-0.046276550740003586,
0.023972459137439728,
-0.010058147832751274,
0.06502727419137955,
-0.1644134223461151,
0.01124068908393383,
0.09912151098251343,
0.06074849143624306,
0.09717310220003128,
0.044400688260793686,
-0.04504906013607979,
0.017758814617991447,
0.030261119827628136,
-0.10184267163276672,
-0.10659018158912659,
0.04694400727748871,
-0.027194326743483543,
-0.13895145058631897,
0.04870012030005455,
0.12157699465751648,
-0.04321354627609253,
-0.026567023247480392,
-0.01639825850725174,
0.004271442536264658,
-0.023098032921552658,
0.18201690912246704,
0.047897566109895706,
0.05488822981715202,
-0.10215632617473602,
0.12767745554447174,
0.02876127138733864,
-0.020534362643957138,
0.05274048447608948,
0.08324932307004929,
-0.10264552384614944,
-0.0029428587295114994,
0.08106842637062073,
0.1388712078332901,
-0.05874674767255783,
-0.006295311264693737,
-0.10411893576383591,
-0.08222296088933945,
0.05063188448548317,
0.14222219586372375,
0.0486140176653862,
-0.015564561821520329,
-0.05445732921361923,
0.0399162694811821,
-0.14080016314983368,
0.07083892077207565,
0.023485049605369568,
0.06315421313047409,
-0.07531823217868805,
0.059588272124528885,
0.007484589237719774,
0.011476012878119946,
-0.017286594957113266,
0.007675470318645239,
-0.09303735196590424,
-0.01608111523091793,
-0.07871957868337631,
-0.0016600239323452115,
0.0005613957182504237,
0.017051029950380325,
-0.02022477798163891,
-0.07140779495239258,
-0.04881158098578453,
0.037432555109262466,
-0.08702006191015244,
-0.050323691219091415,
0.009283234365284443,
0.04046313837170601,
-0.12415628135204315,
-0.00531876552850008,
0.021821897476911545,
-0.09345021098852158,
0.09572076797485352,
0.07406125962734222,
0.016810404136776924,
0.02930668368935585,
-0.12429476529359818,
-0.0335538387298584,
-0.010642113164067268,
-0.008497851900756359,
0.06145979091525078,
-0.09541831165552139,
-0.008214768953621387,
-0.03784385696053505,
0.06945815682411194,
0.01364758238196373,
0.071448914706707,
-0.1341419368982315,
0.01853412762284279,
-0.07541599869728088,
-0.046881053596735,
-0.07532983273267746,
0.03640938177704811,
0.09778448939323425,
0.060614489018917084,
0.1506892442703247,
-0.07693833112716675,
0.022952178493142128,
-0.20597022771835327,
-0.03432932496070862,
-0.005904071033000946,
-0.061282653361558914,
-0.15116766095161438,
-0.04687394201755524,
0.08086245507001877,
-0.03731151670217514,
0.0933956429362297,
-0.018786989152431488,
0.07531243562698364,
0.037698354572057724,
-0.04795749485492706,
-0.05024692788720131,
-0.016700763255357742,
0.2004425823688507,
0.073117695748806,
-0.016065474599599838,
0.11091076582670212,
0.0014301016926765442,
0.029545903205871582,
0.052831169217824936,
0.17922115325927734,
0.21368291974067688,
0.03238959610462189,
0.054675836116075516,
0.06333833187818527,
-0.07601292431354523,
-0.07210594415664673,
0.18005946278572083,
-0.015458828769624233,
0.07100570946931839,
-0.04957689344882965,
0.19775548577308655,
0.10937399417161942,
-0.1680237352848053,
0.045921340584754944,
-0.04257464036345482,
-0.08038588613271713,
-0.1251520812511444,
-0.01127143856137991,
-0.08612094074487686,
-0.1267487108707428,
0.03769994154572487,
-0.11666323244571686,
0.0537792332470417,
0.10615944117307663,
0.01349672582000494,
0.036930277943611145,
0.1266167163848877,
-0.020090533420443535,
0.0019073864677920938,
0.06419949978590012,
0.005562605336308479,
-0.013727199286222458,
-0.03658141940832138,
-0.08070334792137146,
0.05002051591873169,
0.0009315076749771833,
0.07941370457410812,
-0.04638521373271942,
-0.01451835036277771,
0.026466811075806618,
-0.028669185936450958,
-0.08034541457891464,
0.027520835399627686,
0.04131825268268585,
0.05586600303649902,
0.04902106523513794,
0.04500957950949669,
-0.009011751972138882,
-0.03345296159386635,
0.3188135325908661,
-0.06710439920425415,
-0.0983584076166153,
-0.12194644659757614,
0.22040650248527527,
0.02910125069320202,
-0.03134113550186157,
0.03337188810110092,
-0.08269770443439484,
-0.01004036981612444,
0.15793558955192566,
0.1667773574590683,
-0.07085441052913666,
-0.022794084623456,
-0.005115605890750885,
-0.018039673566818237,
-0.03677038475871086,
0.1267857849597931,
0.08932476490736008,
-0.023670131340622902,
-0.06266909837722778,
-0.013910145498812199,
-0.018532592803239822,
-0.03159844130277634,
-0.03980065509676933,
0.0418151319026947,
0.014448970556259155,
-0.02580803819000721,
-0.042529620230197906,
0.07318764179944992,
0.004951911512762308,
-0.2577698528766632,
0.07044921815395355,
-0.1581527590751648,
-0.1712045967578888,
-0.043651264160871506,
0.03611206263303757,
-0.0019693055655807257,
0.05705690383911133,
-0.01665583811700344,
0.008951709605753422,
0.07765592634677887,
-0.017792468890547752,
-0.03242640569806099,
-0.12435434013605118,
0.12362875789403915,
-0.06417413055896759,
0.17128241062164307,
-0.029032133519649506,
0.048632729798555374,
0.11624567210674286,
0.028819821774959564,
-0.1357285976409912,
0.04171330854296684,
0.05394359678030014,
-0.10640554130077362,
0.015093967318534851,
0.15298768877983093,
-0.046902429312467575,
0.09491703659296036,
0.04198741540312767,
-0.10813835263252258,
0.004148964304476976,
-0.056562576442956924,
-0.034593768417835236,
-0.08081343024969101,
-0.014645134098827839,
-0.060131944715976715,
0.1688317060470581,
0.21952196955680847,
-0.029561234638094902,
0.01098969578742981,
-0.10213189572095871,
0.01742687076330185,
0.07049316167831421,
0.027559902518987656,
-0.05749271437525749,
-0.18887926638126373,
0.011645575985312462,
0.06996715068817139,
-0.007501260377466679,
-0.2421528398990631,
-0.07593849301338196,
0.03948679193854332,
-0.035112638026475906,
-0.040430936962366104,
0.10349295288324356,
0.03960893675684929,
0.05205712094902992,
-0.0300099179148674,
-0.1606951504945755,
-0.030665306374430656,
0.15263132750988007,
-0.17486470937728882,
-0.035719528794288635
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-64-finetuned-squad-seed-10
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
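As with the other checkpoints in this series, usage is not spelled out in the card. A minimal sketch assuming the standard auto classes is shown below; only the model id comes from this card, while the question and context are invented for illustration.

```python
# Hedged usage sketch: manual load with the auto classes instead of a pipeline.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-10"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "How many training steps were used?"
context = "The model was trained for 200 steps with a linear schedule."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode that span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```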
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-64-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-64-finetuned-squad-seed-10
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08124256879091263,
0.08361920714378357,
-0.0024102525785565376,
0.0839153379201889,
0.13662581145763397,
0.03312721103429794,
0.10065598040819168,
0.13857686519622803,
-0.11324849724769592,
0.04256902635097504,
0.088520847260952,
0.08354952186346054,
0.03145357966423035,
0.13692599534988403,
-0.03340497240424156,
-0.2540235221385956,
-0.006525433622300625,
-0.01859956607222557,
-0.09468170255422592,
0.10913720726966858,
0.09786155819892883,
-0.100407175719738,
0.07220350950956345,
-0.014326602220535278,
-0.17928668856620789,
0.017798351123929024,
-0.012587822042405605,
-0.05071749910712242,
0.11847054958343506,
-0.010479746386408806,
0.07560277730226517,
0.008570430800318718,
0.1233585774898529,
-0.18856249749660492,
0.016275672242045403,
0.0757446140050888,
0.04596712067723274,
0.09706650674343109,
0.008675355464220047,
-0.010958925820887089,
0.1314920037984848,
-0.12837845087051392,
0.09985917061567307,
0.03149198368191719,
-0.09488599747419357,
-0.2085527926683426,
-0.09522676467895508,
0.0068060909397900105,
0.04443858563899994,
0.08327163755893707,
0.009897775016725063,
0.15987110137939453,
-0.09789451956748962,
0.08309212327003479,
0.2295546978712082,
-0.2752769887447357,
-0.07956735044717789,
0.054600976407527924,
0.06019707769155502,
0.08195691555738449,
-0.12686114013195038,
-0.009795628488063812,
0.006643427535891533,
0.02518540248274803,
0.10476656258106232,
-0.03219550848007202,
-0.08609051257371902,
0.0028147445991635323,
-0.1053486242890358,
0.008172184228897095,
0.10976526141166687,
0.033368345350027084,
-0.05236639082431793,
-0.07176163792610168,
-0.04417818784713745,
-0.05342067778110504,
-0.03562223166227341,
-0.019902052357792854,
0.03933621942996979,
-0.058528292924165726,
-0.13915328681468964,
-0.0484355129301548,
-0.04838723689317703,
-0.09063541144132614,
-0.00695960083976388,
0.21840795874595642,
0.035895805805921555,
0.03157477453351021,
-0.0507839061319828,
0.10208159685134888,
0.009274391457438469,
-0.12734998762607574,
-0.028737571090459824,
-0.004743869416415691,
-0.09234963357448578,
-0.03616293892264366,
-0.06062545254826546,
0.021752195432782173,
0.03479394316673279,
0.21677057445049286,
-0.04078365117311478,
0.07817500829696655,
0.032962728291749954,
-0.017470108345150948,
-0.02732199616730213,
0.13873928785324097,
-0.021961793303489685,
-0.07619490474462509,
0.012573694810271263,
0.0627126395702362,
0.010662980377674103,
-0.004478296730667353,
-0.06394344568252563,
-0.03982901945710182,
0.06519345939159393,
0.04633026570081711,
-0.0622628815472126,
0.03108028694987297,
-0.006632502190768719,
-0.023172924295067787,
0.0012430788483470678,
-0.11546406149864197,
0.016885902732610703,
-0.009405643679201603,
-0.08223610371351242,
-0.04554840922355652,
0.006830962840467691,
-0.010812775231897831,
0.01200071256607771,
0.09594368934631348,
-0.07366921752691269,
-0.024261655285954475,
-0.08130986988544464,
-0.07595369964838028,
-0.016918187960982323,
-0.1579001098871231,
0.01967712491750717,
-0.061768725514411926,
-0.16371288895606995,
-0.03067801520228386,
0.05537669360637665,
-0.08145249634981155,
-0.02644362486898899,
-0.03296300023794174,
-0.07846692949533463,
0.020630601793527603,
0.002969520166516304,
0.21801652014255524,
-0.04921841621398926,
0.08826642483472824,
0.009933275170624256,
0.058384720236063004,
-0.009840073063969612,
0.03658248484134674,
-0.08649526536464691,
0.009230797179043293,
-0.1763227880001068,
0.0760950893163681,
-0.08003533631563187,
0.018231602385640144,
-0.1389615833759308,
-0.08581431210041046,
-0.011970299296081066,
-0.021985264495015144,
0.08043971657752991,
0.1035056784749031,
-0.14461001753807068,
-0.021388530731201172,
0.11701604723930359,
-0.06460452824831009,
-0.05397357791662216,
0.0563771054148674,
-0.0765075534582138,
0.08533035963773727,
0.0551881343126297,
0.19249379634857178,
0.08614176511764526,
-0.10581822693347931,
0.013518575578927994,
0.013759922236204147,
0.034456025809049606,
0.0017876372439786792,
0.05024169385433197,
0.0063856011256575584,
0.029848147183656693,
0.016375070437788963,
-0.0718371570110321,
0.0071394476108253,
-0.09098224341869354,
-0.05928473174571991,
-0.04921384155750275,
-0.08464933186769485,
-0.006460277363657951,
0.013592858798801899,
0.03579792007803917,
-0.08038216829299927,
-0.0843578651547432,
0.0766027495265007,
0.13655753433704376,
-0.047660935670137405,
0.016843555495142937,
-0.07597901672124863,
-0.0023845792748034,
-0.03142385184764862,
-0.023795420303940773,
-0.20348325371742249,
-0.05577670782804489,
0.03276403620839119,
-0.0060102324932813644,
0.045348770916461945,
0.0026884526014328003,
0.08542387187480927,
0.02724584750831127,
-0.0540841743350029,
-0.00037096254527568817,
-0.08566106110811234,
-0.00773871224373579,
-0.09486029297113419,
-0.21989624202251434,
-0.05342242494225502,
-0.03903166577219963,
0.1368544101715088,
-0.16627271473407745,
-0.003539196215569973,
-0.019494399428367615,
0.11626217514276505,
0.04231381043791771,
-0.05193541198968887,
-0.004775773733854294,
0.02873167209327221,
0.013012600131332874,
-0.09825441986322403,
0.03874519467353821,
0.01496269553899765,
-0.09375560283660889,
-0.030673563480377197,
-0.10474955290555954,
-0.009295598603785038,
0.07055720686912537,
0.07039220631122589,
-0.10165635496377945,
-0.014529462903738022,
-0.06228422746062279,
-0.027880223467946053,
-0.05247707664966583,
0.037580449134111404,
0.18135198950767517,
0.01738123781979084,
0.10989993065595627,
-0.07465403527021408,
-0.0841744989156723,
0.018283061683177948,
0.010083195753395557,
0.06240251660346985,
0.1012958362698555,
0.07555238902568817,
-0.11094260960817337,
0.05810459703207016,
0.09299729019403458,
-0.05309676378965378,
0.13735277950763702,
-0.04654867574572563,
-0.07342494279146194,
-0.0318586491048336,
-0.009691604413092136,
-0.007049106992781162,
0.15002572536468506,
-0.038797929883003235,
0.016028646379709244,
0.033846426755189896,
0.03561927005648613,
0.004373079165816307,
-0.16192549467086792,
-0.01442745141685009,
0.013399459421634674,
-0.04725074768066406,
-0.02333594486117363,
0.015463585034012794,
0.015531539916992188,
0.09502077847719193,
0.040950145572423935,
-0.0040353131480515,
0.0066764880903065205,
-0.009729074314236641,
-0.04140605032444,
0.2004808783531189,
-0.09120193868875504,
-0.041414111852645874,
-0.0752943605184555,
-0.0023852090816944838,
-0.022708483040332794,
-0.03993744030594826,
0.014514456503093243,
-0.0949472039937973,
-0.026251045987010002,
-0.07086971402168274,
-0.00020583822333719581,
-0.04181379824876785,
0.015561833046376705,
0.005227339453995228,
0.014053351245820522,
0.058274708688259125,
-0.13283205032348633,
0.010366901755332947,
-0.06784497946500778,
-0.11152607202529907,
0.028306741267442703,
0.06058620661497116,
0.07913906127214432,
0.057088691741228104,
-0.032117728143930435,
0.019392607733607292,
-0.04148872569203377,
0.23130933940410614,
-0.07806772738695145,
0.013192923739552498,
0.123287133872509,
0.02731495164334774,
0.03971398249268532,
0.10399511456489563,
0.03356165066361427,
-0.10060309618711472,
0.038224734365940094,
0.07678157836198807,
-0.04087193310260773,
-0.24374991655349731,
0.008038858883082867,
-0.038899991661310196,
-0.09748734533786774,
0.0892840102314949,
0.05043792352080345,
-0.043248072266578674,
0.06475019454956055,
0.010271829552948475,
0.012640902772545815,
-0.024701345711946487,
0.08816511183977127,
0.09075465798377991,
0.06554742902517319,
0.10840686410665512,
-0.03833938390016556,
-0.01796974427998066,
0.06371387839317322,
0.02884739264845848,
0.3040688931941986,
-0.048212017863988876,
0.08640820533037186,
0.047610245645046234,
0.13908717036247253,
-0.02188171073794365,
0.0472068265080452,
0.007014777977019548,
-0.006588503252714872,
-0.029793286696076393,
-0.05405513569712639,
-0.019264718517661095,
0.0024834591895341873,
-0.07823919504880905,
0.04482724890112877,
-0.049490850418806076,
0.04790182039141655,
0.01924216002225876,
0.289089560508728,
0.0019077210454270244,
-0.26488059759140015,
-0.09392573684453964,
-0.015548672527074814,
-0.0380939245223999,
-0.05224353075027466,
0.010212427005171776,
0.12043098360300064,
-0.12213968485593796,
0.03870980441570282,
-0.07123000919818878,
0.08075720816850662,
-0.028368255123496056,
-0.001677773310802877,
0.04239312559366226,
0.17380087077617645,
-0.021114369854331017,
0.055237650871276855,
-0.2219470739364624,
0.22612795233726501,
0.015489301644265652,
0.1292957365512848,
-0.059845227748155594,
0.008896342478692532,
0.027235420420765877,
0.0037043490447103977,
0.08797552436590195,
-0.004532530438154936,
-0.06486135721206665,
-0.1356661319732666,
-0.05239414796233177,
0.0731506198644638,
0.14343403279781342,
-0.04131666570901871,
0.0971461609005928,
-0.058071035891771317,
0.013805555179715157,
0.03728713467717171,
-0.0895300805568695,
-0.1325055956840515,
-0.0981907993555069,
-0.02284618280827999,
0.01520394068211317,
-0.07134118676185608,
-0.05651069059967995,
-0.06900887191295624,
0.030250465497374535,
0.10505791753530502,
0.01349662709981203,
-0.031494371592998505,
-0.14754655957221985,
0.08011171966791153,
0.15689823031425476,
-0.06592341512441635,
0.026603106409311295,
-0.003915619570761919,
0.07564844936132431,
0.039211712777614594,
-0.08311144262552261,
0.06058848276734352,
-0.06495895236730576,
-0.17807330191135406,
-0.04785975068807602,
0.09521365910768509,
0.06928326934576035,
0.04162691906094551,
-0.0023867604322731495,
0.05324413999915123,
-0.027383113279938698,
-0.09339454770088196,
0.02436722442507744,
0.02125769481062889,
0.03609085828065872,
0.03691211715340614,
-0.08621759712696075,
0.0749935656785965,
-0.03642649948596954,
-0.012794851325452328,
0.11310412734746933,
0.22878551483154297,
-0.10140793770551682,
0.09776172041893005,
0.06029406189918518,
-0.061191216111183167,
-0.16064295172691345,
0.07198985666036606,
0.10364221036434174,
0.008366088382899761,
0.0695977434515953,
-0.21120157837867737,
0.12997309863567352,
0.10404784232378006,
-0.015934452414512634,
0.04217399284243584,
-0.270890474319458,
-0.11976972967386246,
0.044246282428503036,
0.1286233365535736,
0.09765106439590454,
-0.1255744993686676,
-0.01381378062069416,
-0.016637714579701424,
-0.1121644452214241,
0.0954446792602539,
-0.11599766463041306,
0.13622000813484192,
-0.029490483924746513,
0.1111879050731659,
0.01061266753822565,
-0.025970803573727608,
0.10204877704381943,
0.04665638133883476,
0.10289303958415985,
-0.042126089334487915,
0.005325256381183863,
0.06676941365003586,
-0.047244928777217865,
0.0029875857289880514,
-0.07723837345838547,
0.08706381171941757,
-0.1306517869234085,
-0.003852222114801407,
-0.0900094136595726,
0.04604554921388626,
-0.03784363344311714,
-0.06845898181200027,
-0.041693855077028275,
0.05601486936211586,
0.04699964076280594,
-0.036185480654239655,
0.04448020085692406,
-0.017700379714369774,
0.10239191353321075,
0.03544206917285919,
0.08654669672250748,
0.01428946666419506,
-0.04501043260097504,
0.024500258266925812,
-0.009728649631142616,
0.06412461400032043,
-0.16494452953338623,
0.010715382173657417,
0.09926258027553558,
0.06075708568096161,
0.09689855575561523,
0.044439997524023056,
-0.04503829404711723,
0.01709817722439766,
0.030073486268520355,
-0.10052083432674408,
-0.10808464139699936,
0.047902561724185944,
-0.028229104354977608,
-0.13886688649654388,
0.050521232187747955,
0.11999325454235077,
-0.04332979395985603,
-0.02715958096086979,
-0.017182333394885063,
0.003910236060619354,
-0.02319258078932762,
0.18324516713619232,
0.04872441291809082,
0.054704222828149796,
-0.10274919122457504,
0.12693902850151062,
0.02845899574458599,
-0.018983667716383934,
0.051902931183576584,
0.08410773426294327,
-0.10342874377965927,
-0.0029085171408951283,
0.08262437582015991,
0.1405879557132721,
-0.057164572179317474,
-0.005997546948492527,
-0.10417286306619644,
-0.08273010700941086,
0.05071476474404335,
0.14336414635181427,
0.04932302609086037,
-0.016843212768435478,
-0.05426898971199989,
0.0402897484600544,
-0.14044983685016632,
0.07018402218818665,
0.02361788973212242,
0.06340385973453522,
-0.07510089874267578,
0.06017449125647545,
0.0074889627285301685,
0.011835461482405663,
-0.017363140359520912,
0.008917470462620258,
-0.09289220720529556,
-0.016825150698423386,
-0.07840999960899353,
-0.003184867324307561,
-0.00015899060235824436,
0.01695364899933338,
-0.02046210877597332,
-0.07144472002983093,
-0.04874458536505699,
0.03685129061341286,
-0.08717361837625504,
-0.05043157562613487,
0.009037278592586517,
0.039314016699790955,
-0.1236157938838005,
-0.005387564189732075,
0.021035362035036087,
-0.09265805780887604,
0.09503119438886642,
0.07324092090129852,
0.017516721040010452,
0.030502479523420334,
-0.1250636875629425,
-0.03341894596815109,
-0.009898727759718895,
-0.008093299344182014,
0.06200344115495682,
-0.09442977607250214,
-0.008293169550597668,
-0.037449948489665985,
0.07147156447172165,
0.013026868924498558,
0.06930190324783325,
-0.13314183056354523,
0.018839554861187935,
-0.07663164287805557,
-0.046958282589912415,
-0.07551176100969315,
0.03603556752204895,
0.09719021618366241,
0.05984669178724289,
0.15133850276470184,
-0.07589743286371231,
0.022861149162054062,
-0.20666693150997162,
-0.034555114805698395,
-0.006321310997009277,
-0.06182999163866043,
-0.15078914165496826,
-0.04764750972390175,
0.08125496655702591,
-0.03732573986053467,
0.09492360055446625,
-0.018112510442733765,
0.07592424005270004,
0.037107016891241074,
-0.04413069412112236,
-0.050596702843904495,
-0.016309574246406555,
0.20042283833026886,
0.07309699803590775,
-0.016151588410139084,
0.1101899966597557,
0.0022691073827445507,
0.029787926003336906,
0.05073447525501251,
0.1790723353624344,
0.21335668861865997,
0.03110429272055626,
0.054738011211156845,
0.06422160565853119,
-0.07582678645849228,
-0.07063248753547668,
0.18185459077358246,
-0.01649911142885685,
0.06959061324596405,
-0.04952600598335266,
0.20034581422805786,
0.10817506164312363,
-0.16773806512355804,
0.04612414911389351,
-0.04210049286484718,
-0.0810389593243599,
-0.12479974329471588,
-0.011907272972166538,
-0.08669020980596542,
-0.12633851170539856,
0.03772556036710739,
-0.11640577018260956,
0.05351199954748154,
0.1067989319562912,
0.01336744800209999,
0.03644949942827225,
0.12726013362407684,
-0.02040959522128105,
0.002711878390982747,
0.06399351358413696,
0.005419216584414244,
-0.01340066734701395,
-0.03627417981624603,
-0.07991038262844086,
0.04997621476650238,
0.00046462559839710593,
0.07920041680335999,
-0.04767753556370735,
-0.015673398971557617,
0.02593536674976349,
-0.027981674298644066,
-0.08000263571739197,
0.027742743492126465,
0.04103543981909752,
0.05536014959216118,
0.04775029793381691,
0.04558362811803818,
-0.009329751133918762,
-0.033800553530454636,
0.3169286251068115,
-0.0670500174164772,
-0.09989555180072784,
-0.12106867879629135,
0.21891242265701294,
0.029471108689904213,
-0.031129708513617516,
0.03357616811990738,
-0.08252155035734177,
-0.00885495264083147,
0.15884965658187866,
0.16681982576847076,
-0.07089382410049438,
-0.02299613691866398,
-0.005308352876454592,
-0.018509484827518463,
-0.037338145077228546,
0.1270550638437271,
0.08966196328401566,
-0.02534617856144905,
-0.06239088997244835,
-0.013700142502784729,
-0.01806808076798916,
-0.031941965222358704,
-0.04056227207183838,
0.04058371111750603,
0.015618574805557728,
-0.02585647441446781,
-0.04094487428665161,
0.0741029754281044,
0.006374027580022812,
-0.257528156042099,
0.06857362389564514,
-0.15734489262104034,
-0.17132064700126648,
-0.04385589808225632,
0.03648846596479416,
-0.0006151308189146221,
0.0570148266851902,
-0.016965419054031372,
0.009135225787758827,
0.07719740271568298,
-0.017802171409130096,
-0.032787565141916275,
-0.12475637346506119,
0.12393448501825333,
-0.06649889051914215,
0.17016847431659698,
-0.028854185715317726,
0.04961515590548515,
0.11604945361614227,
0.02778763696551323,
-0.13494330644607544,
0.04250095412135124,
0.05340612679719925,
-0.10584387183189392,
0.01608515903353691,
0.1525295078754425,
-0.04643024504184723,
0.09153657406568527,
0.041395291686058044,
-0.10826695710420609,
0.004706867504864931,
-0.05564899370074272,
-0.03487788140773773,
-0.08119408041238785,
-0.013338950462639332,
-0.06009820103645325,
0.16920579969882965,
0.219399094581604,
-0.02949412912130356,
0.011671352200210094,
-0.10250170528888702,
0.016626540571451187,
0.07046382129192352,
0.027569659054279327,
-0.058033064007759094,
-0.18935807049274445,
0.01124462392181158,
0.06879571080207825,
-0.007424946408718824,
-0.24002718925476074,
-0.0752226859331131,
0.037990596145391464,
-0.03604154661297798,
-0.04077322408556938,
0.10245007276535034,
0.04072028771042824,
0.05219811946153641,
-0.029804695397615433,
-0.161386176943779,
-0.030821828171610832,
0.15325458347797394,
-0.17551293969154358,
-0.035309817641973495
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-64-finetuned-squad-seed-2
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
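For readers who want to reproduce this configuration, the hyperparameters above map onto `TrainingArguments` roughly as sketched below. This is a hedged sketch, not the original training script: the output directory is a hypothetical name, and dataset preprocessing is omitted.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; "roberta-squad-k64-seed2" is a
# hypothetical output directory, not part of the original card.
training_args = TrainingArguments(
    output_dir="roberta-squad-k64-seed2",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,             # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,          # epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.1,           # lr_scheduler_warmup_ratio: 0.1
    max_steps=200,              # training_steps: 200
)
```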
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-64-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-64-finetuned-squad-seed-2
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08023901283740997,
0.08532371371984482,
-0.002517696702852845,
0.0840517207980156,
0.1357480138540268,
0.03348245844244957,
0.10059090703725815,
0.13943132758140564,
-0.11397736519575119,
0.042848508805036545,
0.08892413973808289,
0.08247502148151398,
0.031050117686390877,
0.13629160821437836,
-0.033487554639577866,
-0.25428667664527893,
-0.006635651458054781,
-0.019939498975872993,
-0.09650839865207672,
0.10921181738376617,
0.09843369573354721,
-0.09935681521892548,
0.07216642796993256,
-0.01404271088540554,
-0.17971666157245636,
0.017854461446404457,
-0.01175750233232975,
-0.05041366443037987,
0.11819988489151001,
-0.009642599150538445,
0.07580398768186569,
0.009659767150878906,
0.1230349987745285,
-0.18960262835025787,
0.015963418409228325,
0.0749860405921936,
0.04615682363510132,
0.09734640270471573,
0.009284008294343948,
-0.010959414765238762,
0.13156570494174957,
-0.12739509344100952,
0.0993354320526123,
0.03163158893585205,
-0.09461861848831177,
-0.2091055065393448,
-0.0954701378941536,
0.007694622036069632,
0.04467106983065605,
0.0824468657374382,
0.010657289996743202,
0.15896815061569214,
-0.09774608910083771,
0.08270227909088135,
0.22860194742679596,
-0.2759523391723633,
-0.07984543591737747,
0.05311903357505798,
0.059124656021595,
0.08247798681259155,
-0.12590761482715607,
-0.00931570678949356,
0.006422766018658876,
0.025293882936239243,
0.10374993085861206,
-0.03226979076862335,
-0.08617152273654938,
0.0026598197873681784,
-0.10514039546251297,
0.006444149184972048,
0.11003439873456955,
0.032797012478113174,
-0.05331531912088394,
-0.07041262835264206,
-0.04456692188978195,
-0.051058873534202576,
-0.0344458632171154,
-0.02052137441933155,
0.03930825740098953,
-0.05766426399350166,
-0.13894447684288025,
-0.05034240707755089,
-0.0496341735124588,
-0.0911320149898529,
-0.007286677602678537,
0.2198624312877655,
0.03593386709690094,
0.03291460871696472,
-0.050499387085437775,
0.10329462587833405,
0.011249137111008167,
-0.12695647776126862,
-0.02928328700363636,
-0.004589076619595289,
-0.09188003838062286,
-0.035634685307741165,
-0.060196228325366974,
0.022568460553884506,
0.03518330678343773,
0.21626704931259155,
-0.04321957752108574,
0.07762417197227478,
0.031856853514909744,
-0.01670951582491398,
-0.027067013084888458,
0.13829685747623444,
-0.02042466402053833,
-0.07360833138227463,
0.012624764814972878,
0.06231304630637169,
0.011146017350256443,
-0.004691852256655693,
-0.06380192190408707,
-0.04022447019815445,
0.06595677882432938,
0.04740825667977333,
-0.061541758477687836,
0.029119960963726044,
-0.007946815341711044,
-0.023473037406802177,
-0.00007467700925190002,
-0.11545254290103912,
0.0169037077575922,
-0.008870303630828857,
-0.08097150921821594,
-0.04676800221204758,
0.00750128086656332,
-0.010632610879838467,
0.012482798658311367,
0.09506075084209442,
-0.0723782554268837,
-0.023570185527205467,
-0.0799349918961525,
-0.07408321648836136,
-0.016949668526649475,
-0.15512455999851227,
0.0191393680870533,
-0.06251759827136993,
-0.16173529624938965,
-0.02926057204604149,
0.056110966950654984,
-0.08284507691860199,
-0.028758501634001732,
-0.03186115622520447,
-0.07741814851760864,
0.021708957850933075,
0.0021592779085040092,
0.2175775170326233,
-0.04930685833096504,
0.08913809806108475,
0.009965073317289352,
0.058063969016075134,
-0.00945657305419445,
0.03571593016386032,
-0.08576368540525436,
0.010163304395973682,
-0.17648810148239136,
0.07659851014614105,
-0.07925716787576675,
0.015029783360660076,
-0.14006070792675018,
-0.08529790490865707,
-0.010847958736121655,
-0.022939633578062057,
0.07967832684516907,
0.10352945327758789,
-0.1428692787885666,
-0.021322807297110558,
0.11619197577238083,
-0.0662359818816185,
-0.05380193144083023,
0.057490743696689606,
-0.07630821317434311,
0.08670367300510406,
0.05339822173118591,
0.19251157343387604,
0.08659209311008453,
-0.1061706393957138,
0.015282714739441872,
0.015158380381762981,
0.03420567512512207,
0.003573263995349407,
0.05177943781018257,
0.005622876342386007,
0.02755836211144924,
0.016148971393704414,
-0.07390997558832169,
0.007050638552755117,
-0.09158888459205627,
-0.059998031705617905,
-0.0490945465862751,
-0.08408989757299423,
-0.005289931315928698,
0.01223659235984087,
0.036005210131406784,
-0.07965636253356934,
-0.08384286612272263,
0.07592407613992691,
0.13699014484882355,
-0.04714573919773102,
0.017977822571992874,
-0.07634617388248444,
-0.002358023077249527,
-0.03198079764842987,
-0.02422170899808407,
-0.20321069657802582,
-0.056598905473947525,
0.033365435898303986,
-0.005859894212335348,
0.04538435861468315,
0.002621253952383995,
0.08467087149620056,
0.02751333639025688,
-0.05438849329948425,
-0.0006785595323890448,
-0.08626876026391983,
-0.008255582302808762,
-0.09420303255319595,
-0.2206188291311264,
-0.05309855937957764,
-0.03904702514410019,
0.13777387142181396,
-0.16569219529628754,
-0.004577221814543009,
-0.018392493948340416,
0.1162685826420784,
0.04261269420385361,
-0.05165563523769379,
-0.00547008728608489,
0.027850117534399033,
0.012461538426578045,
-0.09744860976934433,
0.038772501051425934,
0.016199380159378052,
-0.09436593949794769,
-0.02843027375638485,
-0.10297288000583649,
-0.008081729523837566,
0.06926538050174713,
0.07013063132762909,
-0.10167964547872543,
-0.015370454639196396,
-0.06263086944818497,
-0.028583761304616928,
-0.05367019400000572,
0.03793634846806526,
0.1819518804550171,
0.01711212657392025,
0.11006614565849304,
-0.07477668672800064,
-0.08362308889627457,
0.01814642734825611,
0.008238599635660648,
0.06125633791089058,
0.10108058899641037,
0.07530111819505692,
-0.11255938559770584,
0.05751495808362961,
0.092662014067173,
-0.05301603674888611,
0.13703420758247375,
-0.04648924246430397,
-0.07312287390232086,
-0.03318116441369057,
-0.007638663053512573,
-0.006569644436240196,
0.14966309070587158,
-0.03773041069507599,
0.01790342852473259,
0.034010086208581924,
0.03632907569408417,
0.004270182456821203,
-0.16253896057605743,
-0.014585188589990139,
0.013741469010710716,
-0.04875355213880539,
-0.021590672433376312,
0.01482963003218174,
0.01650124229490757,
0.09545501321554184,
0.041685670614242554,
-0.003636639565229416,
0.007839183323085308,
-0.009723702445626259,
-0.04255286604166031,
0.19973617792129517,
-0.09108835458755493,
-0.04335413873195648,
-0.07688919454813004,
-0.0017430160660296679,
-0.022766439244151115,
-0.039736226201057434,
0.015092318877577782,
-0.09387929737567902,
-0.02570039965212345,
-0.0710635781288147,
0.0007886118837632239,
-0.042673513293266296,
0.01611238531768322,
0.005628375336527824,
0.014276535250246525,
0.06033608689904213,
-0.1321648508310318,
0.0104390699416399,
-0.06736386567354202,
-0.1124487891793251,
0.02927217073738575,
0.060921162366867065,
0.0787472203373909,
0.05617797374725342,
-0.031424712389707565,
0.019225917756557465,
-0.041011992841959,
0.23158632218837738,
-0.07746081799268723,
0.012322020716965199,
0.12354326993227005,
0.02589673548936844,
0.04026765376329422,
0.1038040742278099,
0.03291212022304535,
-0.10044966638088226,
0.03843977302312851,
0.07643596827983856,
-0.04127134382724762,
-0.2436371147632599,
0.007686898577958345,
-0.03824537247419357,
-0.09653590619564056,
0.08914853632450104,
0.05062827467918396,
-0.04641205817461014,
0.06365688890218735,
0.010618660598993301,
0.011502438224852085,
-0.02429053746163845,
0.08812331408262253,
0.09171564877033234,
0.06563616544008255,
0.10805395245552063,
-0.03770921006798744,
-0.01785692572593689,
0.0651615709066391,
0.02978222630918026,
0.3030675947666168,
-0.048744551837444305,
0.08555670827627182,
0.04796284809708595,
0.1403084695339203,
-0.02200148068368435,
0.045391324907541275,
0.007317407056689262,
-0.005758936516940594,
-0.030005235224962234,
-0.05402090772986412,
-0.020358912646770477,
0.0035154353827238083,
-0.07702794671058655,
0.044578664004802704,
-0.0503004789352417,
0.050228483974933624,
0.019013842567801476,
0.29077911376953125,
0.002085615647956729,
-0.2629886269569397,
-0.0938270315527916,
-0.014883953146636486,
-0.03783170506358147,
-0.0519905649125576,
0.010227535851299763,
0.12243387848138809,
-0.12298282980918884,
0.03767498955130577,
-0.0709371343255043,
0.08002746850252151,
-0.02898634783923626,
-0.002515509957447648,
0.041815515607595444,
0.17197997868061066,
-0.02007407695055008,
0.055808402597904205,
-0.21999500691890717,
0.22629649937152863,
0.015339924022555351,
0.1278550773859024,
-0.05844366177916527,
0.00938491616398096,
0.026474833488464355,
0.0038471927400678396,
0.08849447965621948,
-0.003675672225654125,
-0.06563536822795868,
-0.13583128154277802,
-0.05368999019265175,
0.07299594581127167,
0.14371949434280396,
-0.042662542313337326,
0.09680798649787903,
-0.05876808613538742,
0.014539070427417755,
0.03676316514611244,
-0.08836613595485687,
-0.13192139565944672,
-0.09778453409671783,
-0.0224301815032959,
0.013934369198977947,
-0.07182037085294724,
-0.05762966349720955,
-0.06904866546392441,
0.034073397517204285,
0.10716241598129272,
0.012334411963820457,
-0.03169647604227066,
-0.1473126858472824,
0.08002350479364395,
0.15630203485488892,
-0.06671421229839325,
0.025436321273446083,
-0.004106747917830944,
0.07616410404443741,
0.039162199944257736,
-0.08336193859577179,
0.060532938688993454,
-0.06480399519205093,
-0.17943790555000305,
-0.04795864596962929,
0.09574596583843231,
0.06896543502807617,
0.04196375608444214,
-0.0023499818053096533,
0.052763983607292175,
-0.026120001450181007,
-0.09312792122364044,
0.02565889246761799,
0.021615730598568916,
0.03595610707998276,
0.03665263205766678,
-0.0858500525355339,
0.07742844521999359,
-0.03617570921778679,
-0.014050851576030254,
0.11454914510250092,
0.23194891214370728,
-0.10182838141918182,
0.09979677200317383,
0.06029871106147766,
-0.06179282069206238,
-0.1610351949930191,
0.06986962258815765,
0.10523945093154907,
0.007607916835695505,
0.07088446617126465,
-0.21183502674102783,
0.1291368156671524,
0.10383941233158112,
-0.01709521934390068,
0.041097451001405716,
-0.2717690169811249,
-0.1198568120598793,
0.043126482516527176,
0.12833242118358612,
0.09905233979225159,
-0.12499456852674484,
-0.01473125722259283,
-0.015110092237591743,
-0.11208946257829666,
0.09467799961566925,
-0.11438003927469254,
0.13616585731506348,
-0.02942771650850773,
0.11171510815620422,
0.01082755345851183,
-0.025275832042098045,
0.10387896746397018,
0.04489756003022194,
0.1013154610991478,
-0.04163959622383118,
0.00595996156334877,
0.06483175605535507,
-0.04800769314169884,
0.0022862597834318876,
-0.07688972353935242,
0.08773858100175858,
-0.131573885679245,
-0.004216426983475685,
-0.0898960530757904,
0.04576563090085983,
-0.03862329199910164,
-0.06843701750040054,
-0.04154188930988312,
0.055851321667432785,
0.04732881486415863,
-0.03609069436788559,
0.044185131788253784,
-0.017230821773409843,
0.10122368484735489,
0.04006671905517578,
0.0855778232216835,
0.01595303975045681,
-0.045824043452739716,
0.023083563894033432,
-0.009917888790369034,
0.06446290016174316,
-0.16406650841236115,
0.012091239914298058,
0.09854021668434143,
0.06010917201638222,
0.09711258113384247,
0.04418308660387993,
-0.04569585248827934,
0.018194857984781265,
0.029946118593215942,
-0.1017346903681755,
-0.10883501917123795,
0.04722142592072487,
-0.02723984234035015,
-0.13915285468101501,
0.04839159920811653,
0.12250108271837234,
-0.042710475623607635,
-0.027105500921607018,
-0.01695011556148529,
0.00491111958399415,
-0.02356562949717045,
0.18202047049999237,
0.04750582203269005,
0.05526357144117355,
-0.101612888276577,
0.12705251574516296,
0.02852834202349186,
-0.01846194826066494,
0.05193566158413887,
0.08338149636983871,
-0.10254864394664764,
-0.0024927002377808094,
0.08231686800718307,
0.13939401507377625,
-0.058751001954078674,
-0.0058072772808372974,
-0.1040012389421463,
-0.08299141377210617,
0.050152525305747986,
0.14177072048187256,
0.049638532102108,
-0.016045374795794487,
-0.05410366505384445,
0.04004238173365593,
-0.13946028053760529,
0.07088497281074524,
0.024207351729273796,
0.06332552433013916,
-0.07597542554140091,
0.05971495062112808,
0.00719679007306695,
0.013345480896532536,
-0.01769634708762169,
0.007851003669202328,
-0.09283427894115448,
-0.01668677106499672,
-0.08006307482719421,
-0.0011165555333718657,
0.0010221924167126417,
0.016953937709331512,
-0.019749853760004044,
-0.07201942801475525,
-0.048753522336483,
0.03768254071474075,
-0.08705481886863708,
-0.05036930739879608,
0.008299443870782852,
0.03987899050116539,
-0.12383097410202026,
-0.006072127725929022,
0.0221872441470623,
-0.0935521051287651,
0.09625857323408127,
0.0740380510687828,
0.017562754452228546,
0.02999209053814411,
-0.12317643314599991,
-0.03369215875864029,
-0.009574737399816513,
-0.008118446916341782,
0.06151224300265312,
-0.09615406394004822,
-0.008959684520959854,
-0.037258490920066833,
0.0705886259675026,
0.013244707137346268,
0.07243403792381287,
-0.13383640348911285,
0.01908057928085327,
-0.07692301273345947,
-0.04872722178697586,
-0.07518969476222992,
0.0354742594063282,
0.09654518216848373,
0.061307139694690704,
0.15131360292434692,
-0.07708962261676788,
0.023160021752119064,
-0.20636644959449768,
-0.03430967405438423,
-0.006199914030730724,
-0.05974975600838661,
-0.15123941004276276,
-0.04741542413830757,
0.08018375933170319,
-0.037221845239400864,
0.09351220726966858,
-0.018617672845721245,
0.07471645623445511,
0.03742500767111778,
-0.04678044095635414,
-0.04912891983985901,
-0.016444627195596695,
0.19875475764274597,
0.07317007333040237,
-0.015794288367033005,
0.11053697764873505,
0.0007104991236701608,
0.030346661806106567,
0.05068057402968407,
0.1803746074438095,
0.2133781760931015,
0.03183349221944809,
0.05468206852674484,
0.06418589502573013,
-0.0753643810749054,
-0.07278183102607727,
0.18040534853935242,
-0.016217555850744247,
0.06972331553697586,
-0.048600465059280396,
0.1984187364578247,
0.10920503735542297,
-0.16849187016487122,
0.044923752546310425,
-0.04217012971639633,
-0.08109155297279358,
-0.12605342268943787,
-0.01103727426379919,
-0.08742688596248627,
-0.1266958862543106,
0.037543319165706635,
-0.11637977510690689,
0.05384741351008415,
0.10598691552877426,
0.012651648372411728,
0.03731703758239746,
0.12489743530750275,
-0.019708871841430664,
0.002611251547932625,
0.06326570361852646,
0.005701940506696701,
-0.01268518902361393,
-0.034782275557518005,
-0.08049493283033371,
0.049823109060525894,
0.0022545321844518185,
0.07893721759319305,
-0.0465206615626812,
-0.014531513676047325,
0.025420477613806725,
-0.028818275779485703,
-0.0803261548280716,
0.027367113158106804,
0.04090789705514908,
0.05553167313337326,
0.04665454104542732,
0.04584856703877449,
-0.009026835672557354,
-0.033347297459840775,
0.3183007836341858,
-0.06678764522075653,
-0.0991612896323204,
-0.12134383618831635,
0.2212219536304474,
0.027841972187161446,
-0.03019060008227825,
0.034723225980997086,
-0.08233904838562012,
-0.010534771718084812,
0.15706714987754822,
0.166286438703537,
-0.0726330429315567,
-0.022819891571998596,
-0.005516988690942526,
-0.01823290064930916,
-0.03646296635270119,
0.1280255913734436,
0.08898596465587616,
-0.02351837046444416,
-0.06309620290994644,
-0.014309519901871681,
-0.019357595592737198,
-0.03136393055319786,
-0.04074445739388466,
0.04151042923331261,
0.014466345310211182,
-0.024513939395546913,
-0.04235678166151047,
0.0732019767165184,
0.006271726451814175,
-0.25679731369018555,
0.06916496902704239,
-0.15680739283561707,
-0.17183056473731995,
-0.04289821535348892,
0.03730477765202522,
-0.002571833785623312,
0.05712883919477463,
-0.017865480855107307,
0.008723038248717785,
0.07762446999549866,
-0.01798953488469124,
-0.033081941306591034,
-0.12360996007919312,
0.12467952072620392,
-0.06580941379070282,
0.17142607271671295,
-0.028060026466846466,
0.050239674746990204,
0.11551639437675476,
0.02832110784947872,
-0.13585250079631805,
0.041337449103593826,
0.053603462874889374,
-0.10603565722703934,
0.015250510536134243,
0.15343576669692993,
-0.04674915596842766,
0.09349115192890167,
0.04297492280602455,
-0.10789250582456589,
0.0030360626988112926,
-0.05549832060933113,
-0.03492894396185875,
-0.0805354118347168,
-0.015128177590668201,
-0.06135847419500351,
0.168192520737648,
0.21863986551761627,
-0.029720770195126534,
0.012051914818584919,
-0.10201411694288254,
0.01741715334355831,
0.07039999216794968,
0.029377074912190437,
-0.05684669688344002,
-0.18943195044994354,
0.01097894087433815,
0.06973092257976532,
-0.007000093813985586,
-0.24131883680820465,
-0.07699087262153625,
0.03860323876142502,
-0.034996021538972855,
-0.04055893421173096,
0.10392949730157852,
0.03924938663840294,
0.05158844217658043,
-0.02957790531218052,
-0.162637397646904,
-0.031379107385873795,
0.15253742039203644,
-0.17491793632507324,
-0.03547542542219162
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-few-shot-k-64-finetuned-squad-seed-4
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
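As an illustrative (unofficial) inference sketch, the checkpoint can also be loaded manually with the Auto classes; the question and context strings below are placeholders, not taken from this card:

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What dataset was the model fine-tuned on?"
context = "This model is a fine-tuned version of roberta-base on the squad dataset."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start/end token positions and decode the answer span.
start = outputs.start_logits.argmax()
end = outputs.end_logits.argmax()
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```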
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-64-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-64-finetuned-squad-seed-4
This model is a fine-tuned version of roberta-base on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08062123507261276,
0.08416745066642761,
-0.0024736877530813217,
0.0849875882267952,
0.13697010278701782,
0.033841878175735474,
0.10092262923717499,
0.13866369426250458,
-0.11402437090873718,
0.04241468757390976,
0.08906381577253342,
0.08248081058263779,
0.030224042013287544,
0.1359824538230896,
-0.03340746834874153,
-0.2541303336620331,
-0.007090881932526827,
-0.019356612116098404,
-0.09598001092672348,
0.10917629301548004,
0.09752266854047775,
-0.09998884797096252,
0.07273636013269424,
-0.014306738041341305,
-0.18076109886169434,
0.01824941113591194,
-0.012369051575660706,
-0.05016699060797691,
0.11842489242553711,
-0.009537865407764912,
0.07611528038978577,
0.00885868351906538,
0.12288864701986313,
-0.1880091428756714,
0.0162445567548275,
0.07514043897390366,
0.0457395538687706,
0.09697892516851425,
0.00874787662178278,
-0.011233541183173656,
0.1303708255290985,
-0.12774239480495453,
0.09914176911115646,
0.03150113299489021,
-0.09483416378498077,
-0.2097155600786209,
-0.09493747353553772,
0.006305807735770941,
0.04368869587779045,
0.08329974859952927,
0.010363134555518627,
0.15847690403461456,
-0.09780572354793549,
0.08312824368476868,
0.2270815223455429,
-0.27659550309181213,
-0.08022336661815643,
0.053706031292676926,
0.059129748493433,
0.08278747648000717,
-0.1264258176088333,
-0.008535215631127357,
0.006735104601830244,
0.026106927543878555,
0.10402682423591614,
-0.03266540914773941,
-0.0863105058670044,
0.0030678249895572662,
-0.10501234233379364,
0.007965102791786194,
0.1107684075832367,
0.03302675113081932,
-0.05299420282244682,
-0.07051641494035721,
-0.043936002999544144,
-0.052338723093271255,
-0.03488076478242874,
-0.01957656629383564,
0.03947843983769417,
-0.05851934850215912,
-0.13932739198207855,
-0.04874396696686745,
-0.04923476278781891,
-0.09004950523376465,
-0.007459952961653471,
0.2189689725637436,
0.035778723657131195,
0.03262285888195038,
-0.050329551100730896,
0.10250668972730637,
0.010865213349461555,
-0.1267956644296646,
-0.02825349196791649,
-0.005001217592507601,
-0.09134231507778168,
-0.03562209755182266,
-0.06087544187903404,
0.023827744647860527,
0.03549204394221306,
0.21653532981872559,
-0.043320026248693466,
0.07825061678886414,
0.032442182302474976,
-0.017328990623354912,
-0.027078593149781227,
0.13733340799808502,
-0.021744202822446823,
-0.07516416907310486,
0.012819799594581127,
0.062270596623420715,
0.010534104891121387,
-0.004781284835189581,
-0.06409560889005661,
-0.03939369320869446,
0.06471901386976242,
0.04687974601984024,
-0.06269693374633789,
0.031047143042087555,
-0.006977001205086708,
-0.023034699261188507,
-0.00022709087352268398,
-0.11526473611593246,
0.016539467498660088,
-0.009173210710287094,
-0.08110857754945755,
-0.045801836997270584,
0.007017213851213455,
-0.011470361612737179,
0.012241898104548454,
0.09547381103038788,
-0.07329876720905304,
-0.02415177971124649,
-0.08051979541778564,
-0.07506653666496277,
-0.017484016716480255,
-0.155037522315979,
0.020158641040325165,
-0.06231440231204033,
-0.16201531887054443,
-0.03040720336139202,
0.055960699915885925,
-0.08258043229579926,
-0.027485115453600883,
-0.03185822069644928,
-0.07819809019565582,
0.02136661298573017,
0.0023631302174180746,
0.21819941699504852,
-0.04966561868786812,
0.0880768746137619,
0.011013926938176155,
0.058103419840335846,
-0.010101709514856339,
0.03579976409673691,
-0.08554902672767639,
0.009457340463995934,
-0.17709149420261383,
0.07583856582641602,
-0.0798862874507904,
0.016931749880313873,
-0.1389186829328537,
-0.0863015428185463,
-0.009949542582035065,
-0.021812306717038155,
0.07927361130714417,
0.10298621654510498,
-0.14262717962265015,
-0.021335069090127945,
0.11546505242586136,
-0.06494488567113876,
-0.053926076740026474,
0.057425446808338165,
-0.07661977410316467,
0.08579980581998825,
0.0538148507475853,
0.19279661774635315,
0.08649375289678574,
-0.10568761825561523,
0.014489160850644112,
0.014389256946742535,
0.03446640446782112,
0.0029017894994467497,
0.050168149173259735,
0.006588655058294535,
0.02759210579097271,
0.0166736152023077,
-0.07264944911003113,
0.0070198290050029755,
-0.09119130671024323,
-0.059521473944187164,
-0.048535432666540146,
-0.08399607986211777,
-0.006090075708925724,
0.012542779557406902,
0.03592348098754883,
-0.08059606701135635,
-0.08405276387929916,
0.07519830763339996,
0.13673050701618195,
-0.04761454835534096,
0.01687437854707241,
-0.07659108191728592,
-0.0018082897877320647,
-0.03201785683631897,
-0.02373691461980343,
-0.20299598574638367,
-0.05738455429673195,
0.03237259387969971,
-0.0038826812524348497,
0.04555925726890564,
0.002942364662885666,
0.0851919874548912,
0.027734652161598206,
-0.054582394659519196,
-0.0008262930787168443,
-0.08494403958320618,
-0.007814619690179825,
-0.09402529895305634,
-0.22106315195560455,
-0.05319816991686821,
-0.038890305906534195,
0.13764366507530212,
-0.1662461757659912,
-0.0042433962225914,
-0.018357811495661736,
0.1156914010643959,
0.04223925620317459,
-0.05105528607964516,
-0.004884902387857437,
0.029249291867017746,
0.01323248352855444,
-0.09744218736886978,
0.03927009180188179,
0.0158877894282341,
-0.09320128709077835,
-0.02977241389453411,
-0.10334677249193192,
-0.006662933621555567,
0.0703408494591713,
0.06894703209400177,
-0.10216113179922104,
-0.01500089280307293,
-0.06213536113500595,
-0.028411664068698883,
-0.05195104330778122,
0.037916384637355804,
0.18303968012332916,
0.01648542657494545,
0.11000949144363403,
-0.07415198534727097,
-0.08320822566747665,
0.018305562436580658,
0.009318175725638866,
0.06245169788599014,
0.10104133933782578,
0.0746089443564415,
-0.11107311397790909,
0.05776991695165634,
0.0921335443854332,
-0.05344796180725098,
0.13764767348766327,
-0.04664859548211098,
-0.07246579229831696,
-0.03268935903906822,
-0.008760849945247173,
-0.006985380779951811,
0.15033569931983948,
-0.038101207464933395,
0.016858676448464394,
0.033621326088905334,
0.035894930362701416,
0.004547840915620327,
-0.1616070717573166,
-0.014524484984576702,
0.013029959984123707,
-0.04776822403073311,
-0.022179214283823967,
0.015520814806222916,
0.015899742022156715,
0.09510080516338348,
0.04150313138961792,
-0.004783314652740955,
0.007862924598157406,
-0.00954488292336464,
-0.04162490740418434,
0.20014508068561554,
-0.09150692820549011,
-0.04215250164270401,
-0.07661861181259155,
-0.0036765215918421745,
-0.023810038343071938,
-0.040176063776016235,
0.01480066031217575,
-0.09507322311401367,
-0.02607857622206211,
-0.07074432820081711,
0.00009722202958073467,
-0.04260498657822609,
0.016036180779337883,
0.005114639177918434,
0.01367922406643629,
0.0598742701113224,
-0.13202403485774994,
0.010545257478952408,
-0.06767775863409042,
-0.11243456602096558,
0.0297395009547472,
0.06158466637134552,
0.07943286746740341,
0.05566154792904854,
-0.031490687280893326,
0.019271768629550934,
-0.04087885096669197,
0.23248736560344696,
-0.07753774523735046,
0.012566917575895786,
0.12332160025835037,
0.0266837440431118,
0.03933944180607796,
0.10381375998258591,
0.03390796482563019,
-0.10088728368282318,
0.03813422843813896,
0.07688293606042862,
-0.040867678821086884,
-0.24349647760391235,
0.008135619573295116,
-0.0386955700814724,
-0.09681018441915512,
0.08877534419298172,
0.0505831241607666,
-0.045871637761592865,
0.06424682587385178,
0.01158248819410801,
0.012380699627101421,
-0.02519911341369152,
0.08784971386194229,
0.09346921741962433,
0.06522919237613678,
0.10875760763883591,
-0.038054171949625015,
-0.018190087750554085,
0.06451663374900818,
0.02985936962068081,
0.3038875162601471,
-0.04886121302843094,
0.08537547290325165,
0.04841264337301254,
0.13983947038650513,
-0.021388374269008636,
0.0459861122071743,
0.006678713019937277,
-0.006442470941692591,
-0.030056092888116837,
-0.05379645526409149,
-0.019215915352106094,
0.00246533565223217,
-0.07867501676082611,
0.044339101761579514,
-0.050246790051460266,
0.04891301319003105,
0.019304854795336723,
0.289997398853302,
0.0014130555791780353,
-0.26510584354400635,
-0.0943029597401619,
-0.01564211957156658,
-0.03742154315114021,
-0.051562972366809845,
0.010576681233942509,
0.12185470014810562,
-0.12240368127822876,
0.03822673112154007,
-0.07083966583013535,
0.07990583777427673,
-0.02906305156648159,
-0.0014647665666416287,
0.04365864768624306,
0.173611581325531,
-0.02101769670844078,
0.05524802953004837,
-0.22065873444080353,
0.2253260612487793,
0.015652325004339218,
0.12842997908592224,
-0.05852967128157616,
0.009166310541331768,
0.02735799551010132,
0.005251079797744751,
0.0880310907959938,
-0.004231962375342846,
-0.06553233414888382,
-0.135846808552742,
-0.05259663239121437,
0.07397241145372391,
0.14320217072963715,
-0.041537582874298096,
0.09742928296327591,
-0.0580148845911026,
0.01406119018793106,
0.0366484597325325,
-0.08950560539960861,
-0.13204526901245117,
-0.09832039475440979,
-0.02336331456899643,
0.015540248714387417,
-0.07162471860647202,
-0.05674709007143974,
-0.06936787068843842,
0.032391756772994995,
0.10626406222581863,
0.013966950587928295,
-0.031679123640060425,
-0.14769457280635834,
0.07913797348737717,
0.15630662441253662,
-0.06589717417955399,
0.025601450353860855,
-0.003785629291087389,
0.07502862066030502,
0.0401025153696537,
-0.08324761688709259,
0.06075068190693855,
-0.065449558198452,
-0.1781042516231537,
-0.0480615459382534,
0.09471618384122849,
0.06856425106525421,
0.04118669033050537,
-0.002823485527187586,
0.0527820847928524,
-0.027005601674318314,
-0.09376244992017746,
0.02569895051419735,
0.020062904804944992,
0.03680167719721794,
0.03656967356801033,
-0.08687523752450943,
0.07728977501392365,
-0.03549417108297348,
-0.013184506446123123,
0.11308268457651138,
0.2296084314584732,
-0.10136165469884872,
0.0976703092455864,
0.06038475036621094,
-0.06139973923563957,
-0.1606096476316452,
0.07125189155340195,
0.10376626253128052,
0.007959000766277313,
0.06939783692359924,
-0.21243558824062347,
0.1302306354045868,
0.10333709418773651,
-0.016074609011411667,
0.042933475226163864,
-0.26924362778663635,
-0.11913246661424637,
0.0433388277888298,
0.1289631873369217,
0.10036668181419373,
-0.12549585103988647,
-0.013989282771945,
-0.015305266715586185,
-0.11177802830934525,
0.09398094564676285,
-0.11680437624454498,
0.13637755811214447,
-0.029860110953450203,
0.11243245750665665,
0.010096096433699131,
-0.02526918612420559,
0.1029866635799408,
0.045837514102458954,
0.10253607481718063,
-0.04202667996287346,
0.006907076109200716,
0.06478310376405716,
-0.04725199192762375,
0.001996551873162389,
-0.07775166630744934,
0.0872812420129776,
-0.13177596032619476,
-0.004039104096591473,
-0.09034255892038345,
0.0457640178501606,
-0.03844036906957626,
-0.06821531057357788,
-0.04093778133392334,
0.055903397500514984,
0.04625990614295006,
-0.03632482886314392,
0.043056655675172806,
-0.018008938059210777,
0.10163947939872742,
0.03871487081050873,
0.08611392229795456,
0.014430556446313858,
-0.04622174799442291,
0.024342261254787445,
-0.010423459112644196,
0.06447157263755798,
-0.16377533972263336,
0.010783394798636436,
0.09912113100290298,
0.059369802474975586,
0.09668797254562378,
0.044911518692970276,
-0.04474388062953949,
0.017557064071297646,
0.03073767200112343,
-0.10208514332771301,
-0.1077360212802887,
0.047503646463155746,
-0.03038726933300495,
-0.13851982355117798,
0.04973170533776283,
0.12228315323591232,
-0.04256243258714676,
-0.026791533455252647,
-0.016945673152804375,
0.004149203654378653,
-0.02354421652853489,
0.1825515478849411,
0.04804236441850662,
0.05472389981150627,
-0.10259141027927399,
0.12666992843151093,
0.02837890200316906,
-0.018843254074454308,
0.05190594121813774,
0.08442696928977966,
-0.1034349799156189,
-0.003359731985256076,
0.08166820555925369,
0.14132824540138245,
-0.058345768600702286,
-0.006442980375140905,
-0.10506223142147064,
-0.08324622362852097,
0.05002807453274727,
0.1414649486541748,
0.049534495919942856,
-0.017247531563043594,
-0.05435684695839882,
0.03955753892660141,
-0.14013248682022095,
0.07049509882926941,
0.023136386647820473,
0.06379769742488861,
-0.07595683634281158,
0.05897383391857147,
0.007040439639240503,
0.012591084465384483,
-0.01764543727040291,
0.008417855016887188,
-0.093404121696949,
-0.016870958730578423,
-0.07994785904884338,
-0.0019413646077737212,
0.001080216490663588,
0.01752268709242344,
-0.020031550899147987,
-0.0710606649518013,
-0.049386583268642426,
0.037801340222358704,
-0.08719760924577713,
-0.049986422061920166,
0.010064849629998207,
0.04037272557616234,
-0.12307199090719223,
-0.00576202105730772,
0.02103564143180847,
-0.09312506020069122,
0.0959656611084938,
0.07382421940565109,
0.017233887687325478,
0.030155355110764503,
-0.12226715683937073,
-0.03416360914707184,
-0.010100013576447964,
-0.00868925079703331,
0.062263090163469315,
-0.09505397826433182,
-0.008453154936432838,
-0.037469759583473206,
0.07120189815759659,
0.01338049117475748,
0.07158307731151581,
-0.13321234285831451,
0.01965673826634884,
-0.07613514363765717,
-0.04727530479431152,
-0.07580915838479996,
0.03567465394735336,
0.0973551794886589,
0.0608927458524704,
0.15128730237483978,
-0.077043816447258,
0.022473420947790146,
-0.2063504308462143,
-0.0346590057015419,
-0.006325291004031897,
-0.06133156269788742,
-0.15085875988006592,
-0.04753004387021065,
0.0807480663061142,
-0.03788458928465843,
0.09522127360105515,
-0.018475113436579704,
0.07474275678396225,
0.0369594544172287,
-0.046501774340867996,
-0.05022796615958214,
-0.016229994595050812,
0.19836996495723724,
0.07262669503688812,
-0.016307283192873,
0.11031199991703033,
0.00200655753724277,
0.02971057780086994,
0.05185960605740547,
0.17880725860595703,
0.21281284093856812,
0.0326245054602623,
0.05473129078745842,
0.06459739059209824,
-0.07577653974294662,
-0.07133758068084717,
0.1813403069972992,
-0.015863610431551933,
0.07030735164880753,
-0.04951227456331253,
0.19716191291809082,
0.10831655561923981,
-0.16741536557674408,
0.045149508863687515,
-0.0430319644510746,
-0.08112411946058273,
-0.12502002716064453,
-0.009770890697836876,
-0.08691371977329254,
-0.12720605731010437,
0.03756893053650856,
-0.11685709655284882,
0.05306820943951607,
0.10705595463514328,
0.013177284970879555,
0.03680054470896721,
0.12637732923030853,
-0.018912706524133682,
0.003310565836727619,
0.06347596645355225,
0.0052817203104496,
-0.01312557328492403,
-0.0347440242767334,
-0.08010353893041611,
0.04915874823927879,
0.0007616303046233952,
0.07823053002357483,
-0.04705405980348587,
-0.014944665133953094,
0.026002725586295128,
-0.028472013771533966,
-0.07991806417703629,
0.027212314307689667,
0.04149635136127472,
0.05517782270908356,
0.04756512865424156,
0.04542591795325279,
-0.00989344622939825,
-0.033385660499334335,
0.3173099756240845,
-0.06674089282751083,
-0.098155178129673,
-0.12162671983242035,
0.2207726091146469,
0.027592118829488754,
-0.030725939199328423,
0.03352337330579758,
-0.08138002455234528,
-0.009535975754261017,
0.15826629102230072,
0.16807183623313904,
-0.072920061647892,
-0.02308887243270874,
-0.0052179680205881596,
-0.018275858834385872,
-0.03665066510438919,
0.1284475326538086,
0.08934102207422256,
-0.024445893242955208,
-0.06298696994781494,
-0.014251098968088627,
-0.018889294937253,
-0.03101579286158085,
-0.04088640213012695,
0.040869370102882385,
0.015459255315363407,
-0.024967260658740997,
-0.04159783199429512,
0.07307326793670654,
0.005738324485719204,
-0.2567862570285797,
0.06978940963745117,
-0.15649531781673431,
-0.17204490303993225,
-0.04389948025345802,
0.03684207797050476,
-0.0017978879623115063,
0.05755462124943733,
-0.01794571802020073,
0.00854162871837616,
0.07818259298801422,
-0.01843142881989479,
-0.03220280259847641,
-0.12441319227218628,
0.12451109290122986,
-0.06624314934015274,
0.17030036449432373,
-0.028519727289676666,
0.049976348876953125,
0.11613427102565765,
0.027983054518699646,
-0.1355617791414261,
0.041910868138074875,
0.05290668085217476,
-0.10666777193546295,
0.015726003795862198,
0.1524055004119873,
-0.04658786579966545,
0.09280171990394592,
0.04186100512742996,
-0.10697851330041885,
0.0038490896113216877,
-0.05533082038164139,
-0.03479985147714615,
-0.08049055933952332,
-0.015179167501628399,
-0.061188433319330215,
0.1686372458934784,
0.21992656588554382,
-0.029526930302381516,
0.011974974535405636,
-0.10191960632801056,
0.017364682629704475,
0.07094511389732361,
0.027870090678334236,
-0.0576825886964798,
-0.18930797278881073,
0.011300649493932724,
0.0705970823764801,
-0.007484897039830685,
-0.24206776916980743,
-0.07586685568094254,
0.038407228887081146,
-0.03479153662919998,
-0.04077509418129921,
0.10369911044836044,
0.040244512259960175,
0.052514344453811646,
-0.02998601272702217,
-0.1604943722486496,
-0.03137868270277977,
0.15236157178878784,
-0.1750003546476364,
-0.036127276718616486
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-64-finetuned-squad-seed-6

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
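
### Reproducing the setup

The hyperparameters above map directly onto `TrainingArguments`. The sketch below is illustrative only: it assumes a standard `run_qa.py`-style tokenization of the 64-example squad subset (not shown here) and is not the original training script. Adam betas and epsilon match the `TrainingArguments` defaults, so they need no explicit flags.

```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, TrainingArguments

# Mirror the hyperparameters listed in this card.
args = TrainingArguments(
    output_dir="roberta-base-few-shot-k-64-finetuned-squad-seed-6",
    learning_rate=3e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,  # "training_steps: 200"
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForQuestionAnswering.from_pretrained("roberta-base")

# A Trainer would then be built with the tokenized few-shot train split, e.g.
# (`tokenized_k64_squad` is a hypothetical variable standing in for that split):
# Trainer(model=model, args=args, train_dataset=tokenized_k64_squad).train()
```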
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-64-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-64-finetuned-squad-seed-6

This model is a fine-tuned version of roberta-base on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08059658855199814,
0.08370451629161835,
-0.002440281677991152,
0.08472847193479538,
0.1371995210647583,
0.033336203545331955,
0.10067257285118103,
0.13889913260936737,
-0.11346834152936935,
0.04277197644114494,
0.08855082094669342,
0.0830727219581604,
0.030774781480431557,
0.13571293652057648,
-0.03325862064957619,
-0.25378212332725525,
-0.0065861959010362625,
-0.01913357526063919,
-0.09569036960601807,
0.10913310199975967,
0.0975719541311264,
-0.10020466148853302,
0.07205340266227722,
-0.014694556593894958,
-0.18084923923015594,
0.018484290689229965,
-0.012601073831319809,
-0.049892205744981766,
0.11823330819606781,
-0.009964088909327984,
0.07578834891319275,
0.009113199077546597,
0.12274385988712311,
-0.18847449123859406,
0.016216877847909927,
0.07557478547096252,
0.04590384662151337,
0.09709960967302322,
0.009653046727180481,
-0.010861365124583244,
0.13157309591770172,
-0.12723559141159058,
0.09890121221542358,
0.03195871040225029,
-0.09481561183929443,
-0.2081286758184433,
-0.09545867145061493,
0.005725516472011805,
0.04424511268734932,
0.08395789563655853,
0.009863442741334438,
0.15920723974704742,
-0.09848512709140778,
0.0831226259469986,
0.22858694195747375,
-0.27534759044647217,
-0.08016376942396164,
0.05426045134663582,
0.05899161472916603,
0.08188074827194214,
-0.12723161280155182,
-0.009655565023422241,
0.0067855739034712315,
0.025763124227523804,
0.10318195819854736,
-0.03200008347630501,
-0.08740651607513428,
0.0026820253115147352,
-0.10530833154916763,
0.0076703885570168495,
0.10960176587104797,
0.03322180360555649,
-0.05252986401319504,
-0.07032153755426407,
-0.04423033818602562,
-0.05180910602211952,
-0.03457282856106758,
-0.019374780356884003,
0.039554525166749954,
-0.0584053210914135,
-0.1388808637857437,
-0.0485357865691185,
-0.04920352250337601,
-0.0910165011882782,
-0.007090267259627581,
0.21858017146587372,
0.03583313524723053,
0.0324815958738327,
-0.05088053271174431,
0.10248067229986191,
0.011135085485875607,
-0.1269739419221878,
-0.02910700812935829,
-0.00437696510925889,
-0.09145495295524597,
-0.035675570368766785,
-0.06061102822422981,
0.022753430530428886,
0.034887898713350296,
0.21468304097652435,
-0.043025121092796326,
0.07846477627754211,
0.03180870786309242,
-0.01723872683942318,
-0.02772834338247776,
0.13746802508831024,
-0.020903946831822395,
-0.07326480001211166,
0.012062662281095982,
0.0625147894024849,
0.010164640843868256,
-0.004404278472065926,
-0.06352950632572174,
-0.039558812975883484,
0.06524667888879776,
0.04654573276638985,
-0.06234116852283478,
0.030567388981580734,
-0.007451884914189577,
-0.023233098909258842,
0.00048483232967555523,
-0.11512927711009979,
0.016435544937849045,
-0.009530564770102501,
-0.08142605423927307,
-0.04660777002573013,
0.007150908932089806,
-0.011717562563717365,
0.011975022964179516,
0.09564243257045746,
-0.0737684816122055,
-0.02413882315158844,
-0.08124642819166183,
-0.07479505985975266,
-0.017175685614347458,
-0.15655134618282318,
0.019802125170826912,
-0.061457641422748566,
-0.16219913959503174,
-0.0304887518286705,
0.05551997944712639,
-0.08265301585197449,
-0.027275148779153824,
-0.03239721804857254,
-0.07870350778102875,
0.021622205153107643,
0.002665623789653182,
0.21900881826877594,
-0.04931419715285301,
0.08878245204687119,
0.010937055572867393,
0.05794112756848335,
-0.00944503303617239,
0.036408450454473495,
-0.08638880401849747,
0.009321719408035278,
-0.17641332745552063,
0.0759713351726532,
-0.08055634796619415,
0.016796177253127098,
-0.13969282805919647,
-0.08574994653463364,
-0.011150299571454525,
-0.022417603060603142,
0.08001526445150375,
0.1037057638168335,
-0.14266051352024078,
-0.021091721951961517,
0.11583638936281204,
-0.06570862978696823,
-0.053831253200769424,
0.05647392198443413,
-0.07632416486740112,
0.08551241457462311,
0.05330466479063034,
0.19274531304836273,
0.08601921051740646,
-0.10517334938049316,
0.013357950374484062,
0.01389654353260994,
0.035071566700935364,
0.002307811053469777,
0.05017221346497536,
0.0063682664185762405,
0.028434839099645615,
0.016620757058262825,
-0.07277987897396088,
0.006889554671943188,
-0.09127428382635117,
-0.05889953672885895,
-0.04911340773105621,
-0.08410931378602982,
-0.006311163306236267,
0.013601905666291714,
0.035721659660339355,
-0.08021895587444305,
-0.08360597491264343,
0.07522760331630707,
0.1366652250289917,
-0.0470585934817791,
0.017045127227902412,
-0.07603182643651962,
-0.0029664444737136364,
-0.03280598297715187,
-0.023781057447195053,
-0.20393064618110657,
-0.05786208063364029,
0.03296041116118431,
-0.004132754635065794,
0.04546089470386505,
0.0027773550245910883,
0.08505874872207642,
0.026973696425557137,
-0.05461954325437546,
-0.0009723424445837736,
-0.08584418147802353,
-0.008395267650485039,
-0.09433798491954803,
-0.22083327174186707,
-0.053465817123651505,
-0.03938902169466019,
0.1358896940946579,
-0.16579747200012207,
-0.004444899968802929,
-0.018332352861762047,
0.1160193383693695,
0.04221801087260246,
-0.05158110707998276,
-0.004863389302045107,
0.028663955628871918,
0.01291535422205925,
-0.09773920476436615,
0.03917378932237625,
0.015273749828338623,
-0.09286954253911972,
-0.029633108526468277,
-0.10356488078832626,
-0.008253016509115696,
0.07002831250429153,
0.0703003853559494,
-0.10222792625427246,
-0.014920883812010288,
-0.062296025454998016,
-0.0287574864923954,
-0.0530015304684639,
0.03860244154930115,
0.182328999042511,
0.016812046989798546,
0.10953369736671448,
-0.07451950013637543,
-0.08344960957765579,
0.018907390534877777,
0.00943340826779604,
0.061868827790021896,
0.10175623744726181,
0.07609155029058456,
-0.11239694803953171,
0.0584053136408329,
0.09268736839294434,
-0.052734509110450745,
0.1379873901605606,
-0.04681262746453285,
-0.0730314776301384,
-0.03161315992474556,
-0.008174356073141098,
-0.006803391966968775,
0.1499854326248169,
-0.03779378533363342,
0.017192455008625984,
0.033565450459718704,
0.03628382459282875,
0.004537607543170452,
-0.16191959381103516,
-0.01466626487672329,
0.012879648245871067,
-0.047757912427186966,
-0.022738322615623474,
0.015369056724011898,
0.01587485894560814,
0.09526046365499496,
0.04120182991027832,
-0.004482271149754524,
0.007431270554661751,
-0.009620939381420612,
-0.041706036776304245,
0.2008497565984726,
-0.0911920964717865,
-0.04117321968078613,
-0.07579702883958817,
-0.002879105741158128,
-0.02353729121387005,
-0.040193989872932434,
0.014883413910865784,
-0.09615257382392883,
-0.026018746197223663,
-0.07070261985063553,
0.00014073635975364596,
-0.04264365881681442,
0.015258656814694405,
0.004568359814584255,
0.013995175249874592,
0.059321075677871704,
-0.13265478610992432,
0.01061057299375534,
-0.06811239570379257,
-0.11285889893770218,
0.029470035806298256,
0.06100928783416748,
0.07909545302391052,
0.05589780583977699,
-0.03164996951818466,
0.019140837714076042,
-0.04104007035493851,
0.23140326142311096,
-0.07772056013345718,
0.01209147647023201,
0.12354429066181183,
0.027591245248913765,
0.039900150150060654,
0.10398998111486435,
0.033195361495018005,
-0.10092495381832123,
0.03870845586061478,
0.0773070901632309,
-0.04116468504071236,
-0.24455834925174713,
0.008142424747347832,
-0.03889329358935356,
-0.09706520289182663,
0.08909659832715988,
0.05082181841135025,
-0.046111416071653366,
0.06432059407234192,
0.010620580054819584,
0.011331208050251007,
-0.025266362354159355,
0.08823542296886444,
0.0917319506406784,
0.06586726754903793,
0.10868613421916962,
-0.03803490474820137,
-0.017665458843111992,
0.06383242458105087,
0.030039161443710327,
0.30506476759910583,
-0.04898671433329582,
0.08560362458229065,
0.04807782545685768,
0.139964759349823,
-0.021852489560842514,
0.04661004990339279,
0.006567822303622961,
-0.006512593477964401,
-0.029955703765153885,
-0.05365332216024399,
-0.01949544996023178,
0.002387366956099868,
-0.07917886972427368,
0.04467954859137535,
-0.050346896052360535,
0.049211516976356506,
0.01845160499215126,
0.2907175123691559,
0.001657690852880478,
-0.2642911374568939,
-0.09379810094833374,
-0.01584620587527752,
-0.03768559545278549,
-0.052196748554706573,
0.01035935990512371,
0.1215781718492508,
-0.12206766754388809,
0.037643913179636,
-0.07115402072668076,
0.080573171377182,
-0.027931638062000275,
-0.0021029049530625343,
0.042747862637043,
0.17367447912693024,
-0.020849352702498436,
0.05583073943853378,
-0.2212112545967102,
0.22719810903072357,
0.01550295203924179,
0.12830644845962524,
-0.05882779881358147,
0.009154057130217552,
0.026644373312592506,
0.003335807006806135,
0.08839033544063568,
-0.004152805078774691,
-0.06595505028963089,
-0.135506272315979,
-0.05202898383140564,
0.07369658350944519,
0.14387869834899902,
-0.041957125067710876,
0.09694620221853256,
-0.058139726519584656,
0.01425999030470848,
0.03737255930900574,
-0.08917436003684998,
-0.132219597697258,
-0.0984470471739769,
-0.022953037172555923,
0.014820615760982037,
-0.07240323722362518,
-0.056726645678281784,
-0.06923755258321762,
0.03350649029016495,
0.10614117980003357,
0.01399566326290369,
-0.031341906636953354,
-0.1475856751203537,
0.07974711805582047,
0.15653757750988007,
-0.06621173769235611,
0.025515848770737648,
-0.004121850244700909,
0.07493054121732712,
0.03926553949713707,
-0.08342566341161728,
0.06146176904439926,
-0.06542923301458359,
-0.1786474734544754,
-0.04804031178355217,
0.09458606690168381,
0.06894392520189285,
0.041500817984342575,
-0.002963419770821929,
0.053121160715818405,
-0.02692686766386032,
-0.09352818131446838,
0.026001762598752975,
0.020238900557160378,
0.03678122162818909,
0.03693647310137749,
-0.08644680678844452,
0.07636160403490067,
-0.03618438541889191,
-0.013734486885368824,
0.11280450969934464,
0.2306234985589981,
-0.1012536883354187,
0.09788327664136887,
0.06071031466126442,
-0.06160523742437363,
-0.16097503900527954,
0.07166134566068649,
0.10437088459730148,
0.007567704189568758,
0.07005009800195694,
-0.21239520609378815,
0.13022354245185852,
0.10332594811916351,
-0.016385892406105995,
0.042163506150245667,
-0.26974737644195557,
-0.11912770569324493,
0.04325459152460098,
0.1287948489189148,
0.09902238100767136,
-0.1251765489578247,
-0.013901815749704838,
-0.015007756650447845,
-0.11135631799697876,
0.09468185156583786,
-0.115142323076725,
0.13666145503520966,
-0.03016856499016285,
0.11242713034152985,
0.010246890597045422,
-0.02567330189049244,
0.10196167975664139,
0.04604697227478027,
0.10259661078453064,
-0.041691042482852936,
0.006561569403856993,
0.0650145560503006,
-0.047411780804395676,
0.002128194784745574,
-0.0778026208281517,
0.08767440915107727,
-0.1309535801410675,
-0.0036925955209881067,
-0.09093187749385834,
0.04604469612240791,
-0.038399774581193924,
-0.06812281161546707,
-0.04096004366874695,
0.05616273358464241,
0.046749670058488846,
-0.03650471568107605,
0.044057004153728485,
-0.017762839794158936,
0.10304324328899384,
0.03901218995451927,
0.08646096289157867,
0.01500017661601305,
-0.04540843144059181,
0.023607691749930382,
-0.00974167138338089,
0.06455922871828079,
-0.16458269953727722,
0.010549020953476429,
0.09901537746191025,
0.06020843982696533,
0.09657833725214005,
0.04537274315953255,
-0.04538118839263916,
0.017578057944774628,
0.029958348721265793,
-0.1013980433344841,
-0.10848422348499298,
0.04770493507385254,
-0.02704385109245777,
-0.13925553858280182,
0.04991908371448517,
0.12093612551689148,
-0.0433613583445549,
-0.02689306065440178,
-0.016844339668750763,
0.0045578680001199245,
-0.023014437407255173,
0.1831909418106079,
0.04806528240442276,
0.05532555654644966,
-0.1023966446518898,
0.12710291147232056,
0.028134293854236603,
-0.019439147785305977,
0.05195885896682739,
0.08416109532117844,
-0.10286904126405716,
-0.0031880056485533714,
0.08275102823972702,
0.13991214334964752,
-0.05808489769697189,
-0.005215022712945938,
-0.1042458713054657,
-0.08299224823713303,
0.05034272000193596,
0.14239773154258728,
0.04935668036341667,
-0.017016638070344925,
-0.05412682145833969,
0.03994149714708328,
-0.14059510827064514,
0.07061101496219635,
0.022804221138358116,
0.0639265850186348,
-0.07556888461112976,
0.05780386924743652,
0.007246172986924648,
0.012867332436144352,
-0.01740710809826851,
0.008754323236644268,
-0.09308534115552902,
-0.01690082624554634,
-0.07800551503896713,
-0.0020995738450437784,
0.0010901862988248467,
0.016971971839666367,
-0.020471777766942978,
-0.07134474813938141,
-0.04851432517170906,
0.03787684068083763,
-0.08746064454317093,
-0.05042972043156624,
0.009463006630539894,
0.040079884231090546,
-0.12341137230396271,
-0.005606821738183498,
0.0213879756629467,
-0.09286154806613922,
0.09517969191074371,
0.07336869835853577,
0.017545755952596664,
0.030452551320195198,
-0.12406932562589645,
-0.03387334942817688,
-0.009890233166515827,
-0.008613578043878078,
0.06202177703380585,
-0.09455174207687378,
-0.008757386356592178,
-0.03759229555726051,
0.07123394310474396,
0.0130052724853158,
0.07079553604125977,
-0.13379867374897003,
0.018849052488803864,
-0.07710760831832886,
-0.047935303300619125,
-0.07521899789571762,
0.03604881837964058,
0.09745081514120102,
0.06102912127971649,
0.15108570456504822,
-0.07678297907114029,
0.023143518716096878,
-0.20655642449855804,
-0.03451396897435188,
-0.006213814485818148,
-0.061223167926073074,
-0.1511915624141693,
-0.0468829870223999,
0.08096177130937576,
-0.03768981993198395,
0.09292301535606384,
-0.018353158608078957,
0.07529770582914352,
0.03709638491272926,
-0.04613817110657692,
-0.05036788806319237,
-0.016365133225917816,
0.19844558835029602,
0.07253886759281158,
-0.015889592468738556,
0.11179865896701813,
0.001981081673875451,
0.02952430583536625,
0.052938736975193024,
0.18038450181484222,
0.21363165974617004,
0.03149332106113434,
0.05495600402355194,
0.06459382176399231,
-0.07622383534908295,
-0.07128551602363586,
0.1813548058271408,
-0.015292181633412838,
0.07009226083755493,
-0.04958578944206238,
0.19770637154579163,
0.10846076160669327,
-0.16727106273174286,
0.045590221881866455,
-0.0432884618639946,
-0.08086273074150085,
-0.12503063678741455,
-0.009752674959599972,
-0.08681049942970276,
-0.1271323412656784,
0.0380089245736599,
-0.11714853346347809,
0.05354943871498108,
0.10746224969625473,
0.013078862801194191,
0.036935560405254364,
0.12703163921833038,
-0.018281802535057068,
0.0031592464074492455,
0.06373663991689682,
0.005216161720454693,
-0.013223403133451939,
-0.035024579614400864,
-0.07991451025009155,
0.05036631599068642,
0.0007974770851433277,
0.07841718941926956,
-0.04697595164179802,
-0.01572941057384014,
0.02572278119623661,
-0.02824164740741253,
-0.08014575392007828,
0.027360832318663597,
0.04161413758993149,
0.05521625280380249,
0.04818526282906532,
0.045289792120456696,
-0.009198962710797787,
-0.03352448716759682,
0.31844738125801086,
-0.06704674661159515,
-0.09772301465272903,
-0.12073193490505219,
0.22133375704288483,
0.028521763160824776,
-0.030702706426382065,
0.033488985151052475,
-0.08206584304571152,
-0.0098248440772295,
0.15696410834789276,
0.16624368727207184,
-0.07181277871131897,
-0.022844156250357628,
-0.005450626835227013,
-0.01824219338595867,
-0.0364222452044487,
0.12829062342643738,
0.0896248146891594,
-0.024039044976234436,
-0.06329867988824844,
-0.014097397215664387,
-0.018829889595508575,
-0.03161374852061272,
-0.04002196341753006,
0.040816113352775574,
0.015759805217385292,
-0.02522226795554161,
-0.041724834591150284,
0.07312489300966263,
0.005829094909131527,
-0.2562592029571533,
0.06909951567649841,
-0.15730629861354828,
-0.17164871096611023,
-0.04384377598762512,
0.036525554955005646,
-0.0017714209388941526,
0.05764154717326164,
-0.017926448956131935,
0.009109909646213055,
0.07659348845481873,
-0.018288549035787582,
-0.0325593464076519,
-0.12515975534915924,
0.12398397922515869,
-0.0665791854262352,
0.1706075221300125,
-0.02862200327217579,
0.049222126603126526,
0.11600109934806824,
0.028212131932377815,
-0.13621139526367188,
0.04174463078379631,
0.05308866128325462,
-0.10683853179216385,
0.0155170364305377,
0.15325728058815002,
-0.04652974382042885,
0.09295890480279922,
0.0415349081158638,
-0.10868149995803833,
0.0041511100716888905,
-0.05591093748807907,
-0.03417301923036575,
-0.08101348578929901,
-0.013717709109187126,
-0.06107627600431442,
0.1687680333852768,
0.2202230989933014,
-0.029507171362638474,
0.011389265768229961,
-0.1022409126162529,
0.017093317583203316,
0.07012710720300674,
0.028798244893550873,
-0.05737840756773949,
-0.1892736703157425,
0.011465894058346748,
0.07046841830015182,
-0.007633979432284832,
-0.24203070998191833,
-0.07551436126232147,
0.03862794116139412,
-0.03578481078147888,
-0.04088654741644859,
0.10306244343519211,
0.04008388891816139,
0.05206197500228882,
-0.030025625601410866,
-0.16214540600776672,
-0.03117171861231327,
0.15279296040534973,
-0.17512543499469757,
-0.03577091917395592
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# roberta-base-few-shot-k-64-finetuned-squad-seed-8

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
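
### Usage sketch

A minimal inference example with the `question-answering` pipeline; the question/context pair below is made up for illustration and is not from the training data.

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-8",
)

# Any SQuAD-style question/context pair works; this one is illustrative.
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="The checkpoint was fine-tuned on 64 examples drawn from the SQuAD dataset.",
)
print(result["answer"], round(result["score"], 3))
```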
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "roberta-base-few-shot-k-64-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/roberta-base-few-shot-k-64-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
|
# roberta-base-few-shot-k-64-finetuned-squad-seed-8

This model is a fine-tuned version of roberta-base on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# roberta-base-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n",
"# roberta-base-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
48,
46,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# roberta-base-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of roberta-base on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.08115996420383453,
0.08448737859725952,
-0.002456140238791704,
0.0851537361741066,
0.13728004693984985,
0.03380529209971428,
0.09992329776287079,
0.1388428509235382,
-0.11359760165214539,
0.04251246899366379,
0.08865329623222351,
0.08261726051568985,
0.030644120648503304,
0.13516560196876526,
-0.03363615646958351,
-0.25359153747558594,
-0.007311038672924042,
-0.018998945131897926,
-0.09442020207643509,
0.10873066633939743,
0.0976584330201149,
-0.10034681111574173,
0.07208932191133499,
-0.01497737132012844,
-0.18039162456989288,
0.01832088641822338,
-0.012211519293487072,
-0.04999798908829689,
0.11852416396141052,
-0.009368502534925938,
0.07575493305921555,
0.008321593515574932,
0.12298454344272614,
-0.1889374852180481,
0.01606791652739048,
0.07579832524061203,
0.04590589180588722,
0.09713518619537354,
0.008732178248465061,
-0.010185683146119118,
0.13112029433250427,
-0.12764482200145721,
0.09894125908613205,
0.031719133257865906,
-0.0945313349366188,
-0.20855404436588287,
-0.09499579668045044,
0.0066437916830182076,
0.04406974837183952,
0.0833093598484993,
0.010709049180150032,
0.15970121324062347,
-0.09742677956819534,
0.08367352187633514,
0.22910846769809723,
-0.2748223841190338,
-0.07948609441518784,
0.05366350710391998,
0.058803584426641464,
0.08302739262580872,
-0.1264169067144394,
-0.009488214738667011,
0.006656978745013475,
0.025532927364110947,
0.10369810461997986,
-0.032634783536195755,
-0.08897479623556137,
0.0026663574390113354,
-0.10501345992088318,
0.007465735077857971,
0.10995616018772125,
0.03347565233707428,
-0.05274935066699982,
-0.06989135593175888,
-0.044642750173807144,
-0.05253376066684723,
-0.03497619926929474,
-0.019439848139882088,
0.03926631063222885,
-0.058249522000551224,
-0.13785140216350555,
-0.04892521724104881,
-0.0487448126077652,
-0.09015504270792007,
-0.006677438970655203,
0.2185525745153427,
0.03609003871679306,
0.032336924225091934,
-0.050116054713726044,
0.10233241319656372,
0.00956419762223959,
-0.1268008053302765,
-0.028757311403751373,
-0.003927114885300398,
-0.09152902662754059,
-0.03584374487400055,
-0.06065988168120384,
0.021802624687552452,
0.03479817509651184,
0.21571822464466095,
-0.04276347905397415,
0.07826756685972214,
0.03198276087641716,
-0.01668510027229786,
-0.0276455357670784,
0.13827331364154816,
-0.021328480914235115,
-0.07441620528697968,
0.013108039274811745,
0.062244951725006104,
0.010773799382150173,
-0.004990557674318552,
-0.06448470801115036,
-0.039943840354681015,
0.06542149186134338,
0.04632735252380371,
-0.06233511120080948,
0.03010769747197628,
-0.007463809568434954,
-0.023338433355093002,
0.0007629350875504315,
-0.11518529057502747,
0.016656115651130676,
-0.009533394128084183,
-0.08134179562330246,
-0.04626185819506645,
0.007794675882905722,
-0.010988152585923672,
0.01193300448358059,
0.0951157957315445,
-0.07298204302787781,
-0.023638257756829262,
-0.08117999881505966,
-0.07492353022098541,
-0.017016369849443436,
-0.15606723725795746,
0.020269855856895447,
-0.06249123066663742,
-0.16248789429664612,
-0.030318139120936394,
0.055624332278966904,
-0.08211976289749146,
-0.027113813906908035,
-0.03178328648209572,
-0.07781177014112473,
0.02120056562125683,
0.002440324053168297,
0.21792127192020416,
-0.049506980925798416,
0.08841688185930252,
0.010753555223345757,
0.05750004202127457,
-0.010504388250410557,
0.03661801293492317,
-0.08572273701429367,
0.009267671033740044,
-0.17664936184883118,
0.07600615918636322,
-0.08009256422519684,
0.017328433692455292,
-0.13864536583423615,
-0.08558467030525208,
-0.010004017502069473,
-0.021950876340270042,
0.08038170635700226,
0.10278283804655075,
-0.14281681180000305,
-0.02087133750319481,
0.11521308869123459,
-0.0650089830160141,
-0.05397839844226837,
0.05787350609898567,
-0.07644332200288773,
0.08517342805862427,
0.05378943681716919,
0.1925426572561264,
0.08699125051498413,
-0.10503785312175751,
0.014589243568480015,
0.015071379952132702,
0.035649750381708145,
0.00136714824475348,
0.04999162629246712,
0.0064877914264798164,
0.02854110486805439,
0.016811786219477654,
-0.0720692053437233,
0.007429772987961769,
-0.09128478914499283,
-0.059249259531497955,
-0.04937855899333954,
-0.084152951836586,
-0.00709275109693408,
0.013861827552318573,
0.03572488948702812,
-0.08001532405614853,
-0.0838816910982132,
0.07623234391212463,
0.13669776916503906,
-0.047324247658252716,
0.01727122999727726,
-0.07554757595062256,
-0.002363203326240182,
-0.03164556622505188,
-0.02384631521999836,
-0.20359639823436737,
-0.056624483317136765,
0.03287264332175255,
-0.004226437769830227,
0.04514759033918381,
0.002882179571315646,
0.08416874706745148,
0.027465444058179855,
-0.054154861718416214,
-0.00004439988697413355,
-0.08526890724897385,
-0.00806351937353611,
-0.09511086344718933,
-0.22027677297592163,
-0.053444553166627884,
-0.03895697742700577,
0.13640397787094116,
-0.1668446958065033,
-0.004114287439733744,
-0.019223589450120926,
0.11552860587835312,
0.04180263355374336,
-0.05152176693081856,
-0.00507315481081605,
0.029299777001142502,
0.012690131552517414,
-0.09793461114168167,
0.03917135298252106,
0.01582702249288559,
-0.09361521154642105,
-0.030687443912029266,
-0.10350556671619415,
-0.007678750436753035,
0.06954921036958694,
0.06904423981904984,
-0.10226120799779892,
-0.015495505183935165,
-0.06183499097824097,
-0.02867933362722397,
-0.052444688975811005,
0.03778304159641266,
0.18301120400428772,
0.016993185505270958,
0.11085210740566254,
-0.07434260845184326,
-0.08307025581598282,
0.018663913011550903,
0.009395836852490902,
0.0625975951552391,
0.10147681087255478,
0.07591813802719116,
-0.11060827225446701,
0.057802774012088776,
0.09191323071718216,
-0.053893327713012695,
0.1371140480041504,
-0.04656016081571579,
-0.07270147651433945,
-0.03176713362336159,
-0.008349052630364895,
-0.006761870346963406,
0.14979961514472961,
-0.039006609469652176,
0.016535403206944466,
0.033690501004457474,
0.03585902974009514,
0.004747722763568163,
-0.16197016835212708,
-0.014630841091275215,
0.013520467095077038,
-0.047365088015794754,
-0.021594403311610222,
0.014707875438034534,
0.015697676688432693,
0.09493555128574371,
0.04096757993102074,
-0.004864073824137449,
0.007913850247859955,
-0.009557174518704414,
-0.04186781123280525,
0.20041300356388092,
-0.09133308380842209,
-0.042158689349889755,
-0.07688722759485245,
-0.0027580689638853073,
-0.023602614179253578,
-0.040405549108982086,
0.01515701599419117,
-0.09464164823293686,
-0.02605649083852768,
-0.07112114876508713,
-0.001071745646186173,
-0.04249046742916107,
0.01503359992057085,
0.004652244504541159,
0.013654577545821667,
0.05973256379365921,
-0.13256770372390747,
0.010510780848562717,
-0.06749996542930603,
-0.11228054016828537,
0.02965031936764717,
0.06121155992150307,
0.07943685352802277,
0.05657076835632324,
-0.0321807861328125,
0.018799133598804474,
-0.04062367230653763,
0.23111701011657715,
-0.07726986706256866,
0.012352356687188148,
0.12358955293893814,
0.026599250733852386,
0.04010744392871857,
0.1033121794462204,
0.03364015370607376,
-0.10094399005174637,
0.0384550616145134,
0.07671850174665451,
-0.04151562228798866,
-0.243988037109375,
0.008086267858743668,
-0.03908229246735573,
-0.09691254794597626,
0.08882778882980347,
0.05081617087125778,
-0.04558192566037178,
0.06441439688205719,
0.01100344117730856,
0.012649795971810818,
-0.0260719433426857,
0.08798644691705704,
0.09226130694150925,
0.06539355963468552,
0.1083686351776123,
-0.03774365782737732,
-0.017435748130083084,
0.06422383338212967,
0.029205061495304108,
0.3035115599632263,
-0.04919357970356941,
0.0861610546708107,
0.04750926047563553,
0.1404726356267929,
-0.022011272609233856,
0.04609198868274689,
0.006190111394971609,
-0.006695488467812538,
-0.03009803779423237,
-0.05366787314414978,
-0.020466050133109093,
0.0028708376921713352,
-0.07979460060596466,
0.04529943689703941,
-0.050328757613897324,
0.04971602186560631,
0.018192626535892487,
0.2905438244342804,
0.001424329588189721,
-0.26410502195358276,
-0.09391313791275024,
-0.015970060601830482,
-0.037845056504011154,
-0.052546169608831406,
0.010425104759633541,
0.12228092551231384,
-0.12197090685367584,
0.036962587386369705,
-0.07054278999567032,
0.08079639822244644,
-0.029003320261836052,
-0.0018379745306447148,
0.04234467074275017,
0.17412053048610687,
-0.020779289305210114,
0.05593901500105858,
-0.22218525409698486,
0.22608475387096405,
0.01572798378765583,
0.12838803231716156,
-0.058817822486162186,
0.009607023559510708,
0.026683900505304337,
0.004921878222376108,
0.08761856704950333,
-0.003838617354631424,
-0.06499971449375153,
-0.13671447336673737,
-0.052354808896780014,
0.07359865307807922,
0.14288997650146484,
-0.04057609289884567,
0.09652421623468399,
-0.058440543711185455,
0.01435790490359068,
0.03733241930603981,
-0.08859214931726456,
-0.13217908143997192,
-0.09866712242364883,
-0.023379482328891754,
0.015672434121370316,
-0.07210008054971695,
-0.05659393593668938,
-0.06888951361179352,
0.031916432082653046,
0.10643145442008972,
0.014442283660173416,
-0.03140658140182495,
-0.1475241780281067,
0.08046352118253708,
0.15582719445228577,
-0.06641753017902374,
0.025707604363560677,
-0.004178350325673819,
0.07484250515699387,
0.03936181589961052,
-0.08271464705467224,
0.06130977347493172,
-0.06532955914735794,
-0.1782185584306717,
-0.048267483711242676,
0.09398489445447922,
0.06887778639793396,
0.041584562510252,
-0.0027072669472545385,
0.05284397304058075,
-0.02725793793797493,
-0.09349415451288223,
0.024877779185771942,
0.02119954116642475,
0.036384448409080505,
0.03726606070995331,
-0.08663848787546158,
0.07719024270772934,
-0.035940930247306824,
-0.013997972942888737,
0.11381138116121292,
0.23024354875087738,
-0.10160953551530838,
0.0971204861998558,
0.06098587065935135,
-0.061800964176654816,
-0.16041582822799683,
0.07168178260326385,
0.10393694043159485,
0.007716638967394829,
0.07030016928911209,
-0.21187524497509003,
0.1304783821105957,
0.10437341034412384,
-0.01612403430044651,
0.04225730150938034,
-0.26983627676963806,
-0.11934354156255722,
0.04404017701745033,
0.12884216010570526,
0.10037405788898468,
-0.12579715251922607,
-0.01387543324381113,
-0.015623677521944046,
-0.11234299093484879,
0.0935937836766243,
-0.11458264291286469,
0.1364590972661972,
-0.030033648014068604,
0.11141383647918701,
0.010353180579841137,
-0.025692876428365707,
0.10242170095443726,
0.046509694308042526,
0.10261344909667969,
-0.042107727378606796,
0.00683947280049324,
0.064395971596241,
-0.04757590591907501,
0.002556087914854288,
-0.07761628180742264,
0.08774520456790924,
-0.13261212408542633,
-0.0037721379194408655,
-0.08978234976530075,
0.046231720596551895,
-0.03824976831674576,
-0.0681840181350708,
-0.04101093113422394,
0.05541128292679787,
0.04652237147092819,
-0.036337368190288544,
0.04502204805612564,
-0.01768130250275135,
0.10250651091337204,
0.039749808609485626,
0.08532308042049408,
0.012548662722110748,
-0.04550385847687721,
0.023531407117843628,
-0.00974743440747261,
0.0644938200712204,
-0.16453756392002106,
0.011141412891447544,
0.09937403351068497,
0.06039123237133026,
0.09718833118677139,
0.044376738369464874,
-0.04482831060886383,
0.017846381291747093,
0.029583757743239403,
-0.10103250294923782,
-0.10759109258651733,
0.04731755331158638,
-0.02779114991426468,
-0.13886377215385437,
0.049109313637018204,
0.12122474610805511,
-0.04405496269464493,
-0.02644204907119274,
-0.017098084092140198,
0.0037481263279914856,
-0.023130059242248535,
0.18257999420166016,
0.04888267070055008,
0.05502655729651451,
-0.10227591544389725,
0.12679444253444672,
0.02850627526640892,
-0.018411656841635704,
0.05205818638205528,
0.08412789553403854,
-0.10291577875614166,
-0.0034080378245562315,
0.08289438486099243,
0.14001572132110596,
-0.05884065479040146,
-0.006099378224462271,
-0.1046886071562767,
-0.0821182131767273,
0.050063032656908035,
0.1416940689086914,
0.04970288276672363,
-0.017282214015722275,
-0.05432555451989174,
0.03952848166227341,
-0.14058953523635864,
0.07049315422773361,
0.02281070128083229,
0.0642179399728775,
-0.0757499411702156,
0.05942544341087341,
0.007748454809188843,
0.012986338697373867,
-0.017305122688412666,
0.008500626310706139,
-0.09301568567752838,
-0.01664717122912407,
-0.07976441085338593,
-0.0018300468800589442,
0.001479474944062531,
0.017284898087382317,
-0.020453378558158875,
-0.07124905288219452,
-0.048391297459602356,
0.03794810175895691,
-0.08701744675636292,
-0.050337813794612885,
0.009796669706702232,
0.03999623283743858,
-0.12289367616176605,
-0.005947264842689037,
0.021047353744506836,
-0.09250092506408691,
0.09546412527561188,
0.07270806282758713,
0.017270509153604507,
0.030269237235188484,
-0.1245931088924408,
-0.033592961728572845,
-0.00990353338420391,
-0.007912113331258297,
0.062180232256650925,
-0.093268483877182,
-0.008251671679317951,
-0.037387460470199585,
0.07104656100273132,
0.012762150727212429,
0.0709020346403122,
-0.13398492336273193,
0.018960338085889816,
-0.07699918746948242,
-0.047902803868055344,
-0.07559382170438766,
0.03586998209357262,
0.09692484140396118,
0.06059792637825012,
0.1508316546678543,
-0.07663784921169281,
0.0229679923504591,
-0.20650747418403625,
-0.03470934182405472,
-0.006109299603849649,
-0.06104932352900505,
-0.15124231576919556,
-0.04783026874065399,
0.08081279695034027,
-0.03735126554965973,
0.09449765831232071,
-0.018069246783852577,
0.07553748786449432,
0.03684196621179581,
-0.04676709324121475,
-0.05011948570609093,
-0.016806084662675858,
0.19866542518138885,
0.0730648934841156,
-0.0157020166516304,
0.11102985590696335,
0.0022336936090141535,
0.030443070456385612,
0.05239611491560936,
0.17871485650539398,
0.21379871666431427,
0.031018296256661415,
0.054737985134124756,
0.06434228271245956,
-0.07597315311431885,
-0.07147540152072906,
0.18086068332195282,
-0.015438989736139774,
0.07080039381980896,
-0.04973131790757179,
0.19807541370391846,
0.10815983265638351,
-0.16719019412994385,
0.04547926411032677,
-0.04273844510316849,
-0.08103197067975998,
-0.12516823410987854,
-0.009066850878298283,
-0.08689090609550476,
-0.12706422805786133,
0.03747016191482544,
-0.11667530238628387,
0.0537106990814209,
0.10808238387107849,
0.012827110476791859,
0.036740560084581375,
0.12608662247657776,
-0.01829940639436245,
0.0032654821407049894,
0.06372673064470291,
0.005228996742516756,
-0.01292486023157835,
-0.03615584596991539,
-0.08057078719139099,
0.050086382776498795,
0.0009568947134539485,
0.07884661853313446,
-0.04711670055985451,
-0.014398133382201195,
0.026108382269740105,
-0.02813149429857731,
-0.08039356768131256,
0.027433570474386215,
0.04097554087638855,
0.055129896849393845,
0.04705449938774109,
0.04557569697499275,
-0.009139930829405785,
-0.033651068806648254,
0.31781747937202454,
-0.06658177077770233,
-0.09807242453098297,
-0.12050989270210266,
0.22050738334655762,
0.028729714453220367,
-0.030715350061655045,
0.033440928906202316,
-0.08217179030179977,
-0.009004480205476284,
0.15743707120418549,
0.16559074819087982,
-0.07266835123300552,
-0.022806445136666298,
-0.005191870499402285,
-0.018363313749432564,
-0.037281326949596405,
0.12873101234436035,
0.08989742398262024,
-0.025168543681502342,
-0.062359560281038284,
-0.014295588247478008,
-0.018945535644888878,
-0.03149406239390373,
-0.041212745010852814,
0.04011671990156174,
0.015672096982598305,
-0.024655083194375038,
-0.04127315804362297,
0.0727323666214943,
0.006417648866772652,
-0.2573140263557434,
0.06905579566955566,
-0.15725082159042358,
-0.17175817489624023,
-0.043930064886808395,
0.03677654638886452,
-0.0011801280779764056,
0.05689859017729759,
-0.017815440893173218,
0.00949773471802473,
0.07750380039215088,
-0.01869003102183342,
-0.03269120305776596,
-0.12465515732765198,
0.1236458271741867,
-0.06600455939769745,
0.170551598072052,
-0.02857203409075737,
0.0495290532708168,
0.11571045964956284,
0.028380388393998146,
-0.13536813855171204,
0.041939396411180496,
0.05280708894133568,
-0.10626641660928726,
0.015506389550864697,
0.15210367739200592,
-0.04643615707755089,
0.09283150732517242,
0.04192269593477249,
-0.1081656962633133,
0.003648574696853757,
-0.05677710846066475,
-0.03464784100651741,
-0.08072412014007568,
-0.014795854687690735,
-0.06081723794341087,
0.16895006597042084,
0.2198711484670639,
-0.029358619824051857,
0.011114922352135181,
-0.10243074595928192,
0.016716651618480682,
0.07048354297876358,
0.02898446097970009,
-0.057621877640485764,
-0.18918325006961823,
0.012003219686448574,
0.07014651596546173,
-0.007531095761805773,
-0.24142806231975555,
-0.07592915743589401,
0.038462281227111816,
-0.035876888781785965,
-0.04056418687105179,
0.10318733006715775,
0.04040515422821045,
0.05221635103225708,
-0.03007686510682106,
-0.162374809384346,
-0.03166843205690384,
0.1528569757938385,
-0.1750006377696991,
-0.035959597676992416
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0

This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
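
### Usage sketch

The checkpoint can also be queried without the pipeline by decoding the start/end logits directly. A minimal sketch, assuming an illustrative question/context pair:

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

name = "anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForQuestionAnswering.from_pretrained(name)

question = "How many training examples were used?"  # illustrative
context = "The model was fine-tuned on 1024 SQuAD training examples."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode that span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```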
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0

This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0

### Training results

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09576744586229324,
0.11468048393726349,
-0.002340586157515645,
0.09170104563236237,
0.11954128742218018,
0.022045468911528587,
0.10058967769145966,
0.12874136865139008,
-0.09696414321660995,
0.08775705099105835,
0.08829329162836075,
0.039408113807439804,
0.04770467430353165,
0.1452750414609909,
-0.020159533247351646,
-0.25885388255119324,
0.01115094218403101,
-0.003743123495951295,
-0.03348991274833679,
0.1117706298828125,
0.08511916548013687,
-0.11039765179157257,
0.08658658713102341,
0.014995593577623367,
-0.15353001654148102,
0.018911249935626984,
-0.03725937753915787,
-0.03517088666558266,
0.1133965253829956,
-0.03238152712583542,
0.10844819992780685,
0.02465556003153324,
0.13305774331092834,
-0.2104494273662567,
0.005048088729381561,
0.07441086322069168,
0.04536287486553192,
0.10051100701093674,
0.051707666367292404,
0.014792181551456451,
0.09041202813386917,
-0.15287652611732483,
0.09309655427932739,
0.03002357855439186,
-0.09199938178062439,
-0.13158193230628967,
-0.09657733142375946,
0.026137614622712135,
0.05122321471571922,
0.0696895569562912,
0.0013820300810039043,
0.1519215852022171,
-0.05922378972172737,
0.07902445644140244,
0.2681562304496765,
-0.3262445628643036,
-0.06388477981090546,
0.03223168104887009,
0.06075282022356987,
0.052377644926309586,
-0.12314010411500931,
-0.006122417747974396,
0.02694045566022396,
0.02974182553589344,
0.11769794672727585,
-0.016491420567035675,
-0.11205166578292847,
-0.013216042891144753,
-0.12751396000385284,
-0.0010968464193865657,
0.07111147046089172,
0.035508934408426285,
-0.05174647644162178,
-0.09549855440855026,
-0.07555346190929413,
-0.09446857869625092,
-0.024501321837306023,
-0.06418273597955704,
0.05643945187330246,
-0.0546041838824749,
-0.08073177188634872,
-0.035804178565740585,
-0.05660358443856239,
-0.07573144137859344,
-0.018928667530417442,
0.15718460083007812,
0.03972969576716423,
0.02063017524778843,
-0.03294786065816879,
0.10832837969064713,
0.003569032996892929,
-0.1422768533229828,
-0.015915781259536743,
-0.0014421081868931651,
-0.09647668153047562,
-0.04635391756892204,
-0.050379183143377304,
-0.01855849288403988,
0.010900917463004589,
0.1778838038444519,
-0.0804692804813385,
0.07591865211725235,
0.00956409890204668,
-0.028975404798984528,
-0.006472093518823385,
0.1481758952140808,
-0.04389249533414841,
-0.046963997185230255,
-0.01022338680922985,
0.07325030863285065,
0.003038321156054735,
-0.015933450311422348,
-0.06456021964550018,
-0.026796681806445122,
0.10251335054636002,
0.04585769772529602,
-0.05816284567117691,
0.04026048257946968,
-0.02315027453005314,
-0.028204740956425667,
0.017413029447197914,
-0.11504321545362473,
0.044632311910390854,
-0.0014059711247682571,
-0.08408794552087784,
-0.0010101625230163336,
0.001389809069223702,
-0.005594128742814064,
-0.008373178541660309,
0.1113344207406044,
-0.09988167881965637,
-0.0028121534269303083,
-0.06384032219648361,
-0.08359146118164062,
0.009141416288912296,
-0.15549664199352264,
-0.016696136444807053,
-0.057876043021678925,
-0.16920380294322968,
-0.0329129695892334,
0.0371868871152401,
-0.07387363910675049,
-0.007608313113451004,
-0.048677463084459305,
-0.06598801165819168,
0.025565272197127342,
-0.014459178782999516,
0.17321917414665222,
-0.054066937416791916,
0.07280470430850983,
-0.0012879067799076438,
0.04555352032184601,
0.014661324210464954,
0.035707250237464905,
-0.10519411414861679,
0.025329353287816048,
-0.13651645183563232,
0.06847735494375229,
-0.08447595685720444,
-0.00245445198379457,
-0.13278378546237946,
-0.09779989719390869,
0.010126858949661255,
-0.021985504776239395,
0.09096921235322952,
0.13858626782894135,
-0.193800687789917,
-0.018663959577679634,
0.12813739478588104,
-0.075081966817379,
-0.0647541731595993,
0.06233410909771919,
-0.060866471379995346,
0.02907785028219223,
0.05196687951683998,
0.2105502337217331,
0.037970203906297684,
-0.1683908849954605,
-0.03344261273741722,
-0.007554526906460524,
0.04005330055952072,
0.027521811425685883,
0.03957563638687134,
0.0035858836490660906,
0.06530531495809555,
0.013945075683295727,
-0.07525838166475296,
-0.03252128139138222,
-0.09140635281801224,
-0.06487278640270233,
-0.055063776671886444,
-0.07174380123615265,
0.0400124192237854,
0.0037195920012891293,
0.0417894683778286,
-0.06497275829315186,
-0.10231924057006836,
0.12001815438270569,
0.09613519161939621,
-0.04814758524298668,
0.037381336092948914,
-0.0791458934545517,
0.020592479035258293,
-0.019058464094996452,
-0.03888219594955444,
-0.20632989704608917,
-0.13035888969898224,
0.05182681977748871,
-0.05731307342648506,
0.03355761244893074,
0.005443235393613577,
0.08136522024869919,
0.06144070625305176,
-0.043206751346588135,
-0.011722821742296219,
-0.09376531094312668,
0.0036585961934179068,
-0.1170228123664856,
-0.18907423317432404,
-0.07752088457345963,
-0.03981639817357063,
0.09406571090221405,
-0.17406098544597626,
-0.006709451321512461,
0.015986988320946693,
0.1446373611688614,
0.02780107781291008,
-0.06902378052473068,
-0.0035002187360078096,
0.03819539025425911,
0.0017507752636447549,
-0.09567268937826157,
0.04477517306804657,
0.00834619253873825,
-0.09280639886856079,
-0.06241526082158089,
-0.1369103193283081,
-0.011343017220497131,
0.061018284410238266,
0.05224481225013733,
-0.09696532785892487,
-0.04532743990421295,
-0.07089703530073166,
-0.04032226651906967,
-0.07593759894371033,
0.012528233230113983,
0.20080234110355377,
0.03627773001790047,
0.11314623802900314,
-0.06679286062717438,
-0.07781931012868881,
-0.003220879938453436,
0.02216663770377636,
0.01219023484736681,
0.07684972882270813,
0.0407971553504467,
-0.053316812962293625,
0.0752822682261467,
0.09822073578834534,
-0.022366859018802643,
0.12467356771230698,
-0.046263620257377625,
-0.08378750085830688,
-0.0337945856153965,
-0.02462536096572876,
-0.027731705456972122,
0.12382517009973526,
-0.040028639137744904,
0.005372073035687208,
0.036830294877290726,
0.045054271817207336,
0.017082374542951584,
-0.16134953498840332,
0.008214647881686687,
0.02196124754846096,
-0.05320782586932182,
-0.03644939884543419,
-0.00025640736566856503,
0.027403971180319786,
0.0921863541007042,
0.031915534287691116,
-0.012552307918667793,
0.002429254585877061,
-0.012160375714302063,
-0.06179340183734894,
0.18459093570709229,
-0.09907373785972595,
-0.0852588340640068,
-0.07580708712339401,
0.004912218544632196,
-0.06004919856786728,
-0.03580336645245552,
0.016856037080287933,
-0.08728981763124466,
-0.039594102650880814,
-0.08759178221225739,
-0.01813201792538166,
-0.01789945922791958,
0.02048475109040737,
0.030663222074508667,
-0.022537225857377052,
0.08054482936859131,
-0.14004011452198029,
0.0009506485075689852,
-0.05143725126981735,
-0.09198423475027084,
-0.0008936800877563655,
0.07482413947582245,
0.09820643067359924,
0.07998987287282944,
-0.016398828476667404,
0.029887249693274498,
-0.03414296358823776,
0.24251458048820496,
-0.04612403362989426,
0.010578268207609653,
0.103849396109581,
-0.013427114114165306,
0.05662493407726288,
0.09505375474691391,
0.03781062364578247,
-0.09412509948015213,
0.020580654963850975,
0.08343600481748581,
-0.029081784188747406,
-0.2292943149805069,
-0.025762589648365974,
-0.005233811214566231,
-0.07908666133880615,
0.10590609163045883,
0.0323156975209713,
-0.03586707264184952,
0.04477357491850853,
0.020118365064263344,
0.0014803686644881964,
-0.05540580675005913,
0.08223947882652283,
0.07610295712947845,
0.05698460340499878,
0.09955929219722748,
-0.008871247991919518,
-0.027319425716996193,
0.062125783413648605,
0.006479670759290457,
0.24681751430034637,
-0.024863041937351227,
0.10090005397796631,
0.03327678516507149,
0.1508781760931015,
-0.02676442638039589,
0.06397991627454758,
0.0029745076317340136,
-0.009304975159466267,
-0.014476358890533447,
-0.06736817955970764,
-0.0254717618227005,
0.023031998425722122,
-0.046476393938064575,
0.02990473248064518,
-0.08133634924888611,
0.02426956780254841,
0.028678249567747116,
0.27965784072875977,
0.033760931342840195,
-0.27394023537635803,
-0.06671545654535294,
-0.013327234424650669,
-0.041547901928424835,
-0.06454471498727798,
0.00596475088968873,
0.11976215988397598,
-0.133131742477417,
0.06502876430749893,
-0.07571874558925629,
0.09010512381792068,
-0.03870095685124397,
0.011022737249732018,
0.04706767946481705,
0.15289835631847382,
-0.018693316727876663,
0.04960223287343979,
-0.1853923499584198,
0.24264970421791077,
0.024709507822990417,
0.10775819420814514,
-0.06425792723894119,
0.010279903188347816,
0.01909187063574791,
0.007051264401525259,
0.10861487686634064,
0.0011829260038211942,
-0.06677377969026566,
-0.13772888481616974,
-0.09963234513998032,
0.046286728233098984,
0.14156083762645721,
-0.03471921756863594,
0.0993356704711914,
-0.028711309656500816,
0.012835296802222729,
0.033704452216625214,
-0.028955549001693726,
-0.1561288982629776,
-0.07345131784677505,
0.010794148780405521,
0.027870291844010353,
-0.014579524286091328,
-0.05177300423383713,
-0.10377950966358185,
-0.03993266075849533,
0.11950032413005829,
0.0015958313597366214,
-0.0458303801715374,
-0.15095043182373047,
0.082959845662117,
0.14594630897045135,
-0.05825057625770569,
0.016029465943574905,
0.01339106447994709,
0.1114502027630806,
0.03260643407702446,
-0.08569825440645218,
0.06709466129541397,
-0.053077343851327896,
-0.17454031109809875,
-0.058872733265161514,
0.11887817084789276,
0.07981962710618973,
0.04500146582722664,
0.00029080986860208213,
0.05841121822595596,
0.0008379808277823031,
-0.0971808061003685,
0.0363629013299942,
0.005616332869976759,
0.05213777348399162,
0.028099285438656807,
-0.08441117405891418,
0.07668805867433548,
-0.03421638533473015,
0.017507219687104225,
0.12852779030799866,
0.2340824156999588,
-0.09927263855934143,
0.10320127010345459,
0.0796542838215828,
-0.07600070536136627,
-0.15863117575645447,
0.06071452423930168,
0.1251213252544403,
0.004235071130096912,
0.08290187269449234,
-0.19910404086112976,
0.13364839553833008,
0.10774104297161102,
-0.013913869857788086,
0.022518960759043694,
-0.27156469225883484,
-0.13256096839904785,
0.06535875797271729,
0.10953164100646973,
0.04948001354932785,
-0.12354845553636551,
-0.03541599214076996,
-0.010978924110531807,
-0.12185461819171906,
0.12949667870998383,
-0.07633496075868607,
0.11738789081573486,
-0.021618958562612534,
0.12236003577709198,
0.024610906839370728,
-0.037353869527578354,
0.1137278750538826,
0.07072025537490845,
0.08591299504041672,
-0.04028743505477905,
-0.0037318149115890265,
0.06461486220359802,
-0.06269741803407669,
0.03525133430957794,
-0.03731700778007507,
0.06251677125692368,
-0.14833714067935944,
0.00715581513941288,
-0.07829436659812927,
0.060317836701869965,
-0.04663687199354172,
-0.0652814731001854,
-0.028331469744443893,
0.047279294580221176,
0.07351109385490417,
-0.03555982932448387,
0.047408610582351685,
0.0075376094318926334,
0.09246858954429626,
0.10081774741411209,
0.07206286489963531,
-0.023720690980553627,
-0.08330341428518295,
0.01431850902736187,
0.004174280911684036,
0.0472787581384182,
-0.08468317240476608,
0.015806833282113075,
0.14624203741550446,
0.06027130410075188,
0.10237377136945724,
0.046008482575416565,
-0.04363885521888733,
0.005611601751297712,
0.01759812980890274,
-0.1436263918876648,
-0.09843015670776367,
0.028681360185146332,
-0.05812222510576248,
-0.15329639613628387,
0.03269435465335846,
0.12335292249917984,
-0.03649122267961502,
-0.016074245795607567,
-0.006131071597337723,
0.009253283962607384,
-0.01157807931303978,
0.18378590047359467,
0.0421026274561882,
0.05424177274107933,
-0.09121686220169067,
0.1140940859913826,
0.036468036472797394,
-0.04184512794017792,
0.05461817979812622,
0.06840144097805023,
-0.09891267865896225,
0.013750866986811161,
0.07209935039281845,
0.1500062197446823,
-0.06723038107156754,
-0.012985272333025932,
-0.09151598066091537,
-0.0764756128191948,
0.04487452283501625,
0.14586767554283142,
0.0525674968957901,
-0.003954110201448202,
-0.060453422367572784,
0.03601637855172157,
-0.1181236281991005,
0.0678531751036644,
0.05284116789698601,
0.08183112740516663,
-0.10757846385240555,
0.1251089721918106,
-0.006982486229389906,
0.022331029176712036,
-0.02788824774324894,
0.018086010590195656,
-0.10143543034791946,
-0.034105755388736725,
-0.10883224010467529,
-0.01411820761859417,
-0.01793917641043663,
-0.0035015575122088194,
-0.019481725990772247,
-0.07561437785625458,
-0.043097835034132004,
0.03262104466557503,
-0.07683594524860382,
-0.048855800181627274,
0.017537016421556473,
0.0399819016456604,
-0.16210144758224487,
0.0031869281083345413,
0.026455018669366837,
-0.08794190734624863,
0.08768825232982635,
0.06994185596704483,
0.015386300161480904,
0.027664758265018463,
-0.12523603439331055,
-0.03320692479610443,
-0.00016337547276634723,
0.010297550819814205,
0.07761885225772858,
-0.09408155083656311,
-0.029526451602578163,
-0.031056847423315048,
0.04815998673439026,
0.01557888276875019,
0.10278592258691788,
-0.11910045146942139,
-0.013048570603132248,
-0.04505518451333046,
-0.03825640305876732,
-0.056896816939115524,
0.02627047523856163,
0.11480523645877838,
0.04493294283747673,
0.15714223682880402,
-0.07065059244632721,
0.054968129843473434,
-0.2042478322982788,
-0.032477691769599915,
0.011365530081093311,
-0.04750712215900421,
-0.07341066747903824,
-0.04434286430478096,
0.08415554463863373,
-0.05037878081202507,
0.12207378447055817,
-0.016033165156841278,
0.09185023605823517,
0.04451938718557358,
-0.004643712658435106,
-0.07221046090126038,
-0.01199114415794611,
0.18418771028518677,
0.05745857208967209,
-0.021155159920454025,
0.12054263800382614,
0.0036336821503937244,
0.042104579508304596,
0.06711295247077942,
0.2349616140127182,
0.15127965807914734,
-0.01134763564914465,
0.07448845356702805,
0.06631748378276825,
-0.07533697038888931,
-0.1406802237033844,
0.12120454013347626,
-0.020967556163668633,
0.1066109910607338,
-0.05313962697982788,
0.18958573043346405,
0.03908335044980049,
-0.17661131918430328,
0.054480183869600296,
-0.025139791890978813,
-0.10727158188819885,
-0.12582597136497498,
-0.016335008665919304,
-0.08140986412763596,
-0.11591697484254837,
0.027955064550042152,
-0.12324239313602448,
0.06947451084852219,
0.09487960487604141,
0.007695368491113186,
0.035590045154094696,
0.18336203694343567,
-0.057933684438467026,
0.01076994463801384,
0.07160687446594238,
0.021379973739385605,
-0.004547245800495148,
-0.040591537952423096,
-0.06688924133777618,
0.036498550325632095,
0.043631959706544876,
0.07137754559516907,
-0.04949859902262688,
0.009864810854196548,
0.015284990891814232,
-0.010367963463068008,
-0.07820659130811691,
0.008213866502046585,
0.014584175311028957,
0.048828307539224625,
0.037393391132354736,
0.04725881665945053,
0.008776976726949215,
-0.05348455160856247,
0.2754664421081543,
-0.06778807193040848,
-0.06276202201843262,
-0.12310342490673065,
0.19430038332939148,
0.03339528292417526,
-0.019158542156219482,
0.05487499386072159,
-0.09304685145616531,
-0.013204065151512623,
0.16282930970191956,
0.13523630797863007,
-0.09053792804479599,
-0.02071254514157772,
-0.023693779483437538,
-0.00851933192461729,
-0.012484564445912838,
0.10371909290552139,
0.07122514396905899,
0.0027720278594642878,
-0.06591907143592834,
-0.013082691468298435,
-0.029779788106679916,
-0.046912435442209244,
-0.0630667582154274,
0.05939329043030739,
0.026529423892498016,
-0.006819199770689011,
-0.05927903205156326,
0.06283263862133026,
-0.00525028258562088,
-0.23642463982105255,
0.03886676952242851,
-0.17377841472625732,
-0.1733560711145401,
-0.013540208339691162,
0.07051293551921844,
0.001175587298348546,
0.05626104772090912,
-0.005502867046743631,
0.009372676722705364,
0.11615694314241409,
-0.016597378998994827,
-0.013855354860424995,
-0.11780870705842972,
0.10873064398765564,
-0.10812806338071823,
0.21342246234416962,
-0.0017340714111924171,
0.06375201046466827,
0.09940726310014725,
0.03820689767599106,
-0.13495604693889618,
0.018982160836458206,
0.06226719543337822,
-0.1272907555103302,
0.0008397088386118412,
0.14506250619888306,
-0.03458708897233009,
0.062431253492832184,
0.03154336288571358,
-0.14906641840934753,
-0.0025668530724942684,
0.027744406834244728,
-0.03766250982880592,
-0.06896806508302689,
-0.010325399227440357,
-0.055701564997434616,
0.1657790094614029,
0.20699484646320343,
-0.02885548397898674,
0.011708828620612621,
-0.08458098024129868,
0.02161383256316185,
0.04824792221188545,
0.05716715380549431,
-0.039775408804416656,
-0.21632502973079681,
0.022660288959741592,
0.07289789617061615,
-0.002774752676486969,
-0.19449937343597412,
-0.09607469290494919,
0.042696017771959305,
-0.035917699337005615,
-0.04611362889409065,
0.09121909737586975,
0.023737553507089615,
0.03708309307694435,
-0.019346049055457115,
-0.11458270251750946,
-0.026861337944865227,
0.146287202835083,
-0.17550520598888397,
-0.04278471693396568
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
SpanBERT (base, cased) fine-tuned for extractive question answering on a few-shot sample of SQuAD with k = 1024 training examples; the `seed-10` suffix presumably identifies the sampled subset (the Trainer seed listed below is 42).
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
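## How to use
A minimal inference sketch (assuming the checkpoint is published on the Hub under the repository name above; the question/context pair is a placeholder, not taken from the training data):
```python
# Hedged sketch: load the fine-tuned checkpoint with the standard
# question-answering pipeline and run a single placeholder query.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10",
)

result = qa(
    question="What was the model fine-tuned on?",           # placeholder question
    context="This SpanBERT variant was fine-tuned on the "  # placeholder context
            "SQuAD question answering dataset.",
)
print(result["answer"], result["score"])
```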
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09655676782131195,
0.1142968162894249,
-0.002274144673720002,
0.0911821499466896,
0.11933670192956924,
0.02208700403571129,
0.1010490283370018,
0.12764598429203033,
-0.0960942879319191,
0.08643737435340881,
0.08722340315580368,
0.04003063216805458,
0.0482400543987751,
0.14665091037750244,
-0.019209817051887512,
-0.2597530484199524,
0.011037024669349194,
-0.0029997341334819794,
-0.032051410526037216,
0.11163952946662903,
0.08557623624801636,
-0.11047720909118652,
0.08578779548406601,
0.015030854381620884,
-0.1530047208070755,
0.019259896129369736,
-0.037601813673973083,
-0.03494279459118843,
0.11333432793617249,
-0.03384075686335564,
0.10811575502157211,
0.02445901930332184,
0.1351083517074585,
-0.20948560535907745,
0.005059713032096624,
0.0735098272562027,
0.045345619320869446,
0.10020213574171066,
0.05110027268528938,
0.015657462179660797,
0.08978472650051117,
-0.15365652740001678,
0.09390510618686676,
0.029088586568832397,
-0.09155835211277008,
-0.12946920096874237,
-0.09647271037101746,
0.026942988857626915,
0.05351684242486954,
0.069142185151577,
0.0009595334413461387,
0.1519186794757843,
-0.05999792739748955,
0.0789649710059166,
0.2675192356109619,
-0.32697105407714844,
-0.06377194821834564,
0.03405250981450081,
0.06244570016860962,
0.05280574783682823,
-0.12340003252029419,
-0.006995449308305979,
0.02747231535613537,
0.029044969007372856,
0.11876724660396576,
-0.016980275511741638,
-0.11176177114248276,
-0.0134039968252182,
-0.1283845454454422,
-0.0002989306522067636,
0.07162509858608246,
0.036298878490924835,
-0.05178166553378105,
-0.09700250625610352,
-0.07525802403688431,
-0.09501292556524277,
-0.02545638382434845,
-0.06538581103086472,
0.05628334730863571,
-0.055146943777799606,
-0.0802718847990036,
-0.03605605289340019,
-0.05599287524819374,
-0.07625481486320496,
-0.01719222404062748,
0.15533600747585297,
0.040043905377388,
0.019739296287298203,
-0.03245508670806885,
0.10817588120698929,
0.0013899237383157015,
-0.14190369844436646,
-0.014995411038398743,
-0.0016505037201568484,
-0.09803329408168793,
-0.047541048377752304,
-0.05064080283045769,
-0.017919551581144333,
0.009506071917712688,
0.1785968840122223,
-0.0774223580956459,
0.07591091841459274,
0.011443659663200378,
-0.029733875766396523,
-0.006006614770740271,
0.14821453392505646,
-0.04470027610659599,
-0.049144815653562546,
-0.01058581005781889,
0.0743015855550766,
0.002584741683676839,
-0.01471169013530016,
-0.06539276987314224,
-0.027355702593922615,
0.10281965136528015,
0.04570762813091278,
-0.059668730944395065,
0.04043359309434891,
-0.022265667095780373,
-0.028100337833166122,
0.018610596656799316,
-0.11568168550729752,
0.044851336628198624,
-0.002150028944015503,
-0.08522476255893707,
-0.0022061988711357117,
0.0000320594226650428,
-0.004525577183812857,
-0.008354878053069115,
0.11001361906528473,
-0.09959185123443604,
-0.003242950886487961,
-0.0642910897731781,
-0.08416473865509033,
0.009265333414077759,
-0.15872086584568024,
-0.015402058139443398,
-0.05694787576794624,
-0.17260807752609253,
-0.033663585782051086,
0.03626579791307449,
-0.07302627712488174,
-0.00833052396774292,
-0.04985859990119934,
-0.06576365232467651,
0.02387521229684353,
-0.013830466195940971,
0.1741776168346405,
-0.05310210958123207,
0.07169263809919357,
-0.0014257048023864627,
0.04643426090478897,
0.015038599260151386,
0.036048226058483124,
-0.10474056750535965,
0.02511974237859249,
-0.13545389473438263,
0.0686909407377243,
-0.08472664654254913,
-0.0006865188479423523,
-0.13241855800151825,
-0.09795979410409927,
0.008412966504693031,
-0.02180613949894905,
0.0906757041811943,
0.13863040506839752,
-0.1948315054178238,
-0.017716815695166588,
0.12937040627002716,
-0.07456348836421967,
-0.06402880698442459,
0.06125136837363243,
-0.06134530529379845,
0.030464615672826767,
0.053593408316373825,
0.21001993119716644,
0.040484052151441574,
-0.16764231026172638,
-0.03383199870586395,
-0.007318513933569193,
0.03988488391041756,
0.02577435038983822,
0.039643749594688416,
0.004954935051500797,
0.06645344942808151,
0.014055228792130947,
-0.0745544582605362,
-0.03257332369685173,
-0.09073987603187561,
-0.0652792677283287,
-0.0546397902071476,
-0.07265562564134598,
0.04041657969355583,
0.004168370272964239,
0.04212876781821251,
-0.06510581821203232,
-0.10221676528453827,
0.12070085108280182,
0.09683629870414734,
-0.048086997121572495,
0.035938896238803864,
-0.07918824255466461,
0.020114073529839516,
-0.01986716128885746,
-0.039008788764476776,
-0.2061450183391571,
-0.1286170780658722,
0.05218609422445297,
-0.05762225389480591,
0.03336821123957634,
0.007564500439912081,
0.08204235136508942,
0.06105874106287956,
-0.04295025020837784,
-0.012282575480639935,
-0.09372847527265549,
0.0035631379578262568,
-0.11796290427446365,
-0.18761079013347626,
-0.07849137485027313,
-0.04034708812832832,
0.09497299790382385,
-0.17540961503982544,
-0.005858704913407564,
0.014237714000046253,
0.14422862231731415,
0.027103254571557045,
-0.0691009908914566,
-0.0026222558226436377,
0.03727811574935913,
0.0025178203359246254,
-0.09606505930423737,
0.04449110105633736,
0.007305992301553488,
-0.09256128966808319,
-0.06405869871377945,
-0.1379978507757187,
-0.012958473525941372,
0.06036977097392082,
0.054744746536016464,
-0.09668396413326263,
-0.04583190381526947,
-0.07082682102918625,
-0.03970113396644592,
-0.07484474778175354,
0.011961935088038445,
0.19999487698078156,
0.03538147360086441,
0.11257809400558472,
-0.06686614453792572,
-0.07886402308940887,
-0.0034119300544261932,
0.02365736849606037,
0.012835294008255005,
0.076418437063694,
0.04058687016367912,
-0.053327564150094986,
0.07470706850290298,
0.09915832430124283,
-0.02182777225971222,
0.12407203763723373,
-0.04650173708796501,
-0.08449742197990417,
-0.03346618264913559,
-0.02500254474580288,
-0.02878105640411377,
0.12351522594690323,
-0.04038692265748978,
0.0036945445463061333,
0.036412958055734634,
0.043787505477666855,
0.017038637772202492,
-0.16168789565563202,
0.0085074994713068,
0.022071927785873413,
-0.05234319344162941,
-0.03782987594604492,
-0.0011868939036503434,
0.02628861926496029,
0.09154237806797028,
0.03128354623913765,
-0.01316425297409296,
0.001810763729736209,
-0.011960819363594055,
-0.06138022243976593,
0.18460281193256378,
-0.09795370697975159,
-0.08409031480550766,
-0.07519474625587463,
0.005675516091287136,
-0.058151762932538986,
-0.035847801715135574,
0.01564619690179825,
-0.08737270534038544,
-0.0390474870800972,
-0.08716695010662079,
-0.01900274120271206,
-0.017732184380292892,
0.01976367086172104,
0.03219614550471306,
-0.021832119673490524,
0.07879237085580826,
-0.14018845558166504,
0.0016016410663723946,
-0.05192828178405762,
-0.09171155095100403,
-0.001502276281826198,
0.07381415367126465,
0.0984451174736023,
0.08052003383636475,
-0.016969915479421616,
0.03003048151731491,
-0.03474801778793335,
0.24188180267810822,
-0.046693019568920135,
0.012025197967886925,
0.10355167835950851,
-0.013120725750923157,
0.056232452392578125,
0.09585316479206085,
0.037069566547870636,
-0.09385720640420914,
0.020293472334742546,
0.08262812346220016,
-0.028562339022755623,
-0.22987858951091766,
-0.025194335728883743,
-0.004532423336058855,
-0.08041118830442429,
0.10660703480243683,
0.03183137997984886,
-0.033835507929325104,
0.04643998667597771,
0.019877580925822258,
0.0028281144332140684,
-0.054063793271780014,
0.08179961144924164,
0.07384154200553894,
0.056111857295036316,
0.09982184320688248,
-0.009112720377743244,
-0.028460588306188583,
0.06066161394119263,
0.007190467789769173,
0.2477087527513504,
-0.02382506988942623,
0.10072091966867447,
0.03227335214614868,
0.15026190876960754,
-0.02742813713848591,
0.06626944243907928,
0.0038886922411620617,
-0.009679373353719711,
-0.01445452868938446,
-0.06708047538995743,
-0.0242607444524765,
0.02354622073471546,
-0.045712973922491074,
0.029683474451303482,
-0.08070281893014908,
0.02393912523984909,
0.02804134041070938,
0.27840739488601685,
0.035271722823381424,
-0.2749309837818146,
-0.06608931720256805,
-0.012933557853102684,
-0.042547427117824554,
-0.06439752131700516,
0.005627673584967852,
0.11872388422489166,
-0.1330571323633194,
0.06578907370567322,
-0.07572033256292343,
0.09029319137334824,
-0.03804074227809906,
0.011131970211863518,
0.045843854546546936,
0.15262895822525024,
-0.018526585772633553,
0.05073036998510361,
-0.18557170033454895,
0.24147510528564453,
0.024666104465723038,
0.10904230922460556,
-0.06540003418922424,
0.010449192486703396,
0.019120559096336365,
0.006581815425306559,
0.10987275093793869,
0.0008522681891918182,
-0.06691478192806244,
-0.13824184238910675,
-0.09987321496009827,
0.04625946283340454,
0.1416676789522171,
-0.034400977194309235,
0.09988708794116974,
-0.027875572443008423,
0.011894013732671738,
0.03395295888185501,
-0.030384371057152748,
-0.1573973000049591,
-0.0733533501625061,
0.010229254141449928,
0.02671194262802601,
-0.014646355994045734,
-0.0513669028878212,
-0.10381575673818588,
-0.04195301979780197,
0.11815192550420761,
0.0027564193587750196,
-0.046004604548215866,
-0.1507955640554428,
0.08452200144529343,
0.146275594830513,
-0.05778365209698677,
0.016607264056801796,
0.014764751307666302,
0.11233446002006531,
0.0320916548371315,
-0.08526650071144104,
0.06627144664525986,
-0.05311663821339607,
-0.17334246635437012,
-0.05800341069698334,
0.1201423928141594,
0.08025762438774109,
0.045287180691957474,
0.0017582663567736745,
0.05776466801762581,
0.0008891233592294157,
-0.09685297310352325,
0.035837721079587936,
0.005389383528381586,
0.051567867398262024,
0.02829589694738388,
-0.0853777676820755,
0.07522588968276978,
-0.03480226919054985,
0.01946762204170227,
0.12873464822769165,
0.23083153367042542,
-0.09908635169267654,
0.10251915454864502,
0.07948490977287292,
-0.07644954323768616,
-0.15836948156356812,
0.061480604112148285,
0.12516728043556213,
0.005018298048526049,
0.08318722248077393,
-0.19878245890140533,
0.13378213346004486,
0.10713660717010498,
-0.013334267772734165,
0.021239610388875008,
-0.2715395390987396,
-0.13208909332752228,
0.06653444468975067,
0.10964275151491165,
0.047214657068252563,
-0.12274809926748276,
-0.035249415785074234,
-0.01152511965483427,
-0.12179099768400192,
0.12898555397987366,
-0.07734493166208267,
0.11648885160684586,
-0.020951755344867706,
0.1213105171918869,
0.024618063122034073,
-0.037446971982717514,
0.11213061958551407,
0.07241645455360413,
0.08613365143537521,
-0.04003742337226868,
-0.004720533266663551,
0.06704948842525482,
-0.06237984448671341,
0.03698628395795822,
-0.03634254261851311,
0.06219482421875,
-0.14817547798156738,
0.00648076506331563,
-0.07776428759098053,
0.060070496052503586,
-0.046349138021469116,
-0.0655345618724823,
-0.027855148538947105,
0.04732652008533478,
0.07316963374614716,
-0.03563035652041435,
0.045293405652046204,
0.008793516084551811,
0.09173333644866943,
0.09699300676584244,
0.07331421226263046,
-0.022639764472842216,
-0.08242933452129364,
0.01422884687781334,
0.0043785893358290195,
0.04671351611614227,
-0.08533425629138947,
0.015247552655637264,
0.14626073837280273,
0.06070985645055771,
0.10263718664646149,
0.04546493664383888,
-0.043794531375169754,
0.005097848363220692,
0.016980573534965515,
-0.14122609794139862,
-0.09982508420944214,
0.02852538786828518,
-0.05912083759903908,
-0.15389792621135712,
0.03385663032531738,
0.12138621509075165,
-0.03727427497506142,
-0.016960563138127327,
-0.0071968939155340195,
0.008846103213727474,
-0.01114825252443552,
0.18505725264549255,
0.04287080839276314,
0.054373279213905334,
-0.0916857123374939,
0.11366163939237595,
0.03677797317504883,
-0.042152877897024155,
0.05436941236257553,
0.06788895279169083,
-0.09964878112077713,
0.013409162871539593,
0.0732845887541771,
0.15039180219173431,
-0.06413473933935165,
-0.013329347595572472,
-0.09130685776472092,
-0.07654933631420135,
0.04522421211004257,
0.1452883929014206,
0.05275622010231018,
-0.005207688082009554,
-0.060072559863328934,
0.036435190588235855,
-0.11862315982580185,
0.06786469370126724,
0.05294281244277954,
0.08168663829565048,
-0.10783739387989044,
0.12532387673854828,
-0.006884168833494186,
0.02283874899148941,
-0.02786053717136383,
0.018872268497943878,
-0.10059616714715958,
-0.03449539840221405,
-0.1073734387755394,
-0.015512961894273758,
-0.01930192857980728,
-0.0033648659009486437,
-0.020105386152863503,
-0.07508838921785355,
-0.04279998317360878,
0.03224962577223778,
-0.07658733427524567,
-0.04896465688943863,
0.017090212553739548,
0.03920554369688034,
-0.16117677092552185,
0.0031873316038399935,
0.02576652355492115,
-0.08748999983072281,
0.08705393970012665,
0.06923524290323257,
0.016140103340148926,
0.028340451419353485,
-0.12454245239496231,
-0.03269321098923683,
0.0007295922259800136,
0.010715765878558159,
0.07721380144357681,
-0.09382983297109604,
-0.029016464948654175,
-0.030837107449769974,
0.04918981343507767,
0.014761790633201599,
0.10002005100250244,
-0.11811891198158264,
-0.013947946950793266,
-0.046179499477148056,
-0.03771438077092171,
-0.057342760264873505,
0.026935327798128128,
0.11427950859069824,
0.04343843832612038,
0.15773552656173706,
-0.06892465800046921,
0.05488250032067299,
-0.20500852167606354,
-0.033011797815561295,
0.010663315653800964,
-0.047487515956163406,
-0.07316212356090546,
-0.045086007565259933,
0.084307961165905,
-0.049928631633520126,
0.12306999415159225,
-0.016046978533267975,
0.09278962016105652,
0.04364306852221489,
-0.0015729316510260105,
-0.07178542017936707,
-0.011271283030509949,
0.1846543550491333,
0.057704951614141464,
-0.021706795319914818,
0.11967649310827255,
0.0041313632391393185,
0.0424150675535202,
0.06562824547290802,
0.23236455023288727,
0.15140433609485626,
-0.01224974263459444,
0.07445278018712997,
0.06714604049921036,
-0.07554677873849869,
-0.13951502740383148,
0.12248189002275467,
-0.021711498498916626,
0.10481464117765427,
-0.05289587378501892,
0.1918489933013916,
0.03872223198413849,
-0.1766454428434372,
0.05507589504122734,
-0.02441570535302162,
-0.10811862349510193,
-0.12460999935865402,
-0.018108537420630455,
-0.08169450610876083,
-0.11475399881601334,
0.028259525075554848,
-0.123287133872509,
0.0678190365433693,
0.09508740901947021,
0.007678062655031681,
0.03490441292524338,
0.18458931148052216,
-0.05855691805481911,
0.011090565472841263,
0.07186628878116608,
0.020957300439476967,
-0.004008536692708731,
-0.04100349172949791,
-0.06597316265106201,
0.037665653973817825,
0.04283337667584419,
0.07205107063055038,
-0.051731500774621964,
0.009274226613342762,
0.015172052197158337,
-0.009540324099361897,
-0.07759495079517365,
0.008363412693142891,
0.014142759144306183,
0.04872775822877884,
0.036310527473688126,
0.047522444278001785,
0.008258833549916744,
-0.053881410509347916,
0.27407199144363403,
-0.06774858385324478,
-0.0636899322271347,
-0.12348965555429459,
0.19162721931934357,
0.03438650816679001,
-0.019165586680173874,
0.05578188970685005,
-0.09338720887899399,
-0.01101021096110344,
0.16370277106761932,
0.13545264303684235,
-0.08834148943424225,
-0.02123868837952614,
-0.023290477693080902,
-0.0088788578286767,
-0.013240150175988674,
0.10323993861675262,
0.07160294055938721,
0.0004514638858381659,
-0.0657193586230278,
-0.012425029650330544,
-0.028241155669093132,
-0.048142366111278534,
-0.06287968903779984,
0.05929866433143616,
0.027491627261042595,
-0.007176809478551149,
-0.05747884511947632,
0.06440304964780807,
-0.0031037554144859314,
-0.2364090532064438,
0.03760731965303421,
-0.1727810502052307,
-0.17349298298358917,
-0.014186456799507141,
0.0702594667673111,
0.0029078717343509197,
0.056175608187913895,
-0.005286484491080046,
0.009663810022175312,
0.11558406054973602,
-0.01593758910894394,
-0.014448380097746849,
-0.11786253750324249,
0.10896269232034683,
-0.1093774065375328,
0.21236370503902435,
-0.0019762562587857246,
0.06471572071313858,
0.0992664098739624,
0.036588117480278015,
-0.1347723752260208,
0.019392361864447594,
0.06241032853722572,
-0.12519709765911102,
0.002356364391744137,
0.1455184817314148,
-0.03433641791343689,
0.06020020693540573,
0.030773967504501343,
-0.14933015406131744,
-0.0021936490666121244,
0.027858443558216095,
-0.03712736442685127,
-0.06985677033662796,
-0.007788433227688074,
-0.05495218187570572,
0.16645380854606628,
0.2068076878786087,
-0.02927565574645996,
0.01223505474627018,
-0.08507630974054337,
0.021004701033234596,
0.04859060049057007,
0.05734763666987419,
-0.039560165256261826,
-0.21601133048534393,
0.021495450288057327,
0.07028377801179886,
-0.0022229335736483335,
-0.19377832114696503,
-0.09509687125682831,
0.04157397150993347,
-0.03728793188929558,
-0.04617106914520264,
0.09046480804681778,
0.025379303842782974,
0.03709986433386803,
-0.019064391031861305,
-0.11485963314771652,
-0.027141345664858818,
0.1462976187467575,
-0.1764112412929535,
-0.041901834309101105
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-2
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
SpanBERT (base, cased) fine-tuned for extractive question answering on a few-shot sample of SQuAD with k = 1024 training examples; the `seed-2` suffix presumably identifies the sampled subset (the Trainer seed listed below is 42).
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
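### Reproducing the training setup
The hyperparameters above map one-to-one onto `transformers.TrainingArguments`; a hedged sketch follows (`output_dir` is a placeholder, and the k = 1024 SQuAD sampling, preprocessing, and `Trainer` wiring are omitted):
```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-squad-k1024-seed-2",  # placeholder path
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,               # 10% linear warmup
    num_train_epochs=10.0,
)
```
These argument names are the standard Transformers equivalents of the fields in the card; the card itself does not include the training script.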
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-2
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09537152945995331,
0.11522786319255829,
-0.0023674124386161566,
0.09151331335306168,
0.1189081072807312,
0.022435875609517097,
0.1006641611456871,
0.1284085363149643,
-0.09668318182229996,
0.08684852719306946,
0.08791568130254745,
0.03883175551891327,
0.04760359227657318,
0.1457030177116394,
-0.019375143572688103,
-0.25984492897987366,
0.01072352658957243,
-0.004423913080245256,
-0.03382154926657677,
0.11159126460552216,
0.08617935329675674,
-0.10952968150377274,
0.08611846715211868,
0.015378325246274471,
-0.15407417714595795,
0.01925625093281269,
-0.03667648881673813,
-0.03421207144856453,
0.11318020522594452,
-0.03310689702630043,
0.10843156278133392,
0.025437206029891968,
0.1348515897989273,
-0.21004162728786469,
0.004727584309875965,
0.07272573560476303,
0.04578942805528641,
0.10055487602949142,
0.05172029137611389,
0.01570984721183777,
0.08923088014125824,
-0.15309785306453705,
0.0928201824426651,
0.029408104717731476,
-0.0914284810423851,
-0.13047249615192413,
-0.09688850492238998,
0.027081923559308052,
0.05337892845273018,
0.06805142760276794,
0.001805038540624082,
0.1506359875202179,
-0.05927414074540138,
0.0786842629313469,
0.26761943101882935,
-0.3276502192020416,
-0.06417758017778397,
0.03266332671046257,
0.061119645833969116,
0.053718626499176025,
-0.12186717987060547,
-0.0062570697627961636,
0.027482418343424797,
0.029323549941182137,
0.11779041588306427,
-0.016850057989358902,
-0.11172528564929962,
-0.013465107418596745,
-0.1284620612859726,
-0.002034260891377926,
0.0720604658126831,
0.03571267053484917,
-0.052808523178100586,
-0.09468313306570053,
-0.07592093199491501,
-0.09281162917613983,
-0.024174416437745094,
-0.06583026051521301,
0.056586772203445435,
-0.05385099723935127,
-0.07996463030576706,
-0.03828735277056694,
-0.057347241789102554,
-0.0765811875462532,
-0.018091602250933647,
0.15722282230854034,
0.04004046693444252,
0.021168576553463936,
-0.03208121284842491,
0.10934635996818542,
0.0030678538605570793,
-0.14154645800590515,
-0.015610688365995884,
-0.0009793530916795135,
-0.09699046611785889,
-0.04691849276423454,
-0.05057539418339729,
-0.0172871183604002,
0.010287817567586899,
0.17744562029838562,
-0.0797448605298996,
0.07514434307813644,
0.010013207793235779,
-0.028540020808577538,
-0.006056307815015316,
0.14809535443782806,
-0.04276123270392418,
-0.046041183173656464,
-0.010074586607515812,
0.07334291934967041,
0.0033940968569368124,
-0.01490476168692112,
-0.06547844409942627,
-0.028017297387123108,
0.10355070233345032,
0.046939242631196976,
-0.05927489697933197,
0.03835679590702057,
-0.02387099713087082,
-0.028089744970202446,
0.01677061803638935,
-0.11572354286909103,
0.04462360218167305,
-0.0019207886653020978,
-0.0836344063282013,
-0.0025172012392431498,
0.0009563873754814267,
-0.0044708168134093285,
-0.007321964018046856,
0.10890552401542664,
-0.09840328246355057,
-0.0020777147728949785,
-0.06283485889434814,
-0.08208674192428589,
0.009176990017294884,
-0.1555469036102295,
-0.016215553507208824,
-0.05779077485203743,
-0.17020869255065918,
-0.03174523264169693,
0.03703991696238518,
-0.07457821816205978,
-0.010607539676129818,
-0.04873814061284065,
-0.06494689732789993,
0.02508053183555603,
-0.014526808634400368,
0.1738819032907486,
-0.053185224533081055,
0.07269660383462906,
-0.0010953373275697231,
0.04626210033893585,
0.01476144790649414,
0.03513211011886597,
-0.10383866727352142,
0.02621135674417019,
-0.1365681290626526,
0.06918410956859589,
-0.08372204005718231,
-0.003747009439393878,
-0.1337445080280304,
-0.09747519344091415,
0.010158007964491844,
-0.02283298224210739,
0.09006386995315552,
0.13852359354496002,
-0.19340050220489502,
-0.01725887879729271,
0.12805843353271484,
-0.07606134563684464,
-0.0638587474822998,
0.06265906244516373,
-0.06112358346581459,
0.032193977385759354,
0.05146513134241104,
0.21024034917354584,
0.03964880108833313,
-0.16780665516853333,
-0.031552642583847046,
-0.005575479939579964,
0.03956415504217148,
0.02728271484375,
0.04130621999502182,
0.0035669743083417416,
0.06373708695173264,
0.013748357072472572,
-0.07699760049581528,
-0.03275948762893677,
-0.09171165525913239,
-0.06593754142522812,
-0.05513029173016548,
-0.07226239144802094,
0.04151234030723572,
0.002101401798427105,
0.04249276593327522,
-0.06440328061580658,
-0.10111576318740845,
0.1199679747223854,
0.09720493108034134,
-0.04770669341087341,
0.037564340978860855,
-0.07940937578678131,
0.020206129178404808,
-0.02083873189985752,
-0.03968749940395355,
-0.2059641182422638,
-0.1295323222875595,
0.052824776619672775,
-0.057807233184576035,
0.033481888473033905,
0.007667251862585545,
0.08101174235343933,
0.06128619611263275,
-0.04304657131433487,
-0.012188727967441082,
-0.09407373517751694,
0.003152415156364441,
-0.11752624809741974,
-0.18830056488513947,
-0.07804398983716965,
-0.04018646478652954,
0.09525101631879807,
-0.17435629665851593,
-0.0069603584706783295,
0.015163357369601727,
0.1441309005022049,
0.02723792940378189,
-0.06856735050678253,
-0.003893190296366811,
0.0364711694419384,
0.0018699439242482185,
-0.09537900984287262,
0.044683028012514114,
0.008536753244698048,
-0.0935157984495163,
-0.06208283454179764,
-0.13560549914836884,
-0.012032631784677505,
0.05893433466553688,
0.053426750004291534,
-0.09688419848680496,
-0.04614819586277008,
-0.07103581726551056,
-0.04055601358413696,
-0.07608165591955185,
0.01214717049151659,
0.20026373863220215,
0.03509090840816498,
0.11291942745447159,
-0.06725558638572693,
-0.07810628414154053,
-0.003638908267021179,
0.02129870094358921,
0.011942229233682156,
0.07624421268701553,
0.04061175137758255,
-0.05511137843132019,
0.07380270212888718,
0.09967520833015442,
-0.022163718938827515,
0.12324646860361099,
-0.04650265723466873,
-0.08414339274168015,
-0.035093311220407486,
-0.022725874558091164,
-0.028214672580361366,
0.12283487617969513,
-0.039264023303985596,
0.0056982459500432014,
0.03649413213133812,
0.04486491158604622,
0.01686755195260048,
-0.1624891757965088,
0.008348883129656315,
0.022279253229498863,
-0.054209765046834946,
-0.03519504517316818,
-0.0011631408706307411,
0.02720082737505436,
0.0919867753982544,
0.031926218420267105,
-0.012922864407300949,
0.003505147062242031,
-0.012071473523974419,
-0.062353748828172684,
0.18382160365581512,
-0.09802639484405518,
-0.08539586514234543,
-0.0768265426158905,
0.0057265316136181355,
-0.05869238078594208,
-0.03559267893433571,
0.016520541161298752,
-0.08611823618412018,
-0.03862028196454048,
-0.08754406124353409,
-0.017792657017707825,
-0.0191205944865942,
0.020767681300640106,
0.03275309130549431,
-0.02209819294512272,
0.08134579658508301,
-0.13945433497428894,
0.0012399767292663455,
-0.05153898522257805,
-0.09312644600868225,
-0.0007829934475012124,
0.0746997520327568,
0.0979100838303566,
0.0795753076672554,
-0.01627679541707039,
0.02976500615477562,
-0.03384603559970856,
0.2428482323884964,
-0.045291975140571594,
0.011005695909261703,
0.10421228408813477,
-0.014701214618980885,
0.05711044371128082,
0.09533049911260605,
0.03679579496383667,
-0.09389398992061615,
0.020617740228772163,
0.08205817639827728,
-0.029089948162436485,
-0.22968338429927826,
-0.025782886892557144,
-0.003963626455515623,
-0.07992992550134659,
0.10613354295492172,
0.03194580227136612,
-0.03779718279838562,
0.0452701561152935,
0.020722443237900734,
0.0016799949808046222,
-0.054801035672426224,
0.08197027444839478,
0.07557584345340729,
0.056265175342559814,
0.09942799806594849,
-0.008672242052853107,
-0.028380680829286575,
0.06230245158076286,
0.007730789016932249,
0.24712979793548584,
-0.024025673046708107,
0.1000915914773941,
0.03292198106646538,
0.15197975933551788,
-0.02721567265689373,
0.06418277323246002,
0.004091322887688875,
-0.008944697678089142,
-0.015106228180229664,
-0.0670945942401886,
-0.025248060002923012,
0.02424168586730957,
-0.04517089203000069,
0.02990126982331276,
-0.08176358044147491,
0.02638229727745056,
0.02770962379872799,
0.2800525724887848,
0.03493005782365799,
-0.27274689078330994,
-0.0657491385936737,
-0.012287931516766548,
-0.04232519865036011,
-0.06392890959978104,
0.005712657701224089,
0.1206023246049881,
-0.13402007520198822,
0.0641203448176384,
-0.07534389197826385,
0.08948948234319687,
-0.03862487152218819,
0.010471731424331665,
0.04510798677802086,
0.151736781001091,
-0.01737850345671177,
0.05100656673312187,
-0.18361608684062958,
0.2421043962240219,
0.024509062990546227,
0.10714725404977798,
-0.06370336562395096,
0.010675077326595783,
0.018407732248306274,
0.0069021545350551605,
0.11007858067750931,
0.0018711434677243233,
-0.0681585744023323,
-0.13803963363170624,
-0.1010977029800415,
0.04598475620150566,
0.1422237604856491,
-0.035807497799396515,
0.09927324205636978,
-0.028684768825769424,
0.012755471281707287,
0.0334952138364315,
-0.028617074713110924,
-0.15668915212154388,
-0.0726025402545929,
0.010448778979480267,
0.025875752791762352,
-0.01516345888376236,
-0.052490267902612686,
-0.10394182056188583,
-0.03832893446087837,
0.12018182128667831,
0.002358557190746069,
-0.04623871669173241,
-0.15017864108085632,
0.08412006497383118,
0.14581076800823212,
-0.058863695710897446,
0.01572081819176674,
0.014144734479486942,
0.1124909445643425,
0.032482899725437164,
-0.08545602113008499,
0.066264308989048,
-0.052935704588890076,
-0.1750272661447525,
-0.058110278099775314,
0.12071224302053452,
0.07970338314771652,
0.045520782470703125,
0.0016189217567443848,
0.057439886033535004,
0.0019606323912739754,
-0.09688818454742432,
0.03697141632437706,
0.005866435822099447,
0.051329247653484344,
0.028414040803909302,
-0.08513349294662476,
0.0782322809100151,
-0.03446858376264572,
0.0175543911755085,
0.1306374967098236,
0.23487278819084167,
-0.09975562989711761,
0.10413111001253128,
0.07902128249406815,
-0.07693260908126831,
-0.158255934715271,
0.05950519070029259,
0.12674854695796967,
0.003964267671108246,
0.0848328173160553,
-0.1990954577922821,
0.13335345685482025,
0.1071438416838646,
-0.014596676453948021,
0.02014963887631893,
-0.2726287245750427,
-0.13224458694458008,
0.06513106822967529,
0.10954294353723526,
0.05059703066945076,
-0.12232144176959991,
-0.03616506978869438,
-0.010269462130963802,
-0.12193238735198975,
0.1284090131521225,
-0.07530135661363602,
0.11691273003816605,
-0.021155469119548798,
0.12245280295610428,
0.024746686220169067,
-0.036611929535865784,
0.11450853198766708,
0.07095754146575928,
0.08460047096014023,
-0.039758119732141495,
-0.003817156422883272,
0.06487591564655304,
-0.06305240094661713,
0.03590167686343193,
-0.03583729639649391,
0.06291007995605469,
-0.14891210198402405,
0.006493810564279556,
-0.07744639366865158,
0.05976345017552376,
-0.047100458294153214,
-0.06520679593086243,
-0.02795107290148735,
0.04674875736236572,
0.07338003069162369,
-0.03534592688083649,
0.04497446492314339,
0.009480264037847519,
0.09019305557012558,
0.1016479879617691,
0.0721774697303772,
-0.021233657374978065,
-0.08305364102125168,
0.01283068023622036,
0.00402118731290102,
0.04671959951519966,
-0.08443524688482285,
0.01654953695833683,
0.14541606605052948,
0.05985216423869133,
0.10268911719322205,
0.04545527324080467,
-0.04428669810295105,
0.006196765694767237,
0.016837215051054955,
-0.14288146793842316,
-0.10075518488883972,
0.02788730338215828,
-0.05814502760767937,
-0.15423542261123657,
0.03200370445847511,
0.12409103661775589,
-0.03650215268135071,
-0.016791801899671555,
-0.0069403876550495625,
0.009610538370907307,
-0.011826816014945507,
0.18334214389324188,
0.04185981675982475,
0.05481581762433052,
-0.090377077460289,
0.11374612897634506,
0.03665408119559288,
-0.04042666405439377,
0.05411496385931969,
0.06730931997299194,
-0.09872697293758392,
0.014019225724041462,
0.07266446202993393,
0.14957626163959503,
-0.06701455265283585,
-0.012717798352241516,
-0.09115651249885559,
-0.076540008187294,
0.04430265724658966,
0.14403332769870758,
0.053544364869594574,
-0.004452838562428951,
-0.06002368777990341,
0.036091383546590805,
-0.11744143813848495,
0.06818883121013641,
0.05362345278263092,
0.08191383630037308,
-0.10874117910861969,
0.12529459595680237,
-0.0074400040321052074,
0.024522168561816216,
-0.02812589891254902,
0.017897579818964005,
-0.10035132616758347,
-0.03457850590348244,
-0.10917993634939194,
-0.013357988558709621,
-0.01743483543395996,
-0.003542493563145399,
-0.019089415669441223,
-0.07609665393829346,
-0.042773496359586716,
0.03320744261145592,
-0.07660995423793793,
-0.0490536205470562,
0.0166569072753191,
0.03991961479187012,
-0.16106060147285461,
0.0024062544107437134,
0.026976019144058228,
-0.08819588273763657,
0.08843956887722015,
0.06996705383062363,
0.016302864998579025,
0.02781936153769493,
-0.1230948343873024,
-0.03298802301287651,
0.0008303190697915852,
0.010578887537121773,
0.07676902413368225,
-0.09514041990041733,
-0.03021899238228798,
-0.03049432300031185,
0.04873894527554512,
0.015010270290076733,
0.10324862599372864,
-0.11896590143442154,
-0.013545344583690166,
-0.04650135710835457,
-0.03905094787478447,
-0.057129375636577606,
0.02571115642786026,
0.11358010023832321,
0.04571467265486717,
0.15736576914787292,
-0.0703035295009613,
0.055294815450906754,
-0.2044670283794403,
-0.032595328986644745,
0.010636523365974426,
-0.04566558450460434,
-0.07390214502811432,
-0.045174937695264816,
0.08308587968349457,
-0.05009450018405914,
0.1218709796667099,
-0.016494523733854294,
0.09127447009086609,
0.043980106711387634,
-0.004016538616269827,
-0.07011979818344116,
-0.011787589639425278,
0.1835806518793106,
0.05763944983482361,
-0.021033337339758873,
0.12017758190631866,
0.0026107714511454105,
0.043054427951574326,
0.06539781391620636,
0.23428183794021606,
0.15126098692417145,
-0.011855191551148891,
0.0745588168501854,
0.06686023622751236,
-0.07466766983270645,
-0.14146466553211212,
0.12114766240119934,
-0.02073172852396965,
0.10513101518154144,
-0.05185164883732796,
0.1899210661649704,
0.03934028372168541,
-0.1771068125963211,
0.05341821163892746,
-0.024417860433459282,
-0.1080889105796814,
-0.12617450952529907,
-0.01657763309776783,
-0.08254808932542801,
-0.11539068073034286,
0.02794845588505268,
-0.1231381967663765,
0.06893002241849899,
0.09464026987552643,
0.006709713954478502,
0.035662971436977386,
0.18198469281196594,
-0.057835791260004044,
0.011039024218916893,
0.07108569145202637,
0.02095491625368595,
-0.003184927860274911,
-0.03929128125309944,
-0.0670294389128685,
0.03720281645655632,
0.04496787488460541,
0.0713290199637413,
-0.05013596639037132,
0.010523884557187557,
0.014236249029636383,
-0.01072878297418356,
-0.07814337313175201,
0.00791256781667471,
0.014415471814572811,
0.048709869384765625,
0.035274688154459,
0.04778303951025009,
0.00859216321259737,
-0.053331222385168076,
0.2754289209842682,
-0.06751391291618347,
-0.06303989142179489,
-0.12302717566490173,
0.19377999007701874,
0.032486189156770706,
-0.01846833899617195,
0.05651307478547096,
-0.09293632954359055,
-0.012845118530094624,
0.16179896891117096,
0.13449610769748688,
-0.09095559269189835,
-0.02095208317041397,
-0.023738201707601547,
-0.008820557966828346,
-0.012355263344943523,
0.1043679490685463,
0.07108056545257568,
0.0016597093781456351,
-0.06666329503059387,
-0.013135312125086784,
-0.030005479231476784,
-0.04684159904718399,
-0.06398794054985046,
0.05948108434677124,
0.026180796325206757,
-0.005694949068129063,
-0.05875154212117195,
0.06309767067432404,
-0.0034975316375494003,
-0.23576746881008148,
0.037594061344861984,
-0.17238734662532806,
-0.1737850308418274,
-0.012994285672903061,
0.07099970430135727,
0.001002514734864235,
0.05613124743103981,
-0.006832482758909464,
0.009464162401854992,
0.1164155974984169,
-0.016497429460287094,
-0.014615094289183617,
-0.1163015067577362,
0.10977046191692352,
-0.10916221141815186,
0.21337740123271942,
-0.0008279558969661593,
0.06534311920404434,
0.09867575019598007,
0.03738902136683464,
-0.135698601603508,
0.018261343240737915,
0.06232991814613342,
-0.12546472251415253,
0.0016272899229079485,
0.14624238014221191,
-0.03450412675738335,
0.06174150109291077,
0.032320473343133926,
-0.14908789098262787,
-0.004009720403701067,
0.02879532426595688,
-0.03726779296994209,
-0.06899265199899673,
-0.010557841509580612,
-0.056852784007787704,
0.16537627577781677,
0.20591653883457184,
-0.02972402237355709,
0.01253075897693634,
-0.0843191146850586,
0.021719040349125862,
0.04858938977122307,
0.05890055373311043,
-0.0385332889854908,
-0.21614082157611847,
0.021423233672976494,
0.0720965787768364,
-0.0019958389457315207,
-0.19460469484329224,
-0.09710768610239029,
0.041994236409664154,
-0.03593037277460098,
-0.04613456502556801,
0.09174981713294983,
0.023997275158762932,
0.0364212729036808,
-0.018581481650471687,
-0.11623378843069077,
-0.027848944067955017,
0.14572682976722717,
-0.17581817507743835,
-0.0421922393143177
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
SpanBERT (base, cased) fine-tuned for extractive question answering on a few-shot sample of SQuAD with k = 1024 training examples; the `seed-4` suffix presumably identifies the sampled subset (the Trainer seed listed below is 42).
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
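## Evaluation sketch
The card lists no results; one way to produce some would be to score the checkpoint on the SQuAD validation split. A hedged sketch (assumes the repository name above resolves on the Hub; only a small sample is used for speed):
```python
# Hedged sketch: compare predictions against gold answers on a few
# SQuAD validation examples.
from datasets import load_dataset
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4",
)

validation = load_dataset("squad", split="validation[:16]")  # small sample

for example in validation:
    prediction = qa(question=example["question"], context=example["context"])
    print(prediction["answer"], "| gold:", example["answers"]["text"][0])
```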
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09596126526594162,
0.1143331527709961,
-0.002329719951376319,
0.09232183545827866,
0.11995401233434677,
0.02276361919939518,
0.10132038593292236,
0.1276458501815796,
-0.09687690436840057,
0.08633413165807724,
0.08798981457948685,
0.038543857634067535,
0.04693625494837761,
0.14547991752624512,
-0.01917804218828678,
-0.25983333587646484,
0.01051914133131504,
-0.0038871292490512133,
-0.033146731555461884,
0.11173943430185318,
0.08520353585481644,
-0.11010416597127914,
0.08663345128297806,
0.01511739008128643,
-0.1547684222459793,
0.019711658358573914,
-0.03735457360744476,
-0.034119393676519394,
0.11336778849363327,
-0.033082831650972366,
0.1087113544344902,
0.02515527978539467,
0.1347205936908722,
-0.20862558484077454,
0.004966778680682182,
0.07266971468925476,
0.04521679878234863,
0.100022092461586,
0.051377762109041214,
0.015478132292628288,
0.08836019039154053,
-0.15303170680999756,
0.09292130172252655,
0.02915908209979534,
-0.09161300212144852,
-0.1308704912662506,
-0.09623943269252777,
0.025993207469582558,
0.052734971046447754,
0.06903070956468582,
0.001257681054994464,
0.1499153971672058,
-0.05982646346092224,
0.07893817871809006,
0.2652408182621002,
-0.32876014709472656,
-0.06453162431716919,
0.03296501189470291,
0.061390120536088943,
0.05380171164870262,
-0.12266981601715088,
-0.0056751323863863945,
0.027579238638281822,
0.03024299442768097,
0.11820436269044876,
-0.017142251133918762,
-0.111529141664505,
-0.01303855050355196,
-0.1283216029405594,
-0.00046201684745028615,
0.07265476882457733,
0.0359392948448658,
-0.052547577768564224,
-0.09544926881790161,
-0.0748697817325592,
-0.09408663958311081,
-0.024604152888059616,
-0.06502678245306015,
0.05659610033035278,
-0.05483013391494751,
-0.08054061233997345,
-0.03637227788567543,
-0.057202018797397614,
-0.0755976140499115,
-0.017917964607477188,
0.15585701167583466,
0.039948802441358566,
0.020883161574602127,
-0.03168986737728119,
0.10882922261953354,
0.002964173210784793,
-0.1412876546382904,
-0.014507031068205833,
-0.0016646116273477674,
-0.09683430194854736,
-0.046935439109802246,
-0.051248129457235336,
-0.01561307068914175,
0.010458573698997498,
0.1776067018508911,
-0.07990594953298569,
0.07582991570234299,
0.01113746128976345,
-0.029386891052126884,
-0.005934050772339106,
0.1468154788017273,
-0.04423222318291664,
-0.04769732803106308,
-0.010401318781077862,
0.0738644078373909,
0.002557690953835845,
-0.014949328266084194,
-0.06548299640417099,
-0.02701386623084545,
0.1025664284825325,
0.046267323195934296,
-0.06044115871191025,
0.04013146832585335,
-0.022899288684129715,
-0.027826381847262383,
0.017178496345877647,
-0.11553626507520676,
0.04431961476802826,
-0.0019009154057130218,
-0.08390256017446518,
-0.0017585369059816003,
0.00018998501764144748,
-0.005320255178958178,
-0.007724236696958542,
0.10917358845472336,
-0.09932518750429153,
-0.0029088736046105623,
-0.06325092166662216,
-0.08312738686800003,
0.008605302311480045,
-0.15537849068641663,
-0.01524820365011692,
-0.057356275618076324,
-0.17058904469013214,
-0.03325209021568298,
0.036873724311590195,
-0.07446126639842987,
-0.009585481137037277,
-0.04882624372839928,
-0.06558295339345932,
0.024639161303639412,
-0.014550723135471344,
0.17446112632751465,
-0.053606320172548294,
0.07169510424137115,
0.00004513439489528537,
0.046194493770599365,
0.014455415308475494,
0.035197507590055466,
-0.1036529690027237,
0.025527501478791237,
-0.13648073375225067,
0.06860251724720001,
-0.08427383005619049,
-0.0020595428068190813,
-0.13258004188537598,
-0.09864386916160583,
0.010899519547820091,
-0.021715687587857246,
0.08958160132169724,
0.13831259310245514,
-0.1930670291185379,
-0.017523860558867455,
0.12747091054916382,
-0.07506698369979858,
-0.06398197263479233,
0.062398407608270645,
-0.061355046927928925,
0.03099150024354458,
0.0521574392914772,
0.21027104556560516,
0.03988534212112427,
-0.16723212599754333,
-0.03281866014003754,
-0.0064790151081979275,
0.039929747581481934,
0.026634858921170235,
0.03988917917013168,
0.004931105300784111,
0.06401815265417099,
0.014426537789404392,
-0.07599365711212158,
-0.03274446725845337,
-0.09117808938026428,
-0.06553104519844055,
-0.05425507202744484,
-0.0721784383058548,
0.04109122231602669,
0.0026701989118009806,
0.04251322150230408,
-0.0654325857758522,
-0.10160446166992188,
0.11946048587560654,
0.0969405248761177,
-0.04824817553162575,
0.036131035536527634,
-0.07979079335927963,
0.020745975896716118,
-0.020863067358732224,
-0.03916085883975029,
-0.20550942420959473,
-0.13066406548023224,
0.05176985263824463,
-0.05534984543919563,
0.033381007611751556,
0.008078006096184254,
0.0818040519952774,
0.06160900369286537,
-0.04345274716615677,
-0.012669188901782036,
-0.09298579394817352,
0.0034862328320741653,
-0.11726479977369308,
-0.18874329328536987,
-0.07851462066173553,
-0.040041234344244,
0.09569712728261948,
-0.1755329966545105,
-0.006564476061612368,
0.015501696616411209,
0.14331306517124176,
0.026808666065335274,
-0.06812480837106705,
-0.0029889759607613087,
0.037750374525785446,
0.0027817802038043737,
-0.09520403295755386,
0.04529449716210365,
0.008410455659031868,
-0.09243407845497131,
-0.06313011050224304,
-0.13602231442928314,
-0.01037169061601162,
0.06003110110759735,
0.0527057908475399,
-0.09726153314113617,
-0.04612147808074951,
-0.07078717648983002,
-0.04056550934910774,
-0.07442990690469742,
0.012137874029576778,
0.20121213793754578,
0.03467699512839317,
0.11264032870531082,
-0.06640548259019852,
-0.0776980072259903,
-0.003535515395924449,
0.022696318104863167,
0.0129954032599926,
0.07609578967094421,
0.03947287052869797,
-0.05300580710172653,
0.07410498708486557,
0.09874220192432404,
-0.022518252953886986,
0.12405296415090561,
-0.04672912880778313,
-0.08362749963998795,
-0.034456875175237656,
-0.023721659556031227,
-0.02872340939939022,
0.12358049303293228,
-0.03971223905682564,
0.004412135109305382,
0.03616567701101303,
0.04419945925474167,
0.01719474047422409,
-0.1616497039794922,
0.008326453156769276,
0.021543167531490326,
-0.053215689957141876,
-0.03633498400449753,
-0.0007871789857745171,
0.026573091745376587,
0.09157931059598923,
0.03172881156206131,
-0.014223050326108932,
0.0033659343607723713,
-0.011907501146197319,
-0.0617372989654541,
0.18433170020580292,
-0.09813131392002106,
-0.08428121358156204,
-0.07661177963018417,
0.004064805340021849,
-0.0596722736954689,
-0.036012209951877594,
0.01608293130993843,
-0.08756376802921295,
-0.03878806531429291,
-0.0872238501906395,
-0.01879681833088398,
-0.01879953406751156,
0.020614590495824814,
0.031918346881866455,
-0.02241511270403862,
0.08081948757171631,
-0.13914401829242706,
0.0014883343828842044,
-0.05179676413536072,
-0.09313903003931046,
-0.00018238552729599178,
0.0750017836689949,
0.09820375591516495,
0.07941416651010513,
-0.016034167259931564,
0.02980506792664528,
-0.03403031826019287,
0.2435649037361145,
-0.04544690623879433,
0.0111238406971097,
0.10407253354787827,
-0.013868413865566254,
0.055965714156627655,
0.0953805074095726,
0.037721000611782074,
-0.09429676830768585,
0.020430535078048706,
0.0825016126036644,
-0.02861892245709896,
-0.2294429987668991,
-0.02518756315112114,
-0.004313449375331402,
-0.07992294430732727,
0.10606071352958679,
0.03189126029610634,
-0.03727889433503151,
0.04603647068142891,
0.021544622257351875,
0.002587200840935111,
-0.05505082383751869,
0.08149431645870209,
0.07694397866725922,
0.055734988301992416,
0.10042832046747208,
-0.00876685231924057,
-0.028672227635979652,
0.061502620577812195,
0.007949932478368282,
0.24772433936595917,
-0.024340001866221428,
0.0997902974486351,
0.03345414996147156,
0.15140531957149506,
-0.02671985700726509,
0.06499704718589783,
0.003580323187634349,
-0.009597968310117722,
-0.014853518456220627,
-0.06691650301218033,
-0.024057133123278618,
0.023349057883024216,
-0.04621275141835213,
0.02931530773639679,
-0.0818491131067276,
0.024868281558156013,
0.02786167524755001,
0.27961114048957825,
0.0349104069173336,
-0.27526816725730896,
-0.06657631695270538,
-0.01307192724198103,
-0.04196356609463692,
-0.06347287446260452,
0.005902323871850967,
0.12010128796100616,
-0.1333867758512497,
0.06484667956829071,
-0.0754135325551033,
0.08913761377334595,
-0.03871801495552063,
0.011421076953411102,
0.04710124805569649,
0.15273532271385193,
-0.018241677433252335,
0.05059095844626427,
-0.18403922021389008,
0.2409457415342331,
0.024622783064842224,
0.10789231956005096,
-0.06388410180807114,
0.010555031709372997,
0.019291384145617485,
0.008147495798766613,
0.1097169741988182,
0.0012418740661814809,
-0.06776823848485947,
-0.1378914713859558,
-0.10023844987154007,
0.04719002544879913,
0.14132508635520935,
-0.03487919643521309,
0.10008452087640762,
-0.02778276428580284,
0.012363585643470287,
0.03317135572433472,
-0.029660912230610847,
-0.15682610869407654,
-0.07317664474248886,
0.009736011736094952,
0.027078254148364067,
-0.014722759835422039,
-0.05181373655796051,
-0.10444246232509613,
-0.03973587229847908,
0.11947011947631836,
0.0036652113776654005,
-0.0464598573744297,
-0.15070325136184692,
0.08346021920442581,
0.14597001671791077,
-0.0577610544860363,
0.015664802864193916,
0.014915168285369873,
0.11160849779844284,
0.03345461189746857,
-0.08513401448726654,
0.06631605327129364,
-0.05361340194940567,
-0.1737956702709198,
-0.05808984115719795,
0.11986076086759567,
0.07941433787345886,
0.04479377716779709,
0.0013875743607059121,
0.05732147395610809,
0.0011415499029681087,
-0.09747403860092163,
0.03730057552456856,
0.003953092265874147,
0.05222562327980995,
0.027997365221381187,
-0.08591341227293015,
0.0780654177069664,
-0.03385843709111214,
0.018511207774281502,
0.12880352139472961,
0.23208928108215332,
-0.09899970144033432,
0.10221922397613525,
0.0794612318277359,
-0.07650747895240784,
-0.15784652531147003,
0.060474321246147156,
0.12564387917518616,
0.00471431203186512,
0.08340564370155334,
-0.19993208348751068,
0.1344640851020813,
0.10658686608076096,
-0.013594293966889381,
0.021997250616550446,
-0.2699555456638336,
-0.1315331608057022,
0.06548555195331573,
0.11026512086391449,
0.05141165107488632,
-0.1227010041475296,
-0.03555232286453247,
-0.010356953367590904,
-0.12134116142988205,
0.12762786448001862,
-0.07789695262908936,
0.11698747426271439,
-0.02149706520140171,
0.12283752858638763,
0.024024397134780884,
-0.036677949130535126,
0.11362010985612869,
0.07182581722736359,
0.0856102705001831,
-0.0398876927793026,
-0.003079627873376012,
0.06481841206550598,
-0.062364108860492706,
0.03583380952477455,
-0.03690223768353462,
0.06232098117470741,
-0.1492937207221985,
0.006492446176707745,
-0.07828296720981598,
0.059505611658096313,
-0.046887192875146866,
-0.06499078869819641,
-0.027021687477827072,
0.04706646874547005,
0.07231899350881577,
-0.03547220304608345,
0.04362206161022186,
0.008540170267224312,
0.09058196097612381,
0.1004130095243454,
0.07283755391836166,
-0.022418741136789322,
-0.08388479799032211,
0.013986410573124886,
0.0036498040426522493,
0.046697650104761124,
-0.08404338359832764,
0.015263602137565613,
0.14612317085266113,
0.0591457299888134,
0.10239855945110321,
0.04627217352390289,
-0.04332051798701286,
0.005710342898964882,
0.017897913232445717,
-0.14301088452339172,
-0.09957325458526611,
0.02808302454650402,
-0.06156246364116669,
-0.1540113240480423,
0.03307672590017319,
0.12398099899291992,
-0.03636815398931503,
-0.01657278463244438,
-0.006915832869708538,
0.008969293907284737,
-0.011977249756455421,
0.18454225361347198,
0.042002446949481964,
0.05440301448106766,
-0.09146088361740112,
0.1131364032626152,
0.036548782140016556,
-0.04105493798851967,
0.0540541373193264,
0.06802644580602646,
-0.09971363842487335,
0.013057519681751728,
0.07183147221803665,
0.15089847147464752,
-0.06590919196605682,
-0.013590668328106403,
-0.09240183234214783,
-0.07701028138399124,
0.04432080313563347,
0.14287374913692474,
0.05329762026667595,
-0.005176045000553131,
-0.06035948544740677,
0.03559105098247528,
-0.11807172000408173,
0.06778419017791748,
0.052838992327451706,
0.08216691762208939,
-0.10881125926971436,
0.12464533001184464,
-0.007519882172346115,
0.023747015744447708,
-0.028134047985076904,
0.018424106761813164,
-0.10104840993881226,
-0.03464164212346077,
-0.10926461964845657,
-0.014267992228269577,
-0.017829162999987602,
-0.003025304526090622,
-0.019393321126699448,
-0.0748099833726883,
-0.043421823531389236,
0.0332174189388752,
-0.07689507305622101,
-0.04844609647989273,
0.018284987658262253,
0.04050762206315994,
-0.1604376584291458,
0.002786741591989994,
0.025813136249780655,
-0.08813635259866714,
0.08826640993356705,
0.06991613656282425,
0.016078930348157883,
0.02819424867630005,
-0.12149621546268463,
-0.03351833298802376,
0.00036739165079779923,
0.00986559595912695,
0.07729997485876083,
-0.09440791606903076,
-0.029581744223833084,
-0.030689066275954247,
0.048995643854141235,
0.015244247391819954,
0.102435402572155,
-0.1181907132267952,
-0.013198906555771828,
-0.04571767896413803,
-0.037793245166540146,
-0.057605646550655365,
0.026199132204055786,
0.11428701877593994,
0.0451384000480175,
0.1578049212694168,
-0.07031004875898361,
0.0544419027864933,
-0.2047322690486908,
-0.03295276686549187,
0.010495468974113464,
-0.04695760831236839,
-0.07319019734859467,
-0.045163121074438095,
0.08381794393062592,
-0.050624433904886246,
0.12338085472583771,
-0.01672869175672531,
0.09168501943349838,
0.043445490300655365,
-0.0037739980034530163,
-0.07119008153676987,
-0.01120741106569767,
0.18332606554031372,
0.05720141902565956,
-0.021702652797102928,
0.11992364376783371,
0.003811078378930688,
0.042235273867845535,
0.0666567012667656,
0.2325117588043213,
0.15041834115982056,
-0.011153262108564377,
0.07477675378322601,
0.06754112243652344,
-0.07512501627206802,
-0.14022955298423767,
0.1220535933971405,
-0.02070641703903675,
0.10574483871459961,
-0.05272252857685089,
0.18877346813678741,
0.03857462480664253,
-0.17625866830348969,
0.05381551384925842,
-0.02525276504456997,
-0.10824910551309586,
-0.1249711737036705,
-0.015693742781877518,
-0.08189234137535095,
-0.11599445343017578,
0.028088320046663284,
-0.1236438900232315,
0.06756597757339478,
0.09574401378631592,
0.00750691955909133,
0.035179540514945984,
0.1835053563117981,
-0.05673655495047569,
0.011702210642397404,
0.07140899449586868,
0.020427772775292397,
-0.003521523205563426,
-0.0390792042016983,
-0.06624434888362885,
0.036643195897340775,
0.043478768318891525,
0.07087808847427368,
-0.051051583141088486,
0.009855539537966251,
0.014913653954863548,
-0.009934957139194012,
-0.07772476971149445,
0.007700514979660511,
0.014660494402050972,
0.04842457175254822,
0.0362536646425724,
0.04732527211308479,
0.007761369459331036,
-0.05349385738372803,
0.27434855699539185,
-0.06751198321580887,
-0.06179691106081009,
-0.12385555356740952,
0.1934623420238495,
0.03234100341796875,
-0.019058678299188614,
0.05561412498354912,
-0.09205586463212967,
-0.011738892644643784,
0.16305387020111084,
0.13636481761932373,
-0.09097540378570557,
-0.021297579631209373,
-0.023259064182639122,
-0.0088167330250144,
-0.012523045763373375,
0.1047237291932106,
0.071053147315979,
0.0011955132940784097,
-0.06668560206890106,
-0.013032066635787487,
-0.029397640377283096,
-0.046811893582344055,
-0.06359818577766418,
0.05895936116576195,
0.027329137548804283,
-0.006066365633159876,
-0.05809426307678223,
0.06329743564128876,
-0.003819589503109455,
-0.23558297753334045,
0.03851013258099556,
-0.17183572053909302,
-0.17422127723693848,
-0.01410029549151659,
0.070277139544487,
0.0017772603314369917,
0.05689642205834389,
-0.006720739882439375,
0.009276424534618855,
0.11697383970022202,
-0.016752813011407852,
-0.01376529224216938,
-0.11709676682949066,
0.10990546643733978,
-0.1088789701461792,
0.21214768290519714,
-0.0015199299668893218,
0.0653330460190773,
0.0991201177239418,
0.03702859580516815,
-0.13544563949108124,
0.0189223513007164,
0.06155338138341904,
-0.1259951889514923,
0.002197730354964733,
0.14538390934467316,
-0.03434816002845764,
0.06094251200556755,
0.03114289976656437,
-0.148182675242424,
-0.0031924350187182426,
0.029010288417339325,
-0.03708065673708916,
-0.06904306262731552,
-0.010246618650853634,
-0.05643734335899353,
0.16590487957000732,
0.20745010673999786,
-0.029552463442087173,
0.012404218316078186,
-0.08422175794839859,
0.02193223312497139,
0.049104999750852585,
0.05763004347681999,
-0.03934469446539879,
-0.21612654626369476,
0.02154294215142727,
0.07260294258594513,
-0.0025369899813085794,
-0.19568760693073273,
-0.09594545513391495,
0.04205050691962242,
-0.03580727055668831,
-0.04635120928287506,
0.09167402982711792,
0.02490544691681862,
0.037487804889678955,
-0.01906224898993969,
-0.1138903796672821,
-0.027783002704381943,
0.14543525874614716,
-0.1761440485715866,
-0.04290812462568283
] |
null | null |
transformers
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
- exact_match: 64.02081362346263
- f1: 75.36439229517165
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
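### Usage sketch

The exact-match and F1 scores above come from SQuAD-style evaluation; the snippet below only shows how to query the published checkpoint with the standard `question-answering` pipeline. The question/context pair is made up for illustration and is not taken from the evaluation set.

```python
from transformers import pipeline

# Load the published checkpoint for extractive QA.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42",
)

# Illustrative inputs only.
pred = qa(
    question="How many training examples were used?",
    context="The checkpoint was fine-tuned on 1024 examples sampled from SQuAD.",
)
print(pred["answer"], pred["score"])
```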
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
{'exact_match': 64.02081362346263, 'f1': 75.36439229517165}
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
[
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10",
"### Training results\n\n{'exact_match': 64.02081362346263, 'f1': 75.36439229517165}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10",
"### Training results\n\n{'exact_match': 64.02081362346263, 'f1': 75.36439229517165}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
46,
56,
6,
12,
8,
3,
105,
34,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10### Training results\n\n{'exact_match': 64.02081362346263, 'f1': 75.36439229517165}### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
-0.10709252953529358,
0.09730905294418335,
-0.002101187827065587,
0.09354621171951294,
0.1368592530488968,
0.03701057285070419,
0.10579690337181091,
0.11050187051296234,
-0.10211030393838882,
0.07659182697534561,
0.06730978935956955,
0.03477101027965546,
0.05278867110610008,
0.11574182659387589,
-0.022511033341288567,
-0.26376551389694214,
-0.005049781873822212,
0.00597648648545146,
-0.11629259586334229,
0.11273546516895294,
0.1083906963467598,
-0.09721342474222183,
0.07631727308034897,
0.017353862524032593,
-0.16293108463287354,
0.03451918438076973,
-0.02309415303170681,
-0.02417457103729248,
0.10952084511518478,
0.013205391354858875,
0.11893612891435623,
0.019446833059191704,
0.13812430202960968,
-0.20038829743862152,
0.005195613019168377,
0.08814048767089844,
0.040516458451747894,
0.09671099483966827,
0.07046946883201599,
0.008865226060152054,
0.10875992476940155,
-0.1391628533601761,
0.07370320707559586,
0.03740094602108002,
-0.09730739891529083,
-0.15397733449935913,
-0.09758220613002777,
0.06668798625469208,
0.06498172879219055,
0.07917049527168274,
-0.000743347336538136,
0.13463899493217468,
-0.051201265305280685,
0.0874801054596901,
0.24365046620368958,
-0.30128952860832214,
-0.08567409962415695,
0.053075894713401794,
0.06140807271003723,
0.04527641460299492,
-0.11378519982099533,
-0.0011501444969326258,
0.03302106633782387,
0.026381926611065865,
0.08694107085466385,
-0.021118776872754097,
-0.11998628079891205,
-0.014059100300073624,
-0.1277374029159546,
0.011308600194752216,
0.10122934728860855,
0.05526898428797722,
-0.045569341629743576,
-0.06947499513626099,
-0.0694560632109642,
-0.09003804624080658,
-0.011206120252609253,
-0.06652999669313431,
0.045831143856048584,
-0.055661968886852264,
-0.07576120644807816,
-0.05284503847360611,
-0.06643404811620712,
-0.0813450962305069,
-0.013264649547636509,
0.15752357244491577,
0.030031684786081314,
0.01959240436553955,
-0.025804094970226288,
0.11283406615257263,
0.0037896085996180773,
-0.13486889004707336,
-0.0036544068716466427,
0.003169640898704529,
-0.1137247234582901,
-0.051500432193279266,
-0.053295861929655075,
0.00012925939518027008,
-0.0059014856815338135,
0.1595398634672165,
-0.09652407467365265,
0.0773385614156723,
0.033096835017204285,
-0.018983738496899605,
-0.02812841720879078,
0.14163193106651306,
-0.08022944629192352,
-0.04183351621031761,
-0.023281754925847054,
0.08328350633382797,
-0.007306980900466442,
-0.00786706805229187,
-0.06090111285448074,
-0.04520125687122345,
0.08919142931699753,
0.04844329506158829,
-0.04545855522155762,
0.042000144720077515,
-0.020220236852765083,
-0.03876075893640518,
0.014534922316670418,
-0.12103579193353653,
0.039632152765989304,
0.004277696367353201,
-0.10578778386116028,
-0.012643546797335148,
0.017760299146175385,
-0.00250156014226377,
-0.01408575102686882,
0.08623049408197403,
-0.09718737751245499,
0.0008786096004769206,
-0.07901671528816223,
-0.0915873721241951,
-0.0012136095901951194,
-0.1255333125591278,
-0.010407686233520508,
-0.04759453982114792,
-0.16826583445072174,
-0.050060924142599106,
0.04099511727690697,
-0.07152190804481506,
-0.031542934477329254,
-0.028286408632993698,
-0.0757487490773201,
0.013508803211152554,
-0.0043103969655931,
0.19716478884220123,
-0.03929326310753822,
0.06774090975522995,
0.009961205534636974,
0.038732051849365234,
0.011310549452900887,
0.032971955835819244,
-0.08482160419225693,
0.035072144120931625,
-0.12939739227294922,
0.07508350163698196,
-0.09281529486179352,
0.0037634088657796383,
-0.13249044120311737,
-0.1036902368068695,
0.00905323214828968,
-0.00969064049422741,
0.08917403966188431,
0.11953701078891754,
-0.20004291832447052,
-0.015350252389907837,
0.11640243232250214,
-0.07175039499998093,
-0.07791230082511902,
0.05620298534631729,
-0.04532002657651901,
0.03748335316777229,
0.04172191396355629,
0.15764914453029633,
0.09921679645776749,
-0.1434449404478073,
-0.02358270436525345,
0.01232914812862873,
0.05183987691998482,
0.040440190583467484,
0.03985310345888138,
-0.0010317983105778694,
0.04930783063173294,
0.006170037668198347,
-0.10638012737035751,
-0.03146824613213539,
-0.10012169927358627,
-0.07097877562046051,
-0.05392971634864807,
-0.09153292328119278,
0.03828269988298416,
0.030588772147893906,
0.04071468859910965,
-0.060936443507671356,
-0.11991863697767258,
0.1231388971209526,
0.11676405370235443,
-0.05215391889214516,
0.021837998181581497,
-0.07521016150712967,
-0.005270545836538076,
0.016039211302995682,
-0.03671892732381821,
-0.21093754470348358,
-0.1481400579214096,
0.027014704421162605,
-0.057097066193819046,
0.037056814879179,
0.028366852551698685,
0.08655623346567154,
0.05856042727828026,
-0.051818378269672394,
-0.00991850346326828,
-0.09012717008590698,
-0.004340906161814928,
-0.10015524923801422,
-0.20120836794376373,
-0.10155434161424637,
-0.01823018677532673,
0.14813658595085144,
-0.2179170846939087,
0.0055129737593233585,
-0.0000751353072701022,
0.15074923634529114,
0.005966576281934977,
-0.057388897985219955,
-0.010910376906394958,
0.036043357104063034,
0.0008286294178105891,
-0.0959918200969696,
0.0417105071246624,
0.00648757116869092,
-0.09765831381082535,
-0.06118408963084221,
-0.1477392464876175,
-0.005581910256296396,
0.05609998479485512,
0.03808407485485077,
-0.11305548250675201,
-0.029518060386180878,
-0.06731458008289337,
-0.05015729367733002,
-0.0591670498251915,
0.0032119080424308777,
0.1819952428340912,
0.025697268545627594,
0.10426173359155655,
-0.056324552744627,
-0.07310685515403748,
-0.015117165632545948,
0.018932083621621132,
0.008135262876749039,
0.09353628754615784,
0.04488670453429222,
-0.07871904969215393,
0.07934556901454926,
0.06924475729465485,
-0.055934444069862366,
0.1282336711883545,
-0.05189156159758568,
-0.0802483931183815,
-0.034348227083683014,
-0.004396404139697552,
-0.018006302416324615,
0.1220862939953804,
-0.0328902006149292,
0.014902994967997074,
0.029665915295481682,
0.019124895334243774,
0.02797410450875759,
-0.16769886016845703,
0.003116765059530735,
0.022857841104269028,
-0.04955001175403595,
-0.024604566395282745,
-0.020379746332764626,
0.03523917496204376,
0.09086693823337555,
0.01835392601788044,
-0.005741009954363108,
0.005656504072248936,
-0.009947500191628933,
-0.07125944644212723,
0.20714066922664642,
-0.09562423825263977,
-0.09394211322069168,
-0.10671623051166534,
0.0566268190741539,
-0.06624855846166611,
-0.03748701885342598,
-0.009219340980052948,
-0.08471456170082092,
-0.04129723086953163,
-0.07116411626338959,
0.004057013429701328,
-0.01889394409954548,
-0.0027608161326497793,
0.03292762488126755,
-0.0032227844931185246,
0.10515540093183517,
-0.1458021104335785,
0.024282807484269142,
-0.037297189235687256,
-0.12225872278213501,
-0.00990310125052929,
0.07352147251367569,
0.09560242295265198,
0.10406971722841263,
-0.017411349341273308,
0.01847533881664276,
-0.02021498791873455,
0.23632802069187164,
-0.06299155205488205,
0.018611103296279907,
0.12077263742685318,
-0.008794684894382954,
0.048239726573228836,
0.0919986143708229,
0.044812846928834915,
-0.08905663341283798,
0.023135242983698845,
0.1024496853351593,
-0.021494891494512558,
-0.24558280408382416,
-0.02582133188843727,
-0.003038461785763502,
-0.07996485382318497,
0.09841464459896088,
0.02223026752471924,
-0.015368839725852013,
0.06085249409079552,
-0.004583916161209345,
0.013396530412137508,
-0.03640693053603172,
0.060132093727588654,
0.08100009709596634,
0.051801178604364395,
0.11576827615499496,
-0.017152998596429825,
-0.02020244300365448,
0.0520084984600544,
0.00766667490825057,
0.25181591510772705,
-0.04100002348423004,
0.11025843769311905,
0.03772785887122154,
0.14441365003585815,
-0.033548690378665924,
0.06122744828462601,
0.00830170139670372,
-0.020364301279187202,
0.015618357807397842,
-0.06374453008174896,
-0.010191740468144417,
0.014754361473023891,
-0.03444354608654976,
0.04228091239929199,
-0.076856330037117,
0.04997139051556587,
0.018897274509072304,
0.2780091464519501,
0.02863999642431736,
-0.2713560163974762,
-0.08536460250616074,
-0.011661466211080551,
-0.01906885951757431,
-0.07141309976577759,
-0.00885463785380125,
0.13126061856746674,
-0.13368871808052063,
0.0664854496717453,
-0.06738028675317764,
0.09553420543670654,
-0.023010337725281715,
0.0001486793626099825,
0.07539528608322144,
0.15664994716644287,
-0.012535428628325462,
0.06396820396184921,
-0.18953894078731537,
0.2462725192308426,
0.0201544351875782,
0.10757219791412354,
-0.04737916216254234,
0.021484078839421272,
0.01668829284608364,
0.030538568273186684,
0.0908123031258583,
0.0021299454383552074,
-0.05279276520013809,
-0.15107879042625427,
-0.06826862692832947,
0.04024331644177437,
0.14358100295066833,
-0.04321378096938133,
0.09536667168140411,
-0.03653721511363983,
0.008719084784388542,
0.046352941542863846,
-0.054788440465927124,
-0.16817957162857056,
-0.08267408609390259,
0.011193085461854935,
-0.007186434231698513,
-0.023787563666701317,
-0.06539522111415863,
-0.10172253847122192,
-0.03519841656088829,
0.14078769087791443,
-0.024918051436543465,
-0.049658600240945816,
-0.14424993097782135,
0.1056993305683136,
0.16676375269889832,
-0.06517446041107178,
0.012922178022563457,
0.02046990767121315,
0.11846526712179184,
0.04227312281727791,
-0.08665939420461655,
0.06206470727920532,
-0.0661773681640625,
-0.17410415410995483,
-0.03461272642016411,
0.1327686458826065,
0.08059556782245636,
0.04264715313911438,
-0.0015801142435520887,
0.04089173674583435,
0.008280586451292038,
-0.10074224323034286,
0.02536151558160782,
-0.023618826642632484,
0.031034620478749275,
0.04385564476251602,
-0.08037224411964417,
0.058399587869644165,
-0.03915363550186157,
0.00792076624929905,
0.10490132868289948,
0.20997269451618195,
-0.10477736592292786,
0.04440606012940407,
0.06548099219799042,
-0.0848456397652626,
-0.16533713042736053,
0.08551112562417984,
0.15214598178863525,
0.005831976421177387,
0.07907998561859131,
-0.21529950201511383,
0.14889246225357056,
0.12969960272312164,
-0.013695010915398598,
0.05389077588915825,
-0.2757767140865326,
-0.14134936034679413,
0.08482358604669571,
0.10436765849590302,
0.0003445317561272532,
-0.14952826499938965,
-0.0444999597966671,
-0.015348202548921108,
-0.17044812440872192,
0.13964369893074036,
-0.09038202464580536,
0.10204680263996124,
-0.0058442880399525166,
0.09939739853143692,
0.03142663463950157,
-0.023626897484064102,
0.1339363157749176,
0.06323855370283127,
0.0903211310505867,
-0.0413094088435173,
0.0021258227061480284,
0.0800706073641777,
-0.0526922233402729,
0.026354635134339333,
-0.017182258889079094,
0.06649770587682724,
-0.13603109121322632,
0.007353374268859625,
-0.09189099073410034,
0.07513067871332169,
-0.06815623492002487,
-0.05444274842739105,
-0.02834758721292019,
0.05465511232614517,
0.03902420401573181,
-0.03505545109510422,
0.04333033412694931,
0.007983529940247536,
0.1325104683637619,
0.11024180054664612,
0.08628682792186737,
0.00035392690915614367,
-0.09787295013666153,
0.019521037116646767,
0.009756888262927532,
0.054544977843761444,
-0.0795716717839241,
0.023056993260979652,
0.14500802755355835,
0.07129331678152084,
0.10494217276573181,
0.04367876797914505,
-0.04643935710191727,
-0.010345779359340668,
0.025921136140823364,
-0.1200595274567604,
-0.1296454817056656,
0.002731884364038706,
-0.07355017960071564,
-0.14888343214988708,
0.029521645978093147,
0.11849573254585266,
-0.03329910337924957,
-0.009657331742346287,
-0.00651159230619669,
0.0067899879068136215,
-0.024329014122486115,
0.20290106534957886,
0.05669889599084854,
0.07279320061206818,
-0.08547516167163849,
0.10699985921382904,
0.04575938358902931,
-0.04361989349126816,
0.03572433069348335,
0.07956363260746002,
-0.08028127253055573,
0.003958493005484343,
0.061974409967660904,
0.11156284064054489,
-0.08597882091999054,
-0.02632201462984085,
-0.08990051597356796,
-0.10125900059938431,
0.03231891989707947,
0.14653842151165009,
0.04835387319326401,
-0.0070779891684651375,
-0.03499235212802887,
0.04065345600247383,
-0.13353756070137024,
0.06636104732751846,
0.06108270585536957,
0.0877632424235344,
-0.12573930621147156,
0.1671285629272461,
0.004891699180006981,
0.029385915026068687,
-0.01543651707470417,
0.025487683713436127,
-0.10665372759103775,
-0.034085649996995926,
-0.1226997897028923,
-0.03007560595870018,
-0.01363105047494173,
0.0004001292400062084,
-0.017214259132742882,
-0.07242199033498764,
-0.05035527050495148,
0.04864783585071564,
-0.07335411757230759,
-0.058280427008867264,
0.01710026152431965,
0.03117927722632885,
-0.16052575409412384,
0.0038598249666392803,
0.03271059691905975,
-0.09081193059682846,
0.0754004493355751,
0.06685853749513626,
0.04022107645869255,
0.033801134675741196,
-0.12955690920352936,
-0.02475569397211075,
-0.007714908570051193,
0.017137013375759125,
0.06918644160032272,
-0.10563315451145172,
-0.0178622305393219,
-0.04266852140426636,
0.06935140490531921,
0.007917521521449089,
0.07983498275279999,
-0.11815103143453598,
-0.006528399884700775,
-0.04948867857456207,
-0.0289005097001791,
-0.06501986086368561,
0.04328880459070206,
0.10995073616504669,
0.055856961756944656,
0.15289343893527985,
-0.0687728002667427,
0.04354479908943176,
-0.22217310965061188,
-0.02990412525832653,
-0.00297369621694088,
-0.03946691006422043,
-0.04405780881643295,
-0.03199080377817154,
0.09110818803310394,
-0.06241808086633682,
0.09327846020460129,
-0.019693342968821526,
0.10123893618583679,
0.04523419588804245,
-0.010787888430058956,
-0.06058419868350029,
0.01309781800955534,
0.17799034714698792,
0.050338976085186005,
-0.0049431356601417065,
0.10968314111232758,
0.009625494480133057,
0.030542811378836632,
0.04328892007470131,
0.23462118208408356,
0.1311374455690384,
-0.06987270712852478,
0.06444678455591202,
0.09101908653974533,
-0.11121797561645508,
-0.11910834908485413,
0.11183715611696243,
-0.03576330840587616,
0.09673692286014557,
-0.05854790657758713,
0.17100924253463745,
0.061237167567014694,
-0.18738165497779846,
0.060386743396520615,
-0.051537372171878815,
-0.12285060435533524,
-0.12146645784378052,
-0.0037900605238974094,
-0.07251021265983582,
-0.11781012266874313,
0.025262901559472084,
-0.13550500571727753,
0.06245312839746475,
0.12461730092763901,
0.011382518336176872,
0.028338687494397163,
0.18851254880428314,
-0.05049467086791992,
0.004528720863163471,
0.06035536527633667,
0.021262140944600105,
0.005442644935101271,
-0.045560602098703384,
-0.059832584112882614,
0.0496281199157238,
0.03721284121274948,
0.06122637167572975,
-0.06880366057157516,
-0.0066299340687692165,
0.014925449155271053,
-0.002029101364314556,
-0.07002445310354233,
0.008210570551455021,
0.012576127424836159,
0.043789394199848175,
0.03769499436020851,
0.0419100858271122,
0.018919654190540314,
-0.060926344245672226,
0.29404574632644653,
-0.07095380127429962,
-0.08290513604879379,
-0.13640248775482178,
0.20072057843208313,
0.018462063744664192,
-0.018587937578558922,
0.06735577434301376,
-0.09643853455781937,
-0.00430376548320055,
0.1489730030298233,
0.14251691102981567,
-0.09100357443094254,
-0.021555442363023758,
-0.01659158617258072,
-0.01183544471859932,
-0.022539036348462105,
0.1176198422908783,
0.0890941470861435,
0.01544789969921112,
-0.06739477813243866,
-0.0100306561216712,
-0.0191311314702034,
-0.043689895421266556,
-0.06724943220615387,
0.07752656191587448,
0.03204108029603958,
0.0066177453845739365,
-0.033798616379499435,
0.07219100743532181,
0.0010364762274548411,
-0.23450273275375366,
0.04723648726940155,
-0.16943144798278809,
-0.1816864311695099,
-0.04095715656876564,
0.07046559453010559,
-0.009848436340689659,
0.061762139201164246,
-0.004639483988285065,
-0.008768124505877495,
0.12293685227632523,
-0.008720796555280685,
-0.026575636118650436,
-0.13636833429336548,
0.10592801123857498,
-0.11691164970397949,
0.21526412665843964,
-0.01878557913005352,
0.06470894068479538,
0.09920880943536758,
0.019159525632858276,
-0.13488730788230896,
0.01308837067335844,
0.06811632215976715,
-0.12810753285884857,
0.018409354612231255,
0.14280414581298828,
-0.037465423345565796,
0.07022960484027863,
0.02683143876492977,
-0.16735567152500153,
-0.008770459331572056,
0.022393235936760902,
-0.027004199102520943,
-0.07361938804388046,
-0.004661316052079201,
-0.06774649769067764,
0.16120636463165283,
0.2392021119594574,
-0.036505017429590225,
0.007457788567990065,
-0.09119325131177902,
0.012881084345281124,
0.05387165769934654,
0.10484284162521362,
-0.03316648676991463,
-0.22097431123256683,
0.029493751004338264,
0.0363537073135376,
0.010500257834792137,
-0.20020852982997894,
-0.08068457990884781,
0.054914142936468124,
-0.05557232350111008,
-0.02492726780474186,
0.0997740626335144,
0.0542256124317646,
0.053301215171813965,
-0.022610504180192947,
-0.10271554440259933,
-0.041544560343027115,
0.1502676010131836,
-0.17518717050552368,
-0.046219535171985626
] |
null | null |
transformers
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
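### Usage sketch

A lower-level loading sketch, assuming only that the checkpoint follows the standard `AutoModelForQuestionAnswering` interface; the question and context below are illustrative, not part of the card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_id = "anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

# Illustrative inputs only.
question = "What dataset was the model fine-tuned on?"
context = "This model is a fine-tuned version of SpanBERT on the SQuAD dataset."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Decode the highest-scoring start/end span (batch size 1, so a flat argmax is safe).
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```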
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09602359682321548,
0.11403444409370422,
-0.002287903567776084,
0.09204193949699402,
0.12012409418821335,
0.02230750396847725,
0.10095051676034927,
0.1280750334262848,
-0.09618483483791351,
0.08673408627510071,
0.08719637244939804,
0.03934064134955406,
0.047412674874067307,
0.14536984264850616,
-0.019161295145750046,
-0.25941383838653564,
0.010798902250826359,
-0.003591503482311964,
-0.03308134526014328,
0.11174536496400833,
0.08519860357046127,
-0.11032954603433609,
0.08588489145040512,
0.014609341509640217,
-0.15494251251220703,
0.019969336688518524,
-0.03762827813625336,
-0.033781230449676514,
0.1132134273648262,
-0.033547040075063705,
0.10833442956209183,
0.02527593821287155,
0.13465943932533264,
-0.20926938951015472,
0.004998847376555204,
0.07325180619955063,
0.045367252081632614,
0.10020134598016739,
0.05219171941280365,
0.015646230429410934,
0.09011658281087875,
-0.1527780443429947,
0.09277531504631042,
0.02972399815917015,
-0.09148178994655609,
-0.12913408875465393,
-0.09670611470937729,
0.025290174409747124,
0.05337013304233551,
0.06970780342817307,
0.0008215959533117712,
0.15086646378040314,
-0.06032782793045044,
0.07915542274713516,
0.2668960392475128,
-0.32730069756507874,
-0.0644877478480339,
0.0339968241751194,
0.06122317910194397,
0.052840735763311386,
-0.12353474646806717,
-0.006582735572010279,
0.027604447677731514,
0.029910346493124962,
0.11724846810102463,
-0.016706988215446472,
-0.11299920827150345,
-0.01340639777481556,
-0.1286354809999466,
-0.0005270008696243167,
0.07106462866067886,
0.035989243537187576,
-0.051993705332279205,
-0.09523230791091919,
-0.07534641027450562,
-0.09351636469364166,
-0.02443019673228264,
-0.06496067345142365,
0.05679786205291748,
-0.054892685264348984,
-0.08000554889440536,
-0.03621389716863632,
-0.056941088289022446,
-0.07671207934617996,
-0.017687112092971802,
0.15557049214839935,
0.0399353988468647,
0.020713917911052704,
-0.03243628889322281,
0.10865380614995956,
0.0029707103967666626,
-0.14145852625370026,
-0.015508178621530533,
-0.0009906506165862083,
-0.09720853716135025,
-0.047159675508737564,
-0.050719719380140305,
-0.017361357808113098,
0.01001316774636507,
0.17587298154830933,
-0.08011547476053238,
0.07626672834157944,
0.010198576375842094,
-0.02948307991027832,
-0.00655368110165,
0.1469932496547699,
-0.043318070471286774,
-0.046236053109169006,
-0.01106297504156828,
0.07384166121482849,
0.0020614182576537132,
-0.014606200158596039,
-0.06489357352256775,
-0.02713019587099552,
0.1029982641339302,
0.04612622782588005,
-0.059944137930870056,
0.03980008140206337,
-0.023129602894186974,
-0.027964025735855103,
0.01748640090227127,
-0.11540838330984116,
0.04450107738375664,
-0.0023382024373859167,
-0.08431589603424072,
-0.002823854563757777,
0.00038606609450653195,
-0.005609926767647266,
-0.008071525022387505,
0.10987528413534164,
-0.09967470169067383,
-0.0026782590430229902,
-0.06424310058355331,
-0.08296934515237808,
0.009056736715137959,
-0.15698975324630737,
-0.015644699335098267,
-0.05659773200750351,
-0.1710270494222641,
-0.0333654060959816,
0.036522481590509415,
-0.07449010759592056,
-0.00896498467773199,
-0.049424394965171814,
-0.06615936011075974,
0.02493048645555973,
-0.014162139967083931,
0.17542970180511475,
-0.053337711840867996,
0.07235673815011978,
-0.00020801776554435492,
0.04595017805695534,
0.014947397634387016,
0.035819556564092636,
-0.10456864535808563,
0.02527010068297386,
-0.1359810084104538,
0.06867338716983795,
-0.0852123275399208,
-0.002220617840066552,
-0.1334993690252304,
-0.09794629365205765,
0.00971130095422268,
-0.022223303094506264,
0.09033747762441635,
0.13901309669017792,
-0.19340312480926514,
-0.017190484330058098,
0.12793606519699097,
-0.07585535198450089,
-0.06400597095489502,
0.06143057718873024,
-0.061053935438394547,
0.030595680698752403,
0.05173148214817047,
0.21059080958366394,
0.03921876847743988,
-0.1670125275850296,
-0.0338839516043663,
-0.0070694913156330585,
0.04041451960802078,
0.02609335072338581,
0.03966672345995903,
0.004840199835598469,
0.06508847326040268,
0.014146591536700726,
-0.07583948969841003,
-0.03301706165075302,
-0.09135212004184723,
-0.06492733210325241,
-0.054762788116931915,
-0.07229705899953842,
0.04051564261317253,
0.003898506984114647,
0.04218365624547005,
-0.06499538570642471,
-0.10113795101642609,
0.11947987228631973,
0.0969177782535553,
-0.04764465615153313,
0.036394163966178894,
-0.07934512943029404,
0.01964837685227394,
-0.021320408210158348,
-0.03908552601933479,
-0.20684917271137238,
-0.1309492439031601,
0.05219730734825134,
-0.05594620108604431,
0.03351713716983795,
0.00753630930557847,
0.08170084655284882,
0.06078125908970833,
-0.043589163571596146,
-0.012656508944928646,
-0.09371501952409744,
0.0030557725112885237,
-0.11776850372552872,
-0.18885715305805206,
-0.07859326899051666,
-0.04070039466023445,
0.09350238740444183,
-0.17500653862953186,
-0.006867642048746347,
0.015540390275418758,
0.14391686022281647,
0.027060145512223244,
-0.0686083659529686,
-0.0029753188136965036,
0.03751236945390701,
0.002481995150446892,
-0.09547087550163269,
0.04511179402470589,
0.007554813753813505,
-0.09198254346847534,
-0.06323914229869843,
-0.13634318113327026,
-0.012043188326060772,
0.05972624570131302,
0.05408328399062157,
-0.09744595736265182,
-0.04610829055309296,
-0.07073606550693512,
-0.04073692485690117,
-0.0756063312292099,
0.013122432865202427,
0.20066212117671967,
0.03496741130948067,
0.11218766868114471,
-0.06657087802886963,
-0.0779109075665474,
-0.0030181468464434147,
0.02298414707183838,
0.012553134933114052,
0.07708731293678284,
0.04122301563620567,
-0.05476631969213486,
0.07486360520124435,
0.09923520684242249,
-0.02193814143538475,
0.12439640611410141,
-0.04687870666384697,
-0.0840546265244484,
-0.033145543187856674,
-0.023399565368890762,
-0.028603101149201393,
0.12339998036623001,
-0.03941163420677185,
0.0047925966791808605,
0.0359947495162487,
0.044663988053798676,
0.01722012273967266,
-0.16179829835891724,
0.008351861499249935,
0.02135860174894333,
-0.05307597666978836,
-0.037055741995573044,
-0.000826735922601074,
0.026527704671025276,
0.09191633760929108,
0.03143337741494179,
-0.013888811692595482,
0.0027184158097952604,
-0.011940671131014824,
-0.06176424026489258,
0.1851557493209839,
-0.09776317328214645,
-0.08321300148963928,
-0.07532214373350143,
0.004849317949265242,
-0.059247978031635284,
-0.036293040961027145,
0.0162500012665987,
-0.08858539909124374,
-0.03882497549057007,
-0.08719966560602188,
-0.01857634261250496,
-0.01859932951629162,
0.019822461530566216,
0.03125438839197159,
-0.022106127813458443,
0.08007000386714935,
-0.1398768424987793,
0.0016920219641178846,
-0.052263617515563965,
-0.09334983676671982,
-0.0004939445643685758,
0.07463383674621582,
0.09811413288116455,
0.07966348528862,
-0.01641707494854927,
0.029864130541682243,
-0.0341421402990818,
0.24236136674880981,
-0.045942068099975586,
0.010691838338971138,
0.10391727089881897,
-0.012803320772945881,
0.056379284709692,
0.09563601016998291,
0.03704984113574028,
-0.09433124214410782,
0.020738180726766586,
0.08320894092321396,
-0.028927577659487724,
-0.23048120737075806,
-0.025375349447131157,
-0.004482151474803686,
-0.08006966859102249,
0.10606112331151962,
0.032201431691646576,
-0.03730250522494316,
0.04585708677768707,
0.020910456776618958,
0.0012864315649494529,
-0.05498944967985153,
0.08181736618280411,
0.07484336942434311,
0.05646544694900513,
0.10011260211467743,
-0.008800610899925232,
-0.028047075495123863,
0.06085721403360367,
0.007950816303491592,
0.24882981181144714,
-0.024625860154628754,
0.0995255559682846,
0.03301096335053444,
0.15109184384346008,
-0.02724761888384819,
0.06545720249414444,
0.0033421057742089033,
-0.009703920222818851,
-0.014676177874207497,
-0.06673465669155121,
-0.02419653721153736,
0.023317761719226837,
-0.046890947967767715,
0.029453570023179054,
-0.08171006292104721,
0.025213835760951042,
0.02734048292040825,
0.28025713562965393,
0.035002004355192184,
-0.274269163608551,
-0.06585120409727097,
-0.013311351649463177,
-0.04227099567651749,
-0.06407379359006882,
0.0057271006517112255,
0.11969728767871857,
-0.13298916816711426,
0.06465829163789749,
-0.07583650946617126,
0.0900479331612587,
-0.037446193397045135,
0.010768186300992966,
0.04645361378788948,
0.15289826691150665,
-0.01821131259202957,
0.05099192634224892,
-0.1850925236940384,
0.2429511547088623,
0.024744266644120216,
0.1080249547958374,
-0.06416165828704834,
0.010363471694290638,
0.018697619438171387,
0.006623168475925922,
0.11019129306077957,
0.0012009484926238656,
-0.06843778491020203,
-0.13768291473388672,
-0.09954261034727097,
0.04696739837527275,
0.14218740165233612,
-0.034935835748910904,
0.09952692687511444,
-0.027761677280068398,
0.012435238808393478,
0.03393643721938133,
-0.02978694811463356,
-0.15719762444496155,
-0.07308056205511093,
0.01001923531293869,
0.02650831826031208,
-0.015459554269909859,
-0.051490575075149536,
-0.10420314967632294,
-0.03860589489340782,
0.11911201477050781,
0.0036169178783893585,
-0.04585646092891693,
-0.15072131156921387,
0.08426423370838165,
0.1460174024105072,
-0.058126747608184814,
0.015574075281620026,
0.014627117663621902,
0.11122171580791473,
0.03258290886878967,
-0.08563894033432007,
0.06724783033132553,
-0.053632449358701706,
-0.17411033809185028,
-0.05801774561405182,
0.11937690526247025,
0.0800362080335617,
0.04519207030534744,
0.0011909090681001544,
0.05772159993648529,
0.001406815485097468,
-0.09718676656484604,
0.03756730630993843,
0.004228139296174049,
0.0521480068564415,
0.02849463000893593,
-0.08547045290470123,
0.07675771415233612,
-0.034450042992830276,
0.017967721447348595,
0.12842291593551636,
0.23301860690116882,
-0.09889925271272659,
0.10259053111076355,
0.0799330621957779,
-0.07688435167074203,
-0.1585405319929123,
0.06137142330408096,
0.12594719231128693,
0.004197145812213421,
0.08413106203079224,
-0.19981275498867035,
0.1344214528799057,
0.10642403364181519,
-0.013932344503700733,
0.021208012476563454,
-0.2707490921020508,
-0.13155829906463623,
0.06542593240737915,
0.11013205349445343,
0.04936648905277252,
-0.12262903153896332,
-0.03535997495055199,
-0.010070516727864742,
-0.12094859033823013,
0.12835894525051117,
-0.07673697918653488,
0.11719129979610443,
-0.02181944251060486,
0.12293810397386551,
0.024189632385969162,
-0.03719676285982132,
0.11224839836359024,
0.07193931937217712,
0.08588811755180359,
-0.03965448588132858,
-0.003293768037110567,
0.0648765116930008,
-0.06250230967998505,
0.03598310425877571,
-0.03699520230293274,
0.0627206489443779,
-0.1481391340494156,
0.006865167524665594,
-0.07877637445926666,
0.06000366806983948,
-0.04680624604225159,
-0.06500281393527985,
-0.027147997170686722,
0.04750816896557808,
0.07274003326892853,
-0.03590485453605652,
0.044939715415239334,
0.008574707433581352,
0.0922415629029274,
0.1005576029419899,
0.07333541661500931,
-0.022195708006620407,
-0.08281805366277695,
0.013462990522384644,
0.004370127338916063,
0.04686139523983002,
-0.08500370383262634,
0.01507571805268526,
0.14623799920082092,
0.06024520844221115,
0.10227334499359131,
0.04666744917631149,
-0.04393574222922325,
0.005751105956733227,
0.017087062820792198,
-0.14230717718601227,
-0.10045989602804184,
0.0284141693264246,
-0.05808994546532631,
-0.15436583757400513,
0.03360128402709961,
0.1226072609424591,
-0.037293121218681335,
-0.01660998724400997,
-0.006703418679535389,
0.009525066241621971,
-0.011396746151149273,
0.18518418073654175,
0.04204562306404114,
0.05498350411653519,
-0.09131846576929092,
0.11355132609605789,
0.03610886633396149,
-0.042228132486343384,
0.054247502237558365,
0.06828612089157104,
-0.0991193875670433,
0.013037599623203278,
0.07337064296007156,
0.15001800656318665,
-0.06594546139240265,
-0.012340625748038292,
-0.09138315171003342,
-0.0767349824309349,
0.044620953500270844,
0.14459873735904694,
0.05313485860824585,
-0.0052820127457380295,
-0.060028981417417526,
0.036056455224752426,
-0.11861133575439453,
0.06805030256509781,
0.05241674184799194,
0.08236395567655563,
-0.10844319313764572,
0.12338896095752716,
-0.00734080420807004,
0.023945920169353485,
-0.028061510995030403,
0.01882787048816681,
-0.10083997994661331,
-0.03458574786782265,
-0.10729243606328964,
-0.014467097818851471,
-0.017896797508001328,
-0.0034967223182320595,
-0.019866744056344032,
-0.07521750032901764,
-0.042639438062906265,
0.03328600898385048,
-0.07712841033935547,
-0.048938751220703125,
0.017649859189987183,
0.04005033150315285,
-0.16087165474891663,
0.0029281761962920427,
0.025967616587877274,
-0.08755463361740112,
0.08729709684848785,
0.06922927498817444,
0.01615118607878685,
0.02851938270032406,
-0.12396485358476639,
-0.033166058361530304,
0.0006036445265635848,
0.009918643161654472,
0.07728192210197449,
-0.09377701580524445,
-0.029898136854171753,
-0.03089229390025139,
0.049126919358968735,
0.01506563276052475,
0.10172645002603531,
-0.1186690703034401,
-0.013909920118749142,
-0.04703284427523613,
-0.03846688196063042,
-0.05701981112360954,
0.026729613542556763,
0.11463794857263565,
0.04526182636618614,
0.15745708346366882,
-0.07009848952293396,
0.055109307169914246,
-0.20471733808517456,
-0.03285461291670799,
0.010785563848912716,
-0.04676811397075653,
-0.0736604556441307,
-0.04443634673953056,
0.08418973535299301,
-0.050679102540016174,
0.1211005374789238,
-0.016166921705007553,
0.09234297275543213,
0.04383096471428871,
-0.003656962886452675,
-0.07162211090326309,
-0.011329824104905128,
0.18290406465530396,
0.05698656290769577,
-0.02119220420718193,
0.1215188279747963,
0.0040638078935444355,
0.04188327491283417,
0.06809653341770172,
0.23453693091869354,
0.15155792236328125,
-0.01212401781231165,
0.07488168776035309,
0.06730151921510696,
-0.07573826611042023,
-0.140097513794899,
0.12197063863277435,
-0.020391253754496574,
0.10563764721155167,
-0.053066354244947433,
0.18945781886577606,
0.03851446881890297,
-0.17596624791622162,
0.05442046746611595,
-0.02568230591714382,
-0.10807112604379654,
-0.1250578761100769,
-0.015601986087858677,
-0.08182837814092636,
-0.11584349721670151,
0.02844526804983616,
-0.12385156005620956,
0.06830892711877823,
0.09606166929006577,
0.007525863125920296,
0.035400837659835815,
0.18418560922145844,
-0.05648822337388992,
0.011748998425900936,
0.07176593691110611,
0.020577020943164825,
-0.0039061878342181444,
-0.03938121348619461,
-0.06586938351392746,
0.03773704916238785,
0.04336928948760033,
0.07099542766809464,
-0.05092169716954231,
0.008807743899524212,
0.014761250466108322,
-0.009901494719088078,
-0.07784368842840195,
0.007925037294626236,
0.014980918727815151,
0.048601217567920685,
0.036900974810123444,
0.04724889621138573,
0.0082479203119874,
-0.05358163267374039,
0.276041179895401,
-0.06787721067667007,
-0.06160334125161171,
-0.12299491465091705,
0.19426767528057098,
0.03337680548429489,
-0.018991868942975998,
0.05551685020327568,
-0.09289601445198059,
-0.012001519091427326,
0.16161750257015228,
0.13467296957969666,
-0.08994685858488083,
-0.021074289456009865,
-0.02364164963364601,
-0.008686423301696777,
-0.012359682470560074,
0.10439932346343994,
0.07137981057167053,
0.0019259589025750756,
-0.06694447994232178,
-0.012805274687707424,
-0.02938486449420452,
-0.04761163517832756,
-0.06263997405767441,
0.05909192934632301,
0.027706684544682503,
-0.0065283696167171,
-0.058292414993047714,
0.06357713788747787,
-0.003986557014286518,
-0.23520193994045258,
0.03783419355750084,
-0.17287085950374603,
-0.1737671047449112,
-0.014048131182789803,
0.07020930200815201,
0.0018710417207330465,
0.056631941348314285,
-0.006559076253324747,
0.009776200167834759,
0.11498554795980453,
-0.01662139594554901,
-0.014038083143532276,
-0.11834955215454102,
0.10924083739519119,
-0.1094801053404808,
0.21268585324287415,
-0.001641039620153606,
0.06447994709014893,
0.09912193566560745,
0.03714968264102936,
-0.13597255945205688,
0.018682172521948814,
0.061845581978559494,
-0.12661334872245789,
0.0020432663150131702,
0.14641274511814117,
-0.03440523147583008,
0.06127898395061493,
0.030796241015195847,
-0.149760901927948,
-0.002777398331090808,
0.02794860117137432,
-0.036670517176389694,
-0.06961273401975632,
-0.008562020026147366,
-0.05621180683374405,
0.16597458720207214,
0.2076471447944641,
-0.02926386334002018,
0.012206965126097202,
-0.08473123610019684,
0.02166404202580452,
0.048080332577228546,
0.05845894664525986,
-0.03915943577885628,
-0.21618425846099854,
0.021930847316980362,
0.0724245235323906,
-0.0027249353006482124,
-0.19537392258644104,
-0.09554769098758698,
0.0422537736594677,
-0.03673449158668518,
-0.04639749601483345,
0.09106805175542831,
0.025066886097192764,
0.03693581372499466,
-0.01918955147266388,
-0.11601366102695465,
-0.027418462559580803,
0.14596427977085114,
-0.1762307584285736,
-0.04254068806767464
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
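For reference, a minimal sketch of the equivalent `transformers` `TrainingArguments`, assuming the standard API; the `output_dir` value is a hypothetical placeholder, everything else comes from the list above:

```python
from transformers import TrainingArguments

# Hedged reconstruction of the configuration above; not the original training script.
training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8",  # hypothetical
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are already the TrainingArguments defaults.
```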
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
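## How to use

A minimal usage sketch, assuming the standard `transformers` question-answering pipeline; the question and context strings are illustrative only:

```python
from transformers import pipeline

# Load this checkpoint into the extractive QA pipeline.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8",
)

# Illustrative SQuAD-style query.
result = qa(
    question="What was the model fine-tuned on?",
    context="The model was fine-tuned on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```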
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-1024-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09651055932044983,
0.11489236354827881,
-0.0023121130652725697,
0.09244512021541595,
0.12019168585538864,
0.022948764264583588,
0.10058419406414032,
0.12800709903240204,
-0.09582960605621338,
0.08633457124233246,
0.08745722472667694,
0.03864460438489914,
0.04735307767987251,
0.14467661082744598,
-0.01930258981883526,
-0.2595095634460449,
0.010338992811739445,
-0.0030823987908661366,
-0.03125064820051193,
0.11128278821706772,
0.08525780588388443,
-0.11038938164710999,
0.08583197742700577,
0.014446195214986801,
-0.15427853167057037,
0.01980714686214924,
-0.03710166737437248,
-0.03429446741938591,
0.1135481670498848,
-0.032938264310359955,
0.10836541652679443,
0.02454250305891037,
0.13471640646457672,
-0.2099476009607315,
0.004805531352758408,
0.07352003455162048,
0.045233841985464096,
0.1002681627869606,
0.051025401800870895,
0.016531724482774734,
0.08917680382728577,
-0.15329067409038544,
0.09299921989440918,
0.029346400871872902,
-0.09131509065628052,
-0.12951180338859558,
-0.09590884298086166,
0.026436040177941322,
0.052775319665670395,
0.0690864771604538,
0.001794564537703991,
0.1524430215358734,
-0.05909525230526924,
0.07952765375375748,
0.2671598792076111,
-0.3270939886569977,
-0.06372502446174622,
0.03298599272966385,
0.061073221266269684,
0.0540500283241272,
-0.12272349745035172,
-0.006406786385923624,
0.02725137770175934,
0.02978338859975338,
0.11777770519256592,
-0.017452389001846313,
-0.11443718522787094,
-0.01354705635458231,
-0.12812121212482452,
-0.0009715126943774521,
0.07154244184494019,
0.03617028146982193,
-0.05220673605799675,
-0.095294289290905,
-0.07603389024734497,
-0.09431635588407516,
-0.024902988225221634,
-0.06533795595169067,
0.056252386420965195,
-0.05453312024474144,
-0.0791691318154335,
-0.03633914887905121,
-0.056455761194229126,
-0.0759180560708046,
-0.017398923635482788,
0.1555422991514206,
0.04005901888012886,
0.020635709166526794,
-0.031674306839704514,
0.10829463601112366,
0.0010953241726383567,
-0.1415235996246338,
-0.015101272612810135,
-0.0007481934735551476,
-0.097306989133358,
-0.04735776409506798,
-0.05095502734184265,
-0.01838734745979309,
0.009620527736842632,
0.1773107647895813,
-0.07975802570581436,
0.07601028680801392,
0.01064213365316391,
-0.02892869897186756,
-0.006154731847345829,
0.14776252210140228,
-0.043995484709739685,
-0.04712778329849243,
-0.010137009434401989,
0.07351084798574448,
0.002941910410299897,
-0.015505814924836159,
-0.06590157002210617,
-0.027884377166628838,
0.10368698835372925,
0.04561476781964302,
-0.05990627035498619,
0.03928975388407707,
-0.02325861155986786,
-0.028162216767668724,
0.018336711451411247,
-0.11525555700063705,
0.04469949007034302,
-0.002391690155491233,
-0.08400660753250122,
-0.00206960691139102,
0.0011026527499780059,
-0.004906964488327503,
-0.008356361649930477,
0.10901074856519699,
-0.09886666387319565,
-0.0022322640288621187,
-0.06386715918779373,
-0.0831521600484848,
0.009393257088959217,
-0.15694214403629303,
-0.014892193488776684,
-0.05777556449174881,
-0.17127260565757751,
-0.03289288282394409,
0.03668834641575813,
-0.07392727583646774,
-0.008671008050441742,
-0.048370614647865295,
-0.06489238888025284,
0.024768101051449776,
-0.014752967283129692,
0.17344848811626434,
-0.05355125293135643,
0.07197415828704834,
-0.00018527092470321804,
0.0455673485994339,
0.01368686929345131,
0.03588765487074852,
-0.10414065420627594,
0.025115296244621277,
-0.1360604465007782,
0.06857398897409439,
-0.08457347750663757,
-0.002112370915710926,
-0.13223733007907867,
-0.09788975119590759,
0.010726094245910645,
-0.021596936509013176,
0.09098321944475174,
0.13832563161849976,
-0.1934223473072052,
-0.01699468120932579,
0.1271265298128128,
-0.07555390894412994,
-0.06420112401247025,
0.06290701031684875,
-0.061225175857543945,
0.03052590601146221,
0.05228348821401596,
0.2104388028383255,
0.04077884927392006,
-0.16706494987010956,
-0.033204928040504456,
-0.005885894875973463,
0.04102808237075806,
0.025060057640075684,
0.03953350707888603,
0.005206963513046503,
0.0655537098646164,
0.014446224085986614,
-0.07477697730064392,
-0.032226357609033585,
-0.09122881293296814,
-0.06531360000371933,
-0.055010441690683365,
-0.07202216237783432,
0.04017854481935501,
0.00452013686299324,
0.042193617671728134,
-0.06480597704648972,
-0.10165861248970032,
0.12095854431390762,
0.09658796340227127,
-0.04798872768878937,
0.03644715994596481,
-0.07896920293569565,
0.020183630287647247,
-0.020573504269123077,
-0.039001014083623886,
-0.20641125738620758,
-0.1297999620437622,
0.05238107591867447,
-0.05581120029091835,
0.03302439674735069,
0.007912230677902699,
0.08106120675802231,
0.06121198087930679,
-0.043066516518592834,
-0.011897648684680462,
-0.09316934645175934,
0.003331569256260991,
-0.11852403730154037,
-0.1877591609954834,
-0.07876364141702652,
-0.04019896313548088,
0.09395509213209152,
-0.17581835389137268,
-0.006340041756629944,
0.0151937585324049,
0.14331337809562683,
0.026683956384658813,
-0.06853298097848892,
-0.002966369967907667,
0.03817187249660492,
0.0022897536400705576,
-0.09575161337852478,
0.045270923525094986,
0.008154802955687046,
-0.09278133511543274,
-0.06425698101520538,
-0.1366974264383316,
-0.011096515692770481,
0.05947359651327133,
0.05322916433215141,
-0.09733697772026062,
-0.04671027138829231,
-0.07022473961114883,
-0.04054996743798256,
-0.07489410042762756,
0.012214883230626583,
0.20153656601905823,
0.03527074679732323,
0.11316312104463577,
-0.06643359363079071,
-0.07732800394296646,
-0.0029428417328745127,
0.022913306951522827,
0.013172069564461708,
0.07615111023187637,
0.04071914404630661,
-0.05266827344894409,
0.07411058992147446,
0.09844939410686493,
-0.022896338254213333,
0.12375755608081818,
-0.04655088111758232,
-0.08362951129674911,
-0.033336203545331955,
-0.023669645190238953,
-0.028396055102348328,
0.12340638041496277,
-0.040918923914432526,
0.004173817113041878,
0.036255426704883575,
0.04424188286066055,
0.0173382256180048,
-0.16174833476543427,
0.008240040391683578,
0.02219218946993351,
-0.05262118577957153,
-0.03613952919840813,
-0.0014599990099668503,
0.02619507908821106,
0.09115926176309586,
0.0310880858451128,
-0.014225911349058151,
0.003394033759832382,
-0.011662881821393967,
-0.062052078545093536,
0.18449148535728455,
-0.0977545976638794,
-0.08455163240432739,
-0.07662785053253174,
0.005214780569076538,
-0.05934656783938408,
-0.03632647916674614,
0.01659066416323185,
-0.08690289407968521,
-0.03865980729460716,
-0.0875859335064888,
-0.020335566252470016,
-0.01797953061759472,
0.019776811823248863,
0.0315500944852829,
-0.022583123296499252,
0.08064797520637512,
-0.13970044255256653,
0.0014710439136251807,
-0.051658131182193756,
-0.0925174355506897,
-0.00008783170051174238,
0.07455488294363022,
0.09894480556249619,
0.08012282848358154,
-0.017140580341219902,
0.02955351211130619,
-0.03390245512127876,
0.24217359721660614,
-0.045449912548065186,
0.01104879379272461,
0.10428223013877869,
-0.013798369094729424,
0.05660329386591911,
0.09510628879070282,
0.037358082830905914,
-0.0942741185426712,
0.020259516313672066,
0.08213873952627182,
-0.029479824006557465,
-0.22964338958263397,
-0.02528994530439377,
-0.004619326908141375,
-0.07988280802965164,
0.10633224993944168,
0.03212898224592209,
-0.0367809496819973,
0.04631250724196434,
0.0210247989743948,
0.0028243842534720898,
-0.05577157437801361,
0.08173523843288422,
0.07550700753927231,
0.056389790028333664,
0.09994923323392868,
-0.008427080698311329,
-0.028097977861762047,
0.061138466000556946,
0.007361532188951969,
0.24682649970054626,
-0.024834323674440384,
0.1004018560051918,
0.031974535435438156,
0.1516903042793274,
-0.027194807305932045,
0.0655914843082428,
0.00307602621614933,
-0.009989495389163494,
-0.014895710162818432,
-0.06692469120025635,
-0.02562810480594635,
0.023873748257756233,
-0.04750958830118179,
0.029830653220415115,
-0.08194892108440399,
0.025900855660438538,
0.027095554396510124,
0.2797911465167999,
0.03479083999991417,
-0.2744758725166321,
-0.0661311224102974,
-0.013589544221758842,
-0.0421350821852684,
-0.06457154452800751,
0.005616558250039816,
0.12053684145212173,
-0.1326657235622406,
0.06420990079641342,
-0.07521988451480865,
0.09012715518474579,
-0.038507018238306046,
0.01112598367035389,
0.04568425193428993,
0.1529356837272644,
-0.01791692152619362,
0.05124792456626892,
-0.18573524057865143,
0.24140821397304535,
0.025077728554606438,
0.10831288993358612,
-0.06452829390764236,
0.01091326680034399,
0.018376031890511513,
0.00828136969357729,
0.10938160866498947,
0.0015527267241850495,
-0.06730934232473373,
-0.1390317976474762,
-0.09991362690925598,
0.04698743298649788,
0.14094853401184082,
-0.03344583138823509,
0.09912402182817459,
-0.027885790914297104,
0.01252071000635624,
0.03419069945812225,
-0.029052693396806717,
-0.1571914702653885,
-0.0737464502453804,
0.009623176418244839,
0.02771325781941414,
-0.015142655000090599,
-0.0512908473610878,
-0.10388519614934921,
-0.04001612216234207,
0.11930135637521744,
0.004398828838020563,
-0.04607471823692322,
-0.1506967395544052,
0.08478119224309921,
0.14527077972888947,
-0.05834414064884186,
0.015489007346332073,
0.014602613635361195,
0.1113986000418663,
0.0323355458676815,
-0.08478730171918869,
0.06724918633699417,
-0.05339569225907326,
-0.17330266535282135,
-0.05837076157331467,
0.1185627281665802,
0.07957139611244202,
0.04533335939049721,
0.0016659238608554006,
0.057451289147138596,
0.001305237878113985,
-0.09723854809999466,
0.03633579611778259,
0.00488761393353343,
0.051272451877593994,
0.028393995016813278,
-0.08554169535636902,
0.07764539122581482,
-0.034114379435777664,
0.01818975992500782,
0.12957677245140076,
0.23259936273097992,
-0.09927694499492645,
0.10179659724235535,
0.0807129293680191,
-0.0767718255519867,
-0.15817712247371674,
0.06094227358698845,
0.12535780668258667,
0.004233983810991049,
0.08433353155851364,
-0.19922232627868652,
0.13440342247486115,
0.10743245482444763,
-0.013511271215975285,
0.02096935361623764,
-0.270698606967926,
-0.13177715241909027,
0.06618624180555344,
0.11017318814992905,
0.05146479979157448,
-0.12323398888111115,
-0.03531815856695175,
-0.010785169899463654,
-0.12213174253702164,
0.1274871528148651,
-0.07642774283885956,
0.11697868257761002,
-0.021649204194545746,
0.12154102325439453,
0.02431141398847103,
-0.03740308806300163,
0.11258979886770248,
0.07257147878408432,
0.08579985797405243,
-0.039925526827573776,
-0.0029625960160046816,
0.06466495245695114,
-0.06269840151071548,
0.036745935678482056,
-0.036871492862701416,
0.06273798644542694,
-0.1498524695634842,
0.006686953827738762,
-0.07752517610788345,
0.06053423881530762,
-0.046496860682964325,
-0.06529578566551208,
-0.027026722207665443,
0.046472400426864624,
0.07263320684432983,
-0.03576034680008888,
0.04592971131205559,
0.008899588137865067,
0.09151219576597214,
0.1020217314362526,
0.07196135073900223,
-0.025377606973052025,
-0.08295638859272003,
0.01348577719181776,
0.004129427019506693,
0.047157809138298035,
-0.08462870121002197,
0.015595432370901108,
0.1463545709848404,
0.060296230018138885,
0.10256565362215042,
0.04564403370022774,
-0.04339597746729851,
0.005875890608876944,
0.016616955399513245,
-0.142181858420372,
-0.09957714378833771,
0.02786666713654995,
-0.05837983265519142,
-0.15394863486289978,
0.03267967328429222,
0.12314363569021225,
-0.038007911294698715,
-0.01599428430199623,
-0.006968461908400059,
0.00827631726861,
-0.011286984197795391,
0.18475568294525146,
0.04264337569475174,
0.05478941649198532,
-0.09122677892446518,
0.11311209201812744,
0.03655043989419937,
-0.04130511358380318,
0.054599858820438385,
0.06785238534212112,
-0.09941793233156204,
0.012868151068687439,
0.07333500683307648,
0.14992555975914001,
-0.06671956926584244,
-0.013374908827245235,
-0.0919203907251358,
-0.0757000744342804,
0.0441652275621891,
0.1435656100511551,
0.05350428819656372,
-0.005579205695539713,
-0.06033182889223099,
0.03533780202269554,
-0.11875826865434647,
0.06752332299947739,
0.05208532512187958,
0.08265510201454163,
-0.10872504115104675,
0.12475791573524475,
-0.006676832213997841,
0.023902347311377525,
-0.028002437204122543,
0.018441511318087578,
-0.10082191973924637,
-0.03437873348593712,
-0.1086505576968193,
-0.014221441000699997,
-0.017814604565501213,
-0.0030693900771439075,
-0.019958456978201866,
-0.07498955726623535,
-0.04264984279870987,
0.03316492214798927,
-0.07642124593257904,
-0.04876630753278732,
0.01801559142768383,
0.039679691195487976,
-0.1604156643152237,
0.002602939959615469,
0.025606559589505196,
-0.08743385225534439,
0.0876106545329094,
0.06862358003854752,
0.015728957951068878,
0.02815340645611286,
-0.12420576810836792,
-0.033140793442726135,
0.00066627241903916,
0.010770997032523155,
0.0773136243224144,
-0.0921817421913147,
-0.029073769226670265,
-0.030474117025732994,
0.04903598502278328,
0.014960048720240593,
0.10211271047592163,
-0.11883927881717682,
-0.013773921877145767,
-0.046794790774583817,
-0.038462597876787186,
-0.05744193121790886,
0.02669113129377365,
0.11446379125118256,
0.044485192745923996,
0.15741442143917084,
-0.06998547166585922,
0.05496079474687576,
-0.2049826979637146,
-0.03306929022073746,
0.010987723246216774,
-0.04655025154352188,
-0.07355372607707977,
-0.045277442783117294,
0.08387618511915207,
-0.05011129751801491,
0.12287533283233643,
-0.016172245144844055,
0.09295495599508286,
0.043426018208265305,
-0.004229375161230564,
-0.07156679034233093,
-0.012007688172161579,
0.18375690281391144,
0.05810698866844177,
-0.021199466660618782,
0.12071645259857178,
0.00433950312435627,
0.042960457503795624,
0.06744363158941269,
0.2321886271238327,
0.15151020884513855,
-0.012874910607933998,
0.07492109388113022,
0.06714549660682678,
-0.0752476304769516,
-0.1401236355304718,
0.12159740179777145,
-0.020483186468482018,
0.10602232068777084,
-0.0530410073697567,
0.1894761323928833,
0.03848165646195412,
-0.1758425235748291,
0.05408987030386925,
-0.024994296953082085,
-0.10826270282268524,
-0.12518931925296783,
-0.015024777501821518,
-0.08191123604774475,
-0.11587663739919662,
0.027992412447929382,
-0.12329567223787308,
0.06791272759437561,
0.0963556095957756,
0.007033678237348795,
0.03519127890467644,
0.18350152671337128,
-0.05651002749800682,
0.012086720205843449,
0.07149581611156464,
0.020448219031095505,
-0.003550863591954112,
-0.04043424502015114,
-0.06668893247842789,
0.03765368461608887,
0.04352155327796936,
0.07152799516916275,
-0.050979554653167725,
0.010707307606935501,
0.015141531825065613,
-0.009636358357965946,
-0.07831219583749771,
0.007872811518609524,
0.01438892725855112,
0.04832826554775238,
0.03574605658650398,
0.04752589389681816,
0.008376035839319229,
-0.053748056292533875,
0.2744973301887512,
-0.06720882654190063,
-0.0621747225522995,
-0.12290892004966736,
0.1932094544172287,
0.03376134857535362,
-0.018745796754956245,
0.05546993389725685,
-0.09288733452558517,
-0.01129121333360672,
0.1618739515542984,
0.13404172658920288,
-0.0906316339969635,
-0.021021658554673195,
-0.02334371581673622,
-0.008824142627418041,
-0.01338718831539154,
0.10494251549243927,
0.07149655371904373,
0.0008121988503262401,
-0.06617716699838638,
-0.012933852151036263,
-0.02949635311961174,
-0.04759383201599121,
-0.06383403390645981,
0.05837661772966385,
0.027823472395539284,
-0.005816074088215828,
-0.05787239223718643,
0.06309860199689865,
-0.0031133925076574087,
-0.23578733205795288,
0.03797318786382675,
-0.17316262423992157,
-0.17363452911376953,
-0.013871260918676853,
0.07049989700317383,
0.0021076181437820196,
0.056238431483507156,
-0.006466049700975418,
0.010177153162658215,
0.11570871621370316,
-0.016897499561309814,
-0.014452215284109116,
-0.11722814291715622,
0.1085842177271843,
-0.1083015501499176,
0.21240799129009247,
-0.0015900768339633942,
0.06512191891670227,
0.09908886253833771,
0.0375124029815197,
-0.13523916900157928,
0.01894933544099331,
0.06153193116188049,
-0.1256890445947647,
0.0017884995322674513,
0.14497733116149902,
-0.03440570831298828,
0.06110619008541107,
0.03123955987393856,
-0.14942526817321777,
-0.0034573283046483994,
0.026849906891584396,
-0.036690790206193924,
-0.06923389434814453,
-0.010001523420214653,
-0.05561031028628349,
0.1660299450159073,
0.20677940547466278,
-0.029016844928264618,
0.011630896478891373,
-0.08483988791704178,
0.021473469212651253,
0.04857071489095688,
0.05830385908484459,
-0.03937371447682381,
-0.2159700095653534,
0.022419476881623268,
0.07233622670173645,
-0.002503053518012166,
-0.19506029784679413,
-0.09609966725111008,
0.04220804199576378,
-0.03712766245007515,
-0.046137843281030655,
0.09112869203090668,
0.02512785978615284,
0.03722890466451645,
-0.01926391012966633,
-0.11617231369018555,
-0.027886787429451942,
0.14598138630390167,
-0.1762675940990448,
-0.042751703411340714
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
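A hedged `TrainingArguments` sketch of the list above; `training_steps` is assumed to correspond to `max_steps`, and `output_dir` is a hypothetical placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0",  # hypothetical
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,  # assumed mapping for "training_steps: 200"
)
```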
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
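## How to use

A minimal inference sketch at the model/tokenizer level, assuming the standard `transformers` auto classes; the question and context strings are illustrative only:

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What was the model fine-tuned on?"  # illustrative
context = "The model was fine-tuned on the SQuAD dataset."
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Decode the most likely answer span from the start/end logits.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```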
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09758219867944717,
0.10821618139743805,
-0.002388017950579524,
0.09534620493650436,
0.12083656340837479,
0.015378996729850769,
0.09843062609434128,
0.13116255402565002,
-0.10562213510274887,
0.06923670321702957,
0.08847587555646896,
0.034827135503292084,
0.04414544627070427,
0.14951978623867035,
-0.007591812871396542,
-0.27283063530921936,
0.0003327816375531256,
-0.00039440393447875977,
-0.04240413382649422,
0.1208961009979248,
0.08812495321035385,
-0.11064637452363968,
0.0777917355298996,
0.01037674117833376,
-0.1536208838224411,
0.01859124004840851,
-0.03164030238986015,
-0.037242766469717026,
0.12240584194660187,
-0.03612391650676727,
0.10709116607904434,
0.029986141249537468,
0.13542447984218597,
-0.20859743654727936,
0.0067203473299741745,
0.07758653163909912,
0.05253630131483078,
0.10033750534057617,
0.04668427258729935,
0.00936218537390232,
0.08942350745201111,
-0.14871586859226227,
0.09295710921287537,
0.030236447229981422,
-0.09195791184902191,
-0.15362964570522308,
-0.09241383522748947,
0.03115967847406864,
0.048224374651908875,
0.07241727411746979,
0.00256110611371696,
0.14818236231803894,
-0.06320550292730331,
0.08388303220272064,
0.2645421624183655,
-0.3212214410305023,
-0.067087821662426,
0.029009142890572548,
0.05626509711146355,
0.05963699519634247,
-0.11985224485397339,
-0.002809792524203658,
0.01954205520451069,
0.02805420011281967,
0.1272263079881668,
-0.01578284613788128,
-0.10785438865423203,
-0.009537527337670326,
-0.12398356199264526,
-0.0025474061258137226,
0.05909619852900505,
0.025682920590043068,
-0.05274220183491707,
-0.10714352875947952,
-0.06776148080825806,
-0.08537375926971436,
-0.02203672006726265,
-0.05456779524683952,
0.051197126507759094,
-0.05503080040216446,
-0.09808491170406342,
-0.039263661950826645,
-0.05751848593354225,
-0.07966219633817673,
-0.009265866130590439,
0.16743707656860352,
0.03564560040831566,
0.020158778876066208,
-0.031103456392884254,
0.11833414435386658,
0.022133950144052505,
-0.14047692716121674,
-0.00985109992325306,
-0.004559408873319626,
-0.09251529723405838,
-0.04051556810736656,
-0.052592817693948746,
-0.0130785396322608,
0.005850983317941427,
0.16802625358104706,
-0.08200955390930176,
0.07450489699840546,
0.013072602450847626,
-0.023658512160182,
-0.013022640720009804,
0.1527220755815506,
-0.04006973281502724,
-0.0379696786403656,
-0.016557304188609123,
0.0799078419804573,
0.004026374313980341,
-0.01993231475353241,
-0.06588918715715408,
-0.027485961094498634,
0.09405140578746796,
0.055593252182006836,
-0.061347074806690216,
0.039648838341236115,
-0.029897507280111313,
-0.026223452761769295,
0.018210412934422493,
-0.11902918666601181,
0.041494496166706085,
-0.0029873966705054045,
-0.08164861053228378,
-0.006909526418894529,
-0.0013642851263284683,
-0.009284492582082748,
-0.010717793367803097,
0.10022132843732834,
-0.0876181349158287,
-0.0007728585042059422,
-0.06958853453397751,
-0.08085144311189651,
-0.0005423931288532913,
-0.15473665297031403,
-0.015157459303736687,
-0.059860847890377045,
-0.16394005715847015,
-0.03338237479329109,
0.04413069784641266,
-0.07521285861730576,
-0.011428351514041424,
-0.04350588843226433,
-0.06261442601680756,
0.01776430569589138,
-0.013188617303967476,
0.1909642070531845,
-0.053049977868795395,
0.08180827647447586,
-0.008253796957433224,
0.05019902437925339,
0.02704150788486004,
0.03578978776931763,
-0.10410767048597336,
0.028123777359724045,
-0.1403024047613144,
0.07698586583137512,
-0.08499991148710251,
-0.004913673270493746,
-0.13671068847179413,
-0.10203137248754501,
0.01450824923813343,
-0.020342012867331505,
0.09438876807689667,
0.1338587999343872,
-0.19891387224197388,
-0.020256362855434418,
0.12567923963069916,
-0.07557309418916702,
-0.052458565682172775,
0.060605503618717194,
-0.06149714067578316,
0.039675477892160416,
0.05138295143842697,
0.21239694952964783,
0.05383900925517082,
-0.15755176544189453,
-0.010649214498698711,
0.00488569401204586,
0.04474763572216034,
0.027866117656230927,
0.03746788948774338,
0.0021801518741995096,
0.058112822473049164,
0.01571401208639145,
-0.08956973999738693,
-0.024587851017713547,
-0.09020864218473434,
-0.0661546066403389,
-0.05077080801129341,
-0.07488047331571579,
0.05428772792220116,
0.00800466537475586,
0.04088165983557701,
-0.06527337431907654,
-0.104131780564785,
0.11322564631700516,
0.09490818530321121,
-0.05173078551888466,
0.03917568549513817,
-0.08064775913953781,
0.013551129028201103,
-0.0034181231167167425,
-0.03553272783756256,
-0.2113080471754074,
-0.11618861556053162,
0.050897225737571716,
-0.04567993804812431,
0.025434846058487892,
0.0008331835269927979,
0.08389885723590851,
0.05599695071578026,
-0.05093934014439583,
-0.014820082113146782,
-0.09715709835290909,
0.002031163312494755,
-0.11376775056123734,
-0.19101718068122864,
-0.08431976288557053,
-0.04223514720797539,
0.09278519451618195,
-0.16682405769824982,
-0.005232529249042273,
0.022474635392427444,
0.1385059505701065,
0.028485935181379318,
-0.06787613779306412,
0.0007535105687566102,
0.04713066667318344,
0.012739693745970726,
-0.09665219485759735,
0.05441415309906006,
0.013189231976866722,
-0.10394404083490372,
-0.0465426966547966,
-0.13239260017871857,
-0.01791536808013916,
0.05498010292649269,
0.05909787490963936,
-0.10374962538480759,
-0.05910874903202057,
-0.07303626090288162,
-0.03740842640399933,
-0.07879600673913956,
0.0229659266769886,
0.21261464059352875,
0.03990853950381279,
0.11342844367027283,
-0.06554478406906128,
-0.08403654396533966,
-0.007372202817350626,
0.025762498378753662,
0.02203458361327648,
0.08610179275274277,
0.023721501231193542,
-0.03762134909629822,
0.06909209489822388,
0.10185027867555618,
-0.026409290730953217,
0.13280270993709564,
-0.05544070899486542,
-0.08093535155057907,
-0.031227780506014824,
-0.022142458707094193,
-0.02481485903263092,
0.12977293133735657,
-0.027914689853787422,
0.0009592402493581176,
0.035112302750349045,
0.03904775157570839,
0.011039807461202145,
-0.16881729662418365,
0.0024406833108514547,
0.028117196634411812,
-0.05555659532546997,
-0.0403178371489048,
-0.005557413678616285,
0.022036051377654076,
0.08914025872945786,
0.030587848275899887,
-0.006594200152903795,
0.008232559077441692,
-0.01154331211000681,
-0.05851596221327782,
0.1880473792552948,
-0.09717382490634918,
-0.08160897344350815,
-0.07155643403530121,
0.02117142826318741,
-0.051317885518074036,
-0.03837243840098381,
0.008849333971738815,
-0.09304852038621902,
-0.030195822939276695,
-0.0888613685965538,
-0.02622503787279129,
-0.01880491152405739,
0.01990230195224285,
0.023443622514605522,
-0.01705390028655529,
0.08542934060096741,
-0.13819928467273712,
0.004726677667349577,
-0.04840265214443207,
-0.09413855522871017,
0.008563726209104061,
0.07635058462619781,
0.09202983975410461,
0.0820910707116127,
-0.019151223823428154,
0.02808571234345436,
-0.039046645164489746,
0.23277252912521362,
-0.053749267011880875,
0.012462303973734379,
0.11452530324459076,
-0.010921229608356953,
0.05512373149394989,
0.0923275426030159,
0.03892543166875839,
-0.09052121639251709,
0.02410721592605114,
0.07785248011350632,
-0.03798171505331993,
-0.22560113668441772,
-0.01885944791138172,
-0.001575630740262568,
-0.07575222849845886,
0.1070035994052887,
0.03240464627742767,
-0.048425909131765366,
0.04342622309923172,
0.02245449274778366,
-0.009901014156639576,
-0.04593590274453163,
0.07491966336965561,
0.07159345597028732,
0.052100181579589844,
0.10580801218748093,
-0.005597590934485197,
-0.025792686268687248,
0.056323111057281494,
0.01592138595879078,
0.25262659788131714,
-0.04398757591843605,
0.10263166576623917,
0.03248166665434837,
0.1539078652858734,
-0.01967962272465229,
0.06570340692996979,
0.0006447845953516662,
-0.009738380089402199,
-0.010909166187047958,
-0.06626518070697784,
-0.025209255516529083,
0.016912607476115227,
-0.04671163484454155,
0.024714849889278412,
-0.0766887217760086,
0.023936372250318527,
0.027965296059846878,
0.2919485867023468,
0.027503041550517082,
-0.2596564292907715,
-0.07380881160497665,
-0.016086848452687263,
-0.04457888752222061,
-0.06160932406783104,
0.008066610433161259,
0.13406936824321747,
-0.1399020552635193,
0.053109414875507355,
-0.07854057103395462,
0.08859807252883911,
-0.04606757313013077,
0.012556822970509529,
0.047267403453588486,
0.15051253139972687,
-0.019276512786746025,
0.05090964958071709,
-0.19969598948955536,
0.2527761459350586,
0.02032661810517311,
0.10649437457323074,
-0.06493477523326874,
0.010762924328446388,
0.020010290667414665,
0.01050761342048645,
0.11087825149297714,
0.0029175393283367157,
-0.06485205143690109,
-0.14681021869182587,
-0.09041287004947662,
0.04534618929028511,
0.14101597666740417,
-0.038957368582487106,
0.08936174213886261,
-0.029317021369934082,
0.012158006429672241,
0.029942231252789497,
-0.04081818088889122,
-0.1517394334077835,
-0.07819594442844391,
0.0013031436828896403,
0.015187901444733143,
-0.005839366931468248,
-0.06143244728446007,
-0.10496987402439117,
-0.019354023039340973,
0.11166089028120041,
-0.002037663012742996,
-0.05755091458559036,
-0.1545543223619461,
0.08170101791620255,
0.1416349858045578,
-0.05476130172610283,
0.012345079332590103,
0.016145426779985428,
0.11474654823541641,
0.02923784963786602,
-0.08068165928125381,
0.06474000960588455,
-0.05649976804852486,
-0.18233579397201538,
-0.056557729840278625,
0.1216224804520607,
0.08288650959730148,
0.04939556121826172,
-0.0016659711254760623,
0.0547189936041832,
0.002028547925874591,
-0.0952516496181488,
0.039267588406801224,
0.0026221300940960646,
0.04345860332250595,
0.017165979370474815,
-0.08292902261018753,
0.09529374539852142,
-0.037789974361658096,
0.009318591095507145,
0.1280336230993271,
0.21384942531585693,
-0.10756698250770569,
0.1153179407119751,
0.08578582108020782,
-0.07304339110851288,
-0.1663302630186081,
0.0612851157784462,
0.129813551902771,
0.00864990521222353,
0.08500105142593384,
-0.21198709309101105,
0.12319660931825638,
0.10352862626314163,
-0.01303863525390625,
0.008185532875359058,
-0.27745211124420166,
-0.13194257020950317,
0.05882779136300087,
0.11172308027744293,
0.040072426199913025,
-0.11511173099279404,
-0.03307410702109337,
-0.007631596177816391,
-0.1015877053141594,
0.11546216160058975,
-0.07230646908283234,
0.11411989480257034,
-0.01958094723522663,
0.1170782744884491,
0.02635098434984684,
-0.03442670404911041,
0.10950717329978943,
0.061155904084444046,
0.08777565509080887,
-0.037529509514570236,
0.006400334648787975,
0.05890640988945961,
-0.05939535051584244,
0.02671736478805542,
-0.042466435581445694,
0.0672287791967392,
-0.14716745913028717,
0.006496034096926451,
-0.08774428069591522,
0.05381002649664879,
-0.046395041048526764,
-0.0722256749868393,
-0.018398605287075043,
0.053668197244405746,
0.0702815055847168,
-0.040981993079185486,
0.03236657753586769,
-0.0029185267630964518,
0.10043848305940628,
0.10480953007936478,
0.0808466300368309,
-0.02572076953947544,
-0.08702743798494339,
0.014106747694313526,
0.003068821504712105,
0.05537698045372963,
-0.09607990086078644,
0.01440021488815546,
0.14177130162715912,
0.06526903808116913,
0.09596512466669083,
0.04592062532901764,
-0.043057847768068314,
0.004617628641426563,
0.014449037611484528,
-0.13368289172649384,
-0.10263244062662125,
0.02547510713338852,
-0.04193542152643204,
-0.15079239010810852,
0.02727973647415638,
0.12022838741540909,
-0.039690710604190826,
-0.021015590056777,
-0.004762652795761824,
0.004128391854465008,
-0.012478543445467949,
0.18303678929805756,
0.04515032097697258,
0.06264358013868332,
-0.09056556969881058,
0.11188063025474548,
0.03457454591989517,
-0.05069243162870407,
0.05401197448372841,
0.06680338829755783,
-0.10404685884714127,
0.011071798391640186,
0.08073540031909943,
0.13157998025417328,
-0.05560322105884552,
-0.011836132034659386,
-0.09578109532594681,
-0.084470234811306,
0.04170965775847435,
0.13773144781589508,
0.05497238412499428,
-0.0009514560806564987,
-0.0650935173034668,
0.03615454584360123,
-0.11957430839538574,
0.0689283013343811,
0.04899238049983978,
0.07542875409126282,
-0.10003236681222916,
0.1344919055700302,
-0.0019289812771603465,
0.025703687220811844,
-0.026554664596915245,
0.013101940043270588,
-0.10023251175880432,
-0.023679208010435104,
-0.10754699259996414,
-0.02352883853018284,
-0.011784292757511139,
-0.001389893819577992,
-0.022314704954624176,
-0.06959071010351181,
-0.028778165578842163,
0.03873276710510254,
-0.07907669991254807,
-0.047872330993413925,
0.01666501723229885,
0.03610331192612648,
-0.15498395264148712,
0.003515630727633834,
0.026009220629930496,
-0.08975712954998016,
0.08984586596488953,
0.06281289458274841,
0.01154936384409666,
0.024327579885721207,
-0.12056417763233185,
-0.0302329920232296,
-0.010478274896740913,
0.004352976568043232,
0.06903770565986633,
-0.09470479935407639,
-0.028363512828946114,
-0.03614937886595726,
0.04017737880349159,
0.0201340913772583,
0.1036229133605957,
-0.12070731073617935,
-0.002780322916805744,
-0.03572182357311249,
-0.03931373730301857,
-0.06490818411111832,
0.03636818379163742,
0.10695309937000275,
0.052803218364715576,
0.15152528882026672,
-0.07749210298061371,
0.055322691798210144,
-0.19865505397319794,
-0.03773228079080582,
0.012534555979073048,
-0.04761897400021553,
-0.0814046561717987,
-0.04712722823023796,
0.08859068900346756,
-0.047382958233356476,
0.11338584125041962,
-0.012431549839675426,
0.10112922638654709,
0.044737059623003006,
-0.012070054188370705,
-0.06533826142549515,
-0.006540624424815178,
0.18279007077217102,
0.05254780128598213,
-0.01781366392970085,
0.12634456157684326,
0.003290775464847684,
0.029035534709692,
0.08614029735326767,
0.2214576154947281,
0.154158353805542,
0.001578219118528068,
0.062161583453416824,
0.05988994985818863,
-0.06799402087926865,
-0.14868299663066864,
0.11962292343378067,
-0.01955171301960945,
0.10667166858911514,
-0.06619603931903839,
0.18942949175834656,
0.038606949150562286,
-0.18049179017543793,
0.06456339359283447,
-0.024987423792481422,
-0.10815376788377762,
-0.11952147632837296,
-0.027826538309454918,
-0.07062599062919617,
-0.12203259021043777,
0.024783866479992867,
-0.11708768457174301,
0.06248056888580322,
0.10438939183950424,
0.008376115933060646,
0.03852256014943123,
0.18276788294315338,
-0.04836170747876167,
0.012097309343516827,
0.08420031517744064,
0.01933716982603073,
0.002506035380065441,
-0.04106539115309715,
-0.0652151107788086,
0.03526536375284195,
0.03458655625581741,
0.06404303014278412,
-0.047257788479328156,
0.005611875560134649,
0.004599236883223057,
-0.008573560044169426,
-0.07653981447219849,
0.01152476854622364,
0.009089285507798195,
0.05192098021507263,
0.04712105914950371,
0.04798876494169235,
0.006712110713124275,
-0.05519988760352135,
0.29304370284080505,
-0.06941442936658859,
-0.07053632289171219,
-0.12920427322387695,
0.21793220937252045,
0.024392105638980865,
-0.025627240538597107,
0.05508260428905487,
-0.08716466277837753,
-0.015118775889277458,
0.16424164175987244,
0.13872388005256653,
-0.08989057689905167,
-0.01527425181120634,
-0.02410201169550419,
-0.010655703023076057,
-0.014230337925255299,
0.11459051817655563,
0.07488038390874863,
-0.01105409860610962,
-0.06972384452819824,
-0.01188662089407444,
-0.028398914262652397,
-0.055508289486169815,
-0.06857871264219284,
0.06957364082336426,
0.027344858273863792,
-0.00953882746398449,
-0.06574201583862305,
0.06655094027519226,
-0.0016604071715846658,
-0.2345096915960312,
0.042695172131061554,
-0.17482244968414307,
-0.1693745255470276,
-0.019114971160888672,
0.07142329216003418,
0.004156062845140696,
0.0560569129884243,
0.0022590558510273695,
0.023459656164050102,
0.11463809758424759,
-0.013500163331627846,
-0.0024006126914173365,
-0.11519146710634232,
0.11473379284143448,
-0.10100691020488739,
0.20080062747001648,
-0.007044518366456032,
0.05698377266526222,
0.09652748703956604,
0.03865114971995354,
-0.13655999302864075,
0.02276177518069744,
0.06378015875816345,
-0.12778347730636597,
-0.00513852946460247,
0.14803116023540497,
-0.03046226128935814,
0.0644594132900238,
0.02740400657057762,
-0.14619103074073792,
0.0029799730982631445,
0.016886573284864426,
-0.03599300980567932,
-0.06908047944307327,
-0.008545693941414356,
-0.04991793632507324,
0.16460497677326202,
0.21570399403572083,
-0.029214460402727127,
0.007711565122008324,
-0.09116244316101074,
0.012219537980854511,
0.04650931805372238,
0.05046838894486427,
-0.04125909134745598,
-0.20568452775478363,
0.01602616347372532,
0.07274090498685837,
-0.005902348551899195,
-0.19573253393173218,
-0.09641789644956589,
0.047305088490247726,
-0.03977055475115776,
-0.04231587052345276,
0.09277508407831192,
0.02068987488746643,
0.040981028228998184,
-0.013540824875235558,
-0.11382856965065002,
-0.021044814959168434,
0.14000773429870605,
-0.17461369931697845,
-0.032828282564878464
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
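A hedged `TrainingArguments` sketch of the list above, with the optimizer settings spelled out explicitly; `training_steps` is assumed to map to `max_steps`, and `output_dir` is a hypothetical placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10",  # hypothetical
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,  # assumed mapping for "training_steps: 200"
)
```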
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
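## How to use

A minimal usage sketch, assuming the standard `transformers` question-answering pipeline; the example strings are illustrative only:

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10",
)
# Illustrative query; any SQuAD-style (question, context) pair works.
print(qa(question="Where is the Eiffel Tower?", context="The Eiffel Tower is in Paris."))
```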
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09835928678512573,
0.10792353004217148,
-0.0023162623401731253,
0.09484295547008514,
0.12065975368022919,
0.015437674708664417,
0.09889842569828033,
0.13006944954395294,
-0.10475992411375046,
0.06792107969522476,
0.08730541169643402,
0.0356246717274189,
0.044633835554122925,
0.15093976259231567,
-0.006647657137364149,
-0.27378273010253906,
0.0001718653365969658,
0.00040029032970778644,
-0.04077345132827759,
0.12067890912294388,
0.08856835216283798,
-0.11075174808502197,
0.07697572559118271,
0.010379776358604431,
-0.15302467346191406,
0.019030321389436722,
-0.03199479728937149,
-0.03696335852146149,
0.12222494184970856,
-0.037727393209934235,
0.1066361740231514,
0.029724806547164917,
0.13755279779434204,
-0.20757059752941132,
0.00669450219720602,
0.07662007957696915,
0.05241841822862625,
0.09998258203268051,
0.04618920758366585,
0.010274280793964863,
0.08887901902198792,
-0.1495993435382843,
0.09376580268144608,
0.02930382825434208,
-0.09148070216178894,
-0.1514587700366974,
-0.09230643510818481,
0.0319691076874733,
0.05060139670968056,
0.07186470180749893,
0.0021014511585235596,
0.14822454750537872,
-0.06400066614151001,
0.08383265137672424,
0.26402196288108826,
-0.3220028579235077,
-0.06696359813213348,
0.030921516939997673,
0.058065030723810196,
0.06006632745265961,
-0.12019049376249313,
-0.003720856038853526,
0.020080769434571266,
0.027333175763487816,
0.12831580638885498,
-0.016347160562872887,
-0.10755365341901779,
-0.00974874384701252,
-0.12495765089988708,
-0.0015704118413850665,
0.05948881059885025,
0.02657073549926281,
-0.05271925404667854,
-0.10853414982557297,
-0.06754890084266663,
-0.08601409941911697,
-0.023144930601119995,
-0.055848538875579834,
0.05112648010253906,
-0.05564085394144058,
-0.09747249633073807,
-0.03947439044713974,
-0.05688325688242912,
-0.08017533272504807,
-0.007454846519976854,
0.1655576080083847,
0.03602893650531769,
0.019242633134126663,
-0.03059653379023075,
0.11800841242074966,
0.01966143772006035,
-0.14008387923240662,
-0.008879894390702248,
-0.004705870524048805,
-0.09408590197563171,
-0.04171975329518318,
-0.05283632129430771,
-0.012514876201748848,
0.004432625137269497,
0.16868406534194946,
-0.07901632785797119,
0.07452470809221268,
0.015022233128547668,
-0.024547915905714035,
-0.012570671737194061,
0.15268388390541077,
-0.04096166417002678,
-0.04034413397312164,
-0.01688326895236969,
0.08091077953577042,
0.0035992891062051058,
-0.0185812059789896,
-0.06675712764263153,
-0.02802339382469654,
0.09434111416339874,
0.055323276668787,
-0.06291671842336655,
0.039714619517326355,
-0.028859436511993408,
-0.026127221062779427,
0.019527463242411613,
-0.11966001242399216,
0.04179989919066429,
-0.0038321572355926037,
-0.08285336196422577,
-0.008026489987969398,
-0.0027432390488684177,
-0.008193759247660637,
-0.010780648328363895,
0.09891815483570099,
-0.0874139815568924,
-0.0011229265946894884,
-0.07005704939365387,
-0.08147893846035004,
-0.00036285078385844827,
-0.15811076760292053,
-0.013827156275510788,
-0.05878543108701706,
-0.16758398711681366,
-0.03413185104727745,
0.043105192482471466,
-0.07440666854381561,
-0.012122494168579578,
-0.044695962220430374,
-0.06236698105931282,
0.01598413474857807,
-0.012483501806855202,
0.19184857606887817,
-0.052022866904735565,
0.0806146189570427,
-0.00829494558274746,
0.05108599737286568,
0.027297120541334152,
0.036142490804195404,
-0.10369107872247696,
0.027861300855875015,
-0.13937610387802124,
0.07709986716508865,
-0.08527781069278717,
-0.0030822318512946367,
-0.13631509244441986,
-0.10223608464002609,
0.012712633237242699,
-0.020205089822411537,
0.0940595492720604,
0.13391831517219543,
-0.19989080727100372,
-0.01928146742284298,
0.12696850299835205,
-0.07507973164319992,
-0.051714442670345306,
0.05944732949137688,
-0.06189921870827675,
0.04101424291729927,
0.05313759297132492,
0.21186763048171997,
0.0563330203294754,
-0.15671749413013458,
-0.011232016608119011,
0.005013377405703068,
0.044592343270778656,
0.025908848270773888,
0.03753824532032013,
0.0036809106823056936,
0.05937333405017853,
0.015891315415501595,
-0.08878696709871292,
-0.024731749668717384,
-0.08950576186180115,
-0.06654886156320572,
-0.050332777202129364,
-0.07585718482732773,
0.05469181016087532,
0.008505214937031269,
0.04127879813313484,
-0.06538962572813034,
-0.10398828983306885,
0.1141173467040062,
0.09561603516340256,
-0.05162227153778076,
0.03772302344441414,
-0.08058367669582367,
0.013020575046539307,
-0.004405106883496046,
-0.03566480800509453,
-0.21110853552818298,
-0.11436805129051208,
0.05122557282447815,
-0.04608100652694702,
0.02537054941058159,
0.0030298817437142134,
0.0846172347664833,
0.055632106959819794,
-0.050701744854450226,
-0.015285846777260303,
-0.09715546667575836,
0.0019557327032089233,
-0.11481177061796188,
-0.1894952356815338,
-0.08530276268720627,
-0.04276033118367195,
0.09358853846788406,
-0.16824953258037567,
-0.004320905543863773,
0.020419679582118988,
0.13807755708694458,
0.027644701302051544,
-0.06802099198102951,
0.0017070281319320202,
0.04618200287222862,
0.013571321032941341,
-0.09701111912727356,
0.05412441864609718,
0.01202410738915205,
-0.10354349762201309,
-0.048359084874391556,
-0.13352124392986298,
-0.01950678788125515,
0.054282933473587036,
0.06155601516366005,
-0.10338230431079865,
-0.059501972049474716,
-0.07290558516979218,
-0.03682783618569374,
-0.07767820358276367,
0.02237740159034729,
0.21163438260555267,
0.038916222751140594,
0.11282292008399963,
-0.06561209261417389,
-0.08510365337133408,
-0.007476752623915672,
0.02733643539249897,
0.02268986962735653,
0.08569289743900299,
0.023667801171541214,
-0.037746235728263855,
0.06851476430892944,
0.10279234498739243,
-0.025868745520710945,
0.13213704526424408,
-0.055615395307540894,
-0.08172766864299774,
-0.030902137979865074,
-0.02266627922654152,
-0.026020299643278122,
0.12946142256259918,
-0.02834388054907322,
-0.0007285158499144018,
0.03466884419322014,
0.03774430602788925,
0.011077268049120903,
-0.16907161474227905,
0.0027472227811813354,
0.02815936878323555,
-0.054616544395685196,
-0.041818827390670776,
-0.006574534811079502,
0.020854612812399864,
0.08850209414958954,
0.029911788180470467,
-0.0072404928505420685,
0.007498207502067089,
-0.011250695213675499,
-0.05809367448091507,
0.18805518746376038,
-0.09599343687295914,
-0.08034069091081619,
-0.0708460658788681,
0.021857621148228645,
-0.049389902502298355,
-0.0384816899895668,
0.0076784840784966946,
-0.09312071651220322,
-0.029695499688386917,
-0.08838995546102524,
-0.027145475149154663,
-0.018756020814180374,
0.019244488328695297,
0.025078732520341873,
-0.016408847644925117,
0.08360111713409424,
-0.1383485198020935,
0.0054184147156775,
-0.04897082597017288,
-0.09376931935548782,
0.00782842468470335,
0.07533443719148636,
0.09224133938550949,
0.08259353041648865,
-0.019852206110954285,
0.028263667598366737,
-0.039677008986473083,
0.23208938539028168,
-0.0542224682867527,
0.01393378246575594,
0.11416422575712204,
-0.01045264769345522,
0.054776329547166824,
0.09303206950426102,
0.03819854557514191,
-0.09019184857606888,
0.02369372546672821,
0.07715616375207901,
-0.03738543763756752,
-0.22627811133861542,
-0.01823812536895275,
-0.0008101295097731054,
-0.07711508125066757,
0.10770019143819809,
0.03192819654941559,
-0.046215321868658066,
0.045173898339271545,
0.02227785624563694,
-0.008423932828009129,
-0.044637300074100494,
0.07450345158576965,
0.06932806968688965,
0.051314208656549454,
0.10604364424943924,
-0.005878135096281767,
-0.026915326714515686,
0.054817698895931244,
0.016545623540878296,
0.2534182667732239,
-0.042839016765356064,
0.10234014689922333,
0.03143003210425377,
0.1531732678413391,
-0.020507387816905975,
0.0681253969669342,
0.0015795822255313396,
-0.010127073153853416,
-0.010860384441912174,
-0.06599710136651993,
-0.02381734922528267,
0.01744323968887329,
-0.04607263579964638,
0.024462061002850533,
-0.07599085569381714,
0.023544762283563614,
0.0272949431091547,
0.2905285060405731,
0.02908414788544178,
-0.26067307591438293,
-0.07312965393066406,
-0.01573067344725132,
-0.04563923180103302,
-0.06140543520450592,
0.007689709309488535,
0.13283738493919373,
-0.13963666558265686,
0.05407834053039551,
-0.07858749479055405,
0.08877962827682495,
-0.045163922011852264,
0.012645904906094074,
0.04604052007198334,
0.1503371149301529,
-0.01906413398683071,
0.052120696753263474,
-0.19974324107170105,
0.25147244334220886,
0.020337771624326706,
0.10781422257423401,
-0.0660984143614769,
0.01088585052639246,
0.02000446431338787,
0.010000060312449932,
0.11211007833480835,
0.0025661755353212357,
-0.06492460519075394,
-0.14727163314819336,
-0.09069394320249557,
0.045322321355342865,
0.14108839631080627,
-0.038561493158340454,
0.08996398001909256,
-0.028369057923555374,
0.011126665398478508,
0.03023540787398815,
-0.04230150952935219,
-0.15310454368591309,
-0.07790738344192505,
0.0008324240916408598,
0.01406865008175373,
-0.006065763998776674,
-0.06087801232933998,
-0.10506485402584076,
-0.02168760448694229,
0.11024099588394165,
-0.0006482607568614185,
-0.0576653778553009,
-0.15436844527721405,
0.08343770354986191,
0.14204435050487518,
-0.05432449281215668,
0.013048126362264156,
0.017621684819459915,
0.1155271902680397,
0.028776168823242188,
-0.08030601590871811,
0.06391538679599762,
-0.056489378213882446,
-0.18101514875888824,
-0.05555632710456848,
0.12284096330404282,
0.08337085694074631,
0.049639854580163956,
-0.00020286401559133083,
0.05403552204370499,
0.002023647539317608,
-0.0949726402759552,
0.03873712569475174,
0.0023794397711753845,
0.04302521049976349,
0.01739412546157837,
-0.08390620350837708,
0.09373998641967773,
-0.038344260305166245,
0.011443180963397026,
0.12814541161060333,
0.21059755980968475,
-0.10738801211118698,
0.11449327319860458,
0.08555088937282562,
-0.07347559928894043,
-0.1659078747034073,
0.06201706454157829,
0.12983889877796173,
0.00936591625213623,
0.08547631651163101,
-0.21159030497074127,
0.12345132976770401,
0.10299207270145416,
-0.012395452708005905,
0.006829099729657173,
-0.27735960483551025,
-0.13143180310726166,
0.06006501615047455,
0.11187879741191864,
0.03772628679871559,
-0.11435409635305405,
-0.03292645141482353,
-0.008216344751417637,
-0.10157883912324905,
0.11504637449979782,
-0.07342647761106491,
0.1132088378071785,
-0.01893819123506546,
0.1160382479429245,
0.026307860389351845,
-0.03451169654726982,
0.1078086644411087,
0.06295035034418106,
0.0879577249288559,
-0.03725312277674675,
0.005353465210646391,
0.06147162243723869,
-0.0590602345764637,
0.028636064380407333,
-0.04140712320804596,
0.06685525923967361,
-0.14704416692256927,
0.005828267894685268,
-0.08725640922784805,
0.053531624376773834,
-0.04604130983352661,
-0.072453573346138,
-0.017906401306390762,
0.05357223004102707,
0.0699448212981224,
-0.04109610989689827,
0.030220752581954002,
-0.001551128225401044,
0.09969723969697952,
0.10081100463867188,
0.08209896832704544,
-0.024611061438918114,
-0.08616134524345398,
0.013978350907564163,
0.003266592975705862,
0.05469679459929466,
-0.09681305289268494,
0.013765030540525913,
0.14181111752986908,
0.06575501710176468,
0.09621801972389221,
0.045308563858270645,
-0.043137580156326294,
0.003993981517851353,
0.013813748024404049,
-0.13119928538799286,
-0.10398764908313751,
0.025283316150307655,
-0.04302097484469414,
-0.15147876739501953,
0.028643693774938583,
0.11812474578619003,
-0.04056414216756821,
-0.02198377065360546,
-0.005885944701731205,
0.0037860076408833265,
-0.012091226875782013,
0.18447661399841309,
0.045950084924697876,
0.06277215480804443,
-0.0910530611872673,
0.11148064583539963,
0.03488292172551155,
-0.0509674958884716,
0.05370473116636276,
0.06649444997310638,
-0.10470273345708847,
0.010676724836230278,
0.08200513571500778,
0.13207709789276123,
-0.05255752056837082,
-0.012182656675577164,
-0.09559564292430878,
-0.08440956473350525,
0.04211447387933731,
0.1372605264186859,
0.055186398327350616,
-0.002324828412383795,
-0.0647040531039238,
0.036596544086933136,
-0.12012625485658646,
0.06894603371620178,
0.04916049912571907,
0.07530121505260468,
-0.10037510097026825,
0.1347048133611679,
-0.0018074268009513617,
0.026268823072314262,
-0.026528649032115936,
0.014046603813767433,
-0.09934541583061218,
-0.02414056845009327,
-0.10600534081459045,
-0.0249645859003067,
-0.013320988975465298,
-0.0012558340094983578,
-0.022986970841884613,
-0.06910271942615509,
-0.02855801209807396,
0.03830540552735329,
-0.0788538008928299,
-0.04802272096276283,
0.016198206692934036,
0.035261932760477066,
-0.15405721962451935,
0.0035611060447990894,
0.025195159018039703,
-0.08914694935083389,
0.08909621089696884,
0.06210758164525032,
0.01242805551737547,
0.025024928152561188,
-0.11981412768363953,
-0.029648292809724808,
-0.009444473311305046,
0.004687913693487644,
0.06858396530151367,
-0.09439489990472794,
-0.027861518785357475,
-0.035870715975761414,
0.0413234643638134,
0.019253697246313095,
0.1007230281829834,
-0.1196984052658081,
-0.003832286223769188,
-0.03702269867062569,
-0.03878618776798248,
-0.06526179611682892,
0.0369357168674469,
0.10644923150539398,
0.05116226524114609,
0.152048721909523,
-0.0756903737783432,
0.0552835538983345,
-0.19953155517578125,
-0.03828231245279312,
0.011742685921490192,
-0.04762320965528488,
-0.08111494034528732,
-0.04782726615667343,
0.08870091289281845,
-0.04690929874777794,
0.11444473266601562,
-0.012343158014118671,
0.10213380306959152,
0.04383806884288788,
-0.008824852295219898,
-0.06492538005113602,
-0.0058599538169801235,
0.18329955637454987,
0.0527443066239357,
-0.018398785963654518,
0.12547944486141205,
0.003921925090253353,
0.029375316575169563,
0.08458785712718964,
0.21880002319812775,
0.15426740050315857,
0.0005891293985769153,
0.06214208900928497,
0.06074952706694603,
-0.06828762590885162,
-0.14758189022541046,
0.12094938009977341,
-0.020258832722902298,
0.10478127747774124,
-0.06592585891485214,
0.1917218565940857,
0.03821727633476257,
-0.18045318126678467,
0.06519272923469543,
-0.02417933940887451,
-0.10904724150896072,
-0.11825292557477951,
-0.02958036959171295,
-0.07092990726232529,
-0.12087976932525635,
0.025076840072870255,
-0.11715944111347198,
0.060873113572597504,
0.10458233207464218,
0.00832052156329155,
0.03774356469511986,
0.18409216403961182,
-0.04901155084371567,
0.012386237271130085,
0.084316685795784,
0.018942946568131447,
0.0030728215351700783,
-0.04145047068595886,
-0.06419949233531952,
0.036488406360149384,
0.03384143486618996,
0.06480724364519119,
-0.049645740538835526,
0.005014256574213505,
0.004560536239296198,
-0.0077115073800086975,
-0.07594896107912064,
0.011682065203785896,
0.00872486550360918,
0.05192459747195244,
0.045978154987096786,
0.04816622659564018,
0.006136672105640173,
-0.05560772866010666,
0.2915363013744354,
-0.06940098106861115,
-0.07142622023820877,
-0.12947501242160797,
0.2151784896850586,
0.025533998385071754,
-0.025693193078041077,
0.05593612417578697,
-0.08757321536540985,
-0.012781357392668724,
0.1651466339826584,
0.13887439668178558,
-0.08745827525854111,
-0.015866437926888466,
-0.02371249347925186,
-0.011001436039805412,
-0.014916124753654003,
0.1139206811785698,
0.07525120675563812,
-0.013456689193844795,
-0.06950852274894714,
-0.011175221763551235,
-0.02681797742843628,
-0.056702591478824615,
-0.06842300295829773,
0.0693538710474968,
0.028359079733490944,
-0.009943090379238129,
-0.06381437927484512,
0.06822256743907928,
0.0006771089974790812,
-0.23440685868263245,
0.04136648774147034,
-0.1737697571516037,
-0.1695079505443573,
-0.019719716161489487,
0.07122985273599625,
0.005960987415164709,
0.05587929114699364,
0.0025113699957728386,
0.02379113994538784,
0.11398378014564514,
-0.012808069586753845,
-0.0031397384591400623,
-0.11534526199102402,
0.11490476876497269,
-0.10228893160820007,
0.19970044493675232,
-0.007249940186738968,
0.057978034019470215,
0.09636574238538742,
0.036985933780670166,
-0.13641029596328735,
0.023218058049678802,
0.06385267525911331,
-0.12569649517536163,
-0.00350762065500021,
0.1485542207956314,
-0.03025377355515957,
0.06206890568137169,
0.02645866572856903,
-0.1465371698141098,
0.003304606769233942,
0.017153898254036903,
-0.03546251356601715,
-0.06988465040922165,
-0.005950660910457373,
-0.04915790259838104,
0.16533106565475464,
0.2154480665922165,
-0.029622001573443413,
0.008231977932155132,
-0.09165309369564056,
0.011622940190136433,
0.04677950218319893,
0.05063079297542572,
-0.04106493294239044,
-0.20546045899391174,
0.014864430762827396,
0.07013104856014252,
-0.005303601734340191,
-0.1948273926973343,
-0.0953359454870224,
0.046250827610492706,
-0.041219741106033325,
-0.04249618202447891,
0.09191557765007019,
0.02240811102092266,
0.04093742370605469,
-0.013271705247461796,
-0.11429888755083084,
-0.02132057584822178,
0.14005638659000397,
-0.17564378678798676,
-0.03195848688483238
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
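The card leaves this section empty; judging only from the model name, training presumably used k=128 examples drawn from the SQuAD training set with the seed in the name. A purely hypothetical sketch of one plausible seeded subsample (the actual sampling procedure is not documented anywhere in this card):
```python
# Hypothetical: the card does not document how the 128-example subset was
# drawn; this only illustrates one plausible seeded sampling of SQuAD.
from datasets import load_dataset

squad = load_dataset("squad", split="train")
few_shot_train = squad.shuffle(seed=4).select(range(128))  # k=128, seed from the model name
print(len(few_shot_train))  # 128
```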
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
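A minimal sketch of an equivalent configuration, assuming the standard HuggingFace `Trainer` question-answering loop; the actual training script is not part of this card, and the `output_dir` name and single-device batch-size mapping are illustrative assumptions:
```python
from transformers import AutoModelForQuestionAnswering, TrainingArguments

model = AutoModelForQuestionAnswering.from_pretrained("SpanBERT/spanbert-base-cased")

# AdamW with betas=(0.9, 0.999) and eps=1e-08 is the Trainer default,
# matching the optimizer line above, so it needs no explicit argument.
training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4",  # illustrative name
    learning_rate=3e-5,
    per_device_train_batch_size=24,  # assumes a single device, since 24 is the reported batch size
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```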
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09763717651367188,
0.10790494084358215,
-0.0023779708426445723,
0.09604478627443314,
0.12131348997354507,
0.016188861802220345,
0.09927448630332947,
0.12997958064079285,
-0.10553684830665588,
0.06778178364038467,
0.08809467405080795,
0.03393664211034775,
0.04327230527997017,
0.14970123767852783,
-0.006511456798762083,
-0.2739230990409851,
-0.0002747909165918827,
-0.0005955110536888242,
-0.04189561307430267,
0.1207970529794693,
0.08816377818584442,
-0.11034512519836426,
0.07790806144475937,
0.010450340807437897,
-0.15480780601501465,
0.019535016268491745,
-0.031880468130111694,
-0.0361199826002121,
0.12229326367378235,
-0.03691815584897995,
0.10724063962697983,
0.030565330758690834,
0.1371053010225296,
-0.20673196017742157,
0.006625291425734758,
0.07573986053466797,
0.05222564563155174,
0.09971281886100769,
0.04648245498538017,
0.010059528984129429,
0.08742095530033112,
-0.14887355268001556,
0.09275275468826294,
0.029367197304964066,
-0.09158580750226974,
-0.1530686318874359,
-0.09206553548574448,
0.031009074300527573,
0.049941133707761765,
0.07174745947122574,
0.00235681957565248,
0.14600084722042084,
-0.06389373540878296,
0.0837772786617279,
0.26146507263183594,
-0.32403573393821716,
-0.06772024184465408,
0.029733918607234955,
0.057096216827631,
0.06115713715553284,
-0.11933815479278564,
-0.002266939962282777,
0.020200783386826515,
0.028653958812355995,
0.1276717633008957,
-0.01654645800590515,
-0.10713040828704834,
-0.009310636669397354,
-0.12492164969444275,
-0.0017830947181209922,
0.06069004908204079,
0.026226166635751724,
-0.053638044744729996,
-0.10707541555166245,
-0.06693268567323685,
-0.08491824567317963,
-0.022153014317154884,
-0.05553059279918671,
0.05142055079340935,
-0.055314235389232635,
-0.09779898822307587,
-0.03978167846798897,
-0.058217212557792664,
-0.07935354858636856,
-0.008099483326077461,
0.16602960228919983,
0.03594779968261719,
0.020458290353417397,
-0.02967127040028572,
0.11874213814735413,
0.021340401843190193,
-0.13932164013385773,
-0.008312885649502277,
-0.004869423341006041,
-0.0929238423705101,
-0.04110373184084892,
-0.053514156490564346,
-0.009943974204361439,
0.005535724572837353,
0.16773585975170135,
-0.0815577283501625,
0.07446593046188354,
0.014757243916392326,
-0.024196898564696312,
-0.012387764640152454,
0.15105822682380676,
-0.04042445495724678,
-0.038886506110429764,
-0.01680409535765648,
0.08052439242601395,
0.003566084196791053,
-0.01885906793177128,
-0.06689910590648651,
-0.02747778780758381,
0.0940154492855072,
0.05606374889612198,
-0.0636836588382721,
0.03947082534432411,
-0.029518645256757736,
-0.025874421000480652,
0.018107222393155098,
-0.1195109635591507,
0.04122558608651161,
-0.0034894244745373726,
-0.08142425119876862,
-0.007668534759432077,
-0.002657134085893631,
-0.008945663459599018,
-0.010158357210457325,
0.09796524047851562,
-0.08702471107244492,
-0.0008261502371169627,
-0.06892213225364685,
-0.0803958997130394,
-0.0010431853588670492,
-0.1545954942703247,
-0.013658136129379272,
-0.059240520000457764,
-0.16540935635566711,
-0.03367890417575836,
0.04380619525909424,
-0.07592684775590897,
-0.013541456311941147,
-0.04360092803835869,
-0.06220534071326256,
0.01671922206878662,
-0.013273871503770351,
0.19214612245559692,
-0.05261297523975372,
0.08053257316350937,
-0.0066886222921311855,
0.050855278968811035,
0.026794089004397392,
0.035201169550418854,
-0.1024455651640892,
0.028321189805865288,
-0.14036515355110168,
0.07695713639259338,
-0.08480574190616608,
-0.004477689508348703,
-0.13649798929691315,
-0.10299066454172134,
0.015432384796440601,
-0.019989222288131714,
0.09275288134813309,
0.13357838988304138,
-0.19804778695106506,
-0.019120419397950172,
0.12484200298786163,
-0.07561077922582626,
-0.05176384001970291,
0.0607144832611084,
-0.06186923012137413,
0.041509512811899185,
0.051606714725494385,
0.212103933095932,
0.055728133767843246,
-0.1563863456249237,
-0.010201871395111084,
0.005808820482343435,
0.04458877444267273,
0.026946015655994415,
0.037832412868738174,
0.003622089745476842,
0.0568663589656353,
0.016324633732438087,
-0.09027499705553055,
-0.024980740621685982,
-0.08991216123104095,
-0.06682228296995163,
-0.04982546344399452,
-0.07531700283288956,
0.05547766014933586,
0.006858504377305508,
0.04160638526082039,
-0.06584057211875916,
-0.10345300287008286,
0.11263348162174225,
0.09568466991186142,
-0.05182631313800812,
0.03790134936571121,
-0.08127693831920624,
0.01372460462152958,
-0.005404231138527393,
-0.03581918776035309,
-0.2103487104177475,
-0.11671734601259232,
0.050839100033044815,
-0.043689098209142685,
0.025338776409626007,
0.003676127642393112,
0.0843474343419075,
0.05629622936248779,
-0.05126681551337242,
-0.015844831243157387,
-0.09631851315498352,
0.0019212214974686503,
-0.113898865878582,
-0.19078941643238068,
-0.08542545139789581,
-0.04246363788843155,
0.0945492833852768,
-0.1685745120048523,
-0.005059913266450167,
0.021964581683278084,
0.13713127374649048,
0.027351681143045425,
-0.06692356616258621,
0.0013155670603737235,
0.04663757607340813,
0.013844730332493782,
-0.09604813903570175,
0.05502491444349289,
0.013173874467611313,
-0.10342420637607574,
-0.04734772816300392,
-0.1315225213766098,
-0.016858359798789024,
0.05397162586450577,
0.05944535508751869,
-0.1039380207657814,
-0.05972687155008316,
-0.07296355813741684,
-0.037711773067712784,
-0.07715702801942825,
0.022474078461527824,
0.2128104716539383,
0.03813282400369644,
0.11282940953969955,
-0.06503081321716309,
-0.08391958475112915,
-0.007627173326909542,
0.026316402480006218,
0.02271091938018799,
0.08523088693618774,
0.022328021004796028,
-0.03734035789966583,
0.06784476339817047,
0.10239060968160629,
-0.026500673964619637,
0.13214437663555145,
-0.055833302438259125,
-0.08076594769954681,
-0.03197760507464409,
-0.021254640072584152,
-0.025924885645508766,
0.12958161532878876,
-0.02760547585785389,
-0.00003847705011139624,
0.03441013768315315,
0.038180042058229446,
0.011247093789279461,
-0.16901719570159912,
0.0025949706323444843,
0.027525799348950386,
-0.055551011115312576,
-0.04017294570803642,
-0.006166486535221338,
0.0210732389241457,
0.08849908411502838,
0.030401743948459625,
-0.00839909352362156,
0.009250999428331852,
-0.011239611543715,
-0.05853217467665672,
0.18771538138389587,
-0.09617488086223602,
-0.08065207302570343,
-0.07237473130226135,
0.02004346251487732,
-0.05106915906071663,
-0.038618940860033035,
0.008241893723607063,
-0.09338860213756561,
-0.02948264218866825,
-0.08852898329496384,
-0.027001865208148956,
-0.01990305632352829,
0.020148973912000656,
0.02479422837495804,
-0.016915684565901756,
0.08576037734746933,
-0.13718588650226593,
0.005269818007946014,
-0.04881812259554863,
-0.09527402371168137,
0.00931074470281601,
0.07659376412630081,
0.09200919419527054,
0.08139706403017044,
-0.01863035000860691,
0.02804725058376789,
-0.03897756710648537,
0.23386810719966888,
-0.052966270595788956,
0.013086049817502499,
0.11470181494951248,
-0.01135177817195654,
0.054451409727334976,
0.09262359142303467,
0.03884756565093994,
-0.09066728502511978,
0.023905033245682716,
0.07705160975456238,
-0.037434566766023636,
-0.2257937490940094,
-0.01828651688992977,
-0.0005388599820435047,
-0.07649021595716476,
0.10712479799985886,
0.03202611207962036,
-0.04978964850306511,
0.04466003179550171,
0.024005690589547157,
-0.008728861808776855,
-0.045465461909770966,
0.07413434982299805,
0.07264105975627899,
0.0507894866168499,
0.10670710355043411,
-0.0055518802255392075,
-0.02718944288790226,
0.055709999054670334,
0.017431112006306648,
0.2535436153411865,
-0.04324711859226227,
0.10140485316514969,
0.03278921917080879,
0.1542600691318512,
-0.01974836364388466,
0.06676185876131058,
0.0013321847654879093,
-0.010079942643642426,
-0.01126671303063631,
-0.065802201628685,
-0.0233727116137743,
0.017296411097049713,
-0.046374037861824036,
0.02398424968123436,
-0.07731685787439346,
0.024539140984416008,
0.027221865952014923,
0.2917000949382782,
0.028803296387195587,
-0.2610945999622345,
-0.07365384697914124,
-0.015794675797224045,
-0.045030441135168076,
-0.060353703796863556,
0.007947494275867939,
0.13422659039497375,
-0.14004142582416534,
0.053189877420663834,
-0.07826413959264755,
0.08744587749242783,
-0.04593293368816376,
0.013010031543672085,
0.04739804193377495,
0.1503039002418518,
-0.018729394301772118,
0.05200307071208954,
-0.19815121591091156,
0.2508665919303894,
0.020235782489180565,
0.10667862743139267,
-0.06447198241949081,
0.011067948304116726,
0.02014300972223282,
0.011641917750239372,
0.11203764379024506,
0.002950566355139017,
-0.06604588776826859,
-0.1468552201986313,
-0.0910530537366867,
0.04631698504090309,
0.14079901576042175,
-0.03907939791679382,
0.09029360860586166,
-0.028255121782422066,
0.011627092957496643,
0.029311450198292732,
-0.0413888543844223,
-0.15260115265846252,
-0.07784389704465866,
0.0004021448257844895,
0.014328702352941036,
-0.006150809582322836,
-0.06140207499265671,
-0.1057034283876419,
-0.019447512924671173,
0.11174094676971436,
0.0002224383206339553,
-0.05821409821510315,
-0.15431883931159973,
0.08224523812532425,
0.1416434943675995,
-0.0542355440557003,
0.011964808218181133,
0.017814399674534798,
0.11486048996448517,
0.03018944151699543,
-0.08011490851640701,
0.06397658586502075,
-0.05709027126431465,
-0.18151943385601044,
-0.055705320090055466,
0.1226387470960617,
0.08243591338396072,
0.04909590259194374,
-0.0005560568533837795,
0.05360199138522148,
0.002401570789515972,
-0.09558223187923431,
0.04048921912908554,
0.0007904635858722031,
0.043642569333314896,
0.0170162133872509,
-0.08452069014310837,
0.09676899015903473,
-0.037374094128608704,
0.01039269007742405,
0.12817761301994324,
0.21196219325065613,
-0.1072210744023323,
0.11432292312383652,
0.08558833599090576,
-0.07356404513120651,
-0.16541747748851776,
0.06083254516124725,
0.13030043244361877,
0.008979412727057934,
0.085703544318676,
-0.21282318234443665,
0.1242307722568512,
0.1023247092962265,
-0.012641343288123608,
0.007709125988185406,
-0.27559101581573486,
-0.13083121180534363,
0.058974720537662506,
0.11248573660850525,
0.04206180199980736,
-0.11436404287815094,
-0.033213045448064804,
-0.0069739301688969135,
-0.10117857158184052,
0.11356482654809952,
-0.0741979107260704,
0.11372094601392746,
-0.019447287544608116,
0.11751420795917511,
0.02563486434519291,
-0.03370410203933716,
0.1094408929347992,
0.06235715374350548,
0.0873197615146637,
-0.03714777156710625,
0.007000046316534281,
0.05920743569731712,
-0.059079308062791824,
0.027454812079668045,
-0.04207056015729904,
0.06687457114458084,
-0.14818687736988068,
0.00582526158541441,
-0.08774712681770325,
0.05293393135070801,
-0.046666160225868225,
-0.07188471406698227,
-0.016931626945734024,
0.05337425693869591,
0.06907663494348526,
-0.040859829634428024,
0.028369097039103508,
-0.001816059579141438,
0.09841415286064148,
0.10443327575922012,
0.0816214382648468,
-0.024387041106820107,
-0.0877523124217987,
0.01365607138723135,
0.0024398965761065483,
0.05462116375565529,
-0.09526757150888443,
0.013776185922324657,
0.14163216948509216,
0.06396138668060303,
0.0960395559668541,
0.04620938375592232,
-0.04277074337005615,
0.004740310367196798,
0.014870012179017067,
-0.13312719762325287,
-0.10386333614587784,
0.024834025651216507,
-0.04583507776260376,
-0.15158262848854065,
0.02784297801554203,
0.12098939716815948,
-0.03942922502756119,
-0.021591056138277054,
-0.005613245535641909,
0.003984966315329075,
-0.01301288977265358,
0.1839263141155243,
0.04490102082490921,
0.06284241378307343,
-0.09080429375171661,
0.11097417771816254,
0.03467169404029846,
-0.049936357885599136,
0.05338259041309357,
0.06665754318237305,
-0.10488678514957428,
0.010242537595331669,
0.08035171031951904,
0.13259801268577576,
-0.054199039936065674,
-0.012585636228322983,
-0.09685581177473068,
-0.0850323736667633,
0.041179168969392776,
0.13444164395332336,
0.05571736395359039,
-0.002288044663146138,
-0.06488993018865585,
0.03577541559934616,
-0.11951475590467453,
0.06889919191598892,
0.0490344762802124,
0.07581519335508347,
-0.10146211832761765,
0.13391388952732086,
-0.002452521352097392,
0.027282165363430977,
-0.026800528168678284,
0.013537462800741196,
-0.09990714490413666,
-0.02429237589240074,
-0.10814620554447174,
-0.023753514513373375,
-0.011771305464208126,
-0.0008669088128954172,
-0.022217830643057823,
-0.06873029470443726,
-0.029213864356279373,
0.03936637565493584,
-0.07917167991399765,
-0.047482214868068695,
0.017472779378294945,
0.03672892227768898,
-0.15327903628349304,
0.003096510423347354,
0.025256158784031868,
-0.08991863578557968,
0.09048760682344437,
0.06282089650630951,
0.012325574643909931,
0.024889061227440834,
-0.11642332375049591,
-0.030560530722141266,
-0.009850766509771347,
0.0037683513946831226,
0.06872616708278656,
-0.09517852216959,
-0.028526585549116135,
-0.03568245843052864,
0.041077420115470886,
0.019820960238575935,
0.10325058549642563,
-0.11967436969280243,
-0.0030364159028977156,
-0.03645918145775795,
-0.03893328458070755,
-0.06553873419761658,
0.03620945289731026,
0.10634410381317139,
0.052912767976522446,
0.15229478478431702,
-0.07720963656902313,
0.05476178601384163,
-0.19918352365493774,
-0.03821737319231033,
0.011573070660233498,
-0.04703817516565323,
-0.0810423344373703,
-0.047864533960819244,
0.08818250149488449,
-0.0476619191467762,
0.11481709778308868,
-0.013137692585587502,
0.10102536529302597,
0.04360120743513107,
-0.011117707006633282,
-0.06428532302379608,
-0.005640119779855013,
0.1817956268787384,
0.052111007273197174,
-0.018467728048563004,
0.125590980052948,
0.0034600580111145973,
0.029179319739341736,
0.08558418601751328,
0.21886678040027618,
0.15312990546226501,
0.0017848602728918195,
0.06247575953602791,
0.061217814683914185,
-0.06786945462226868,
-0.14832165837287903,
0.12053234130144119,
-0.019214099273085594,
0.10591533035039902,
-0.06572095304727554,
0.1884140521287918,
0.03812352195382118,
-0.1801193356513977,
0.0638074055314064,
-0.025117948651313782,
-0.10915175080299377,
-0.11856552213430405,
-0.027207864448428154,
-0.07113168388605118,
-0.12221761792898178,
0.024950936436653137,
-0.11754877120256424,
0.0605166032910347,
0.10524393618106842,
0.008188086561858654,
0.03808622062206268,
0.18291796743869781,
-0.04701121896505356,
0.013134585693478584,
0.08378361165523529,
0.018267381936311722,
0.0034695493523031473,
-0.039251457899808884,
-0.06431989371776581,
0.03528735786676407,
0.03444085642695427,
0.06348621100187302,
-0.04894040524959564,
0.00554128922522068,
0.0043001980520784855,
-0.00811738334596157,
-0.076044000685215,
0.011032108217477798,
0.009224525652825832,
0.05157638341188431,
0.045925602316856384,
0.048005830496549606,
0.005505900830030441,
-0.055158957839012146,
0.2917860448360443,
-0.06909123063087463,
-0.06951802968978882,
-0.12995676696300507,
0.21707062423229218,
0.02330974116921425,
-0.025521576404571533,
0.05583387613296509,
-0.0861828550696373,
-0.013509304262697697,
0.16451551020145416,
0.14004279673099518,
-0.09017328172922134,
-0.015924451872706413,
-0.023629888892173767,
-0.010945823974907398,
-0.014157485216856003,
0.11548317968845367,
0.07469368726015091,
-0.012570187449455261,
-0.07055781036615372,
-0.01177707128226757,
-0.027988340705633163,
-0.05533304437994957,
-0.06920906901359558,
0.06919442862272263,
0.028275825083255768,
-0.00875248946249485,
-0.06446287781000137,
0.06708896905183792,
-0.000036393354093888775,
-0.23352891206741333,
0.04234522953629494,
-0.1726761758327484,
-0.17026235163211823,
-0.019642986357212067,
0.07141492515802383,
0.004764500539749861,
0.0567837692797184,
0.0009513521217741072,
0.02319357916712761,
0.11556002497673035,
-0.013642232865095139,
-0.0024187578819692135,
-0.11446313560009003,
0.11592969298362732,
-0.10184352844953537,
0.19952619075775146,
-0.006810266524553299,
0.05870137736201286,
0.09626732766628265,
0.03735201060771942,
-0.1371658593416214,
0.0226956307888031,
0.06300923228263855,
-0.12646740674972534,
-0.003655396867543459,
0.1484236866235733,
-0.030287152156233788,
0.06281229108572006,
0.026912841945886612,
-0.14524123072624207,
0.0023058108054101467,
0.018502216786146164,
-0.03548143059015274,
-0.0689658597111702,
-0.00844538677483797,
-0.05077024921774864,
0.16472196578979492,
0.21608036756515503,
-0.029995253309607506,
0.008438521064817905,
-0.09075109660625458,
0.012629843316972256,
0.04740457981824875,
0.05099824443459511,
-0.0407547689974308,
-0.20560882985591888,
0.01492819469422102,
0.07244383543729782,
-0.005600171163678169,
-0.19707714021205902,
-0.09629134088754654,
0.046711333096027374,
-0.039590202271938324,
-0.04272126033902168,
0.09325674921274185,
0.022103015333414078,
0.04132242500782013,
-0.013356651179492474,
-0.11310400813817978,
-0.02204570733010769,
0.13914203643798828,
-0.17533327639102936,
-0.03302976116538048
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-42
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
{'exact_match': 12.573320719016083, 'f1': 22.855895753681814}
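These are the standard SQuAD metrics. A minimal sketch of how exact match and F1 are conventionally computed with the `datasets` library (the evaluation script is not included in this card, so the prediction and reference below are illustrative placeholders, not model output):
```python
from datasets import load_metric

squad_metric = load_metric("squad")
predictions = [{"id": "56be4db0acb8001400a502ec",
                "prediction_text": "Denver Broncos"}]
references = [{"id": "56be4db0acb8001400a502ec",
               "answers": {"text": ["Denver Broncos"], "answer_start": [177]}}]
print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```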
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-42", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-42
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-42
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
{'exact_match': 12.573320719016083, 'f1': 22.855895753681814}
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
[
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 12.573320719016083, 'f1': 22.855895753681814}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 12.573320719016083, 'f1': 22.855895753681814}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
46,
56,
6,
12,
8,
3,
104,
33,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results\n\n{'exact_match': 12.573320719016083, 'f1': 22.855895753681814}### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
-0.11064630001783371,
0.09763943403959274,
-0.0019479378825053573,
0.10145445168018341,
0.13598573207855225,
0.036059461534023285,
0.09791992604732513,
0.12309683859348297,
-0.084535613656044,
0.06979364901781082,
0.06504430621862411,
0.041208360344171524,
0.05194171145558357,
0.11598207801580429,
-0.01988000050187111,
-0.2757573425769806,
-0.01649329625070095,
-0.0008730520494282246,
-0.12208962440490723,
0.1163511872291565,
0.11404604464769363,
-0.09258270263671875,
0.07444823533296585,
0.012262255884706974,
-0.16387253999710083,
0.02467178739607334,
-0.017179716378450394,
-0.027167044579982758,
0.11562648415565491,
0.003548957407474518,
0.11425082385540009,
0.019707683473825455,
0.1404082328081131,
-0.20328855514526367,
0.007020028308033943,
0.08592621982097626,
0.04397903010249138,
0.10175924748182297,
0.06964089721441269,
-0.010523083619773388,
0.08564318716526031,
-0.14293959736824036,
0.07978805154561996,
0.04413049295544624,
-0.09913592040538788,
-0.16713403165340424,
-0.1008225753903389,
0.062168851494789124,
0.06668088585138321,
0.07875550538301468,
0.0033080249559134245,
0.11576522141695023,
-0.053031519055366516,
0.08162672072649002,
0.24656587839126587,
-0.30216333270072937,
-0.08311853557825089,
0.056073565036058426,
0.05742230266332626,
0.04616467282176018,
-0.12561285495758057,
0.00005344640158000402,
0.023413967341184616,
0.031058261170983315,
0.09671231359243393,
-0.020961787551641464,
-0.13743463158607483,
-0.00949440523982048,
-0.1266951858997345,
0.014427674002945423,
0.10022161155939102,
0.04596598073840141,
-0.04830940067768097,
-0.0768083930015564,
-0.05992928892374039,
-0.08817923814058304,
-0.015241830609738827,
-0.05153863877058029,
0.04220797121524811,
-0.054808370769023895,
-0.0744110718369484,
-0.06390266120433807,
-0.06888092309236526,
-0.08443617820739746,
-0.012844400480389595,
0.1710885614156723,
0.0258098803460598,
0.022148026153445244,
-0.025329792872071266,
0.11947059631347656,
0.010193097405135632,
-0.13714547455310822,
-0.0010248477337881923,
-0.002462774980813265,
-0.11430998891592026,
-0.04256424307823181,
-0.05247258394956589,
0.0008406716515310109,
-0.008812707848846912,
0.15944603085517883,
-0.08721785992383957,
0.07540726661682129,
0.03379572182893753,
-0.015759192407131195,
-0.03397310525178909,
0.13900607824325562,
-0.07864191383123398,
-0.04586544632911682,
-0.018734928220510483,
0.08564189821481705,
-0.007514681201428175,
-0.013618632219731808,
-0.062381576746702194,
-0.03905925527215004,
0.0688512921333313,
0.05622544884681702,
-0.033297497779130936,
0.04467662796378136,
-0.023520635440945625,
-0.04331140220165253,
0.014922612346708775,
-0.1238035336136818,
0.03628692403435707,
0.003301346441730857,
-0.10804396867752075,
-0.03398279845714569,
0.015269043855369091,
0.0024175159633159637,
-0.008286656811833382,
0.08681381493806839,
-0.09268007427453995,
0.0035302084870636463,
-0.07731226086616516,
-0.0867733433842659,
-0.002859254367649555,
-0.1250196248292923,
-0.016233833506703377,
-0.04262392967939377,
-0.1719173640012741,
-0.0470641627907753,
0.042754486203193665,
-0.07501725107431412,
-0.03275468945503235,
-0.026909762993454933,
-0.07214537262916565,
0.018825342878699303,
-0.0033819864038378,
0.20580823719501495,
-0.040906842797994614,
0.07294826209545135,
-0.002710116095840931,
0.04137516766786575,
0.011333109810948372,
0.03901257738471031,
-0.08327333629131317,
0.03066110610961914,
-0.1286170333623886,
0.08578338474035263,
-0.1057562381029129,
0.0012587928213179111,
-0.1371128261089325,
-0.09647426009178162,
0.014636656269431114,
-0.012645925395190716,
0.08259113878011703,
0.12723955512046814,
-0.20765185356140137,
-0.011101610027253628,
0.11580323427915573,
-0.06627710163593292,
-0.07657739520072937,
0.05785248428583145,
-0.04817429184913635,
0.04722050577402115,
0.03791561350226402,
0.1709180623292923,
0.10660412907600403,
-0.14478668570518494,
-0.01465565524995327,
0.016705119982361794,
0.050577688962221146,
0.03844757378101349,
0.04556694254279137,
-0.005898557137697935,
0.045311544090509415,
0.011346007697284222,
-0.12171167880296707,
-0.031167300418019295,
-0.0947677493095398,
-0.07043833285570145,
-0.05247680842876434,
-0.09026601165533066,
0.05273546651005745,
0.02876768447458744,
0.03388211503624916,
-0.05686198174953461,
-0.11523265391588211,
0.11681323498487473,
0.11230648308992386,
-0.046158019453287125,
0.021351179108023643,
-0.07470239698886871,
-0.008869918063282967,
0.015818940475583076,
-0.03944273293018341,
-0.2163621038198471,
-0.16651874780654907,
0.026633067056536674,
-0.05738384649157524,
0.04181625321507454,
0.015458296053111553,
0.08864776045084,
0.05423763021826744,
-0.05513474717736244,
-0.007394094951450825,
-0.08166965842247009,
-0.0065443748608231544,
-0.09920254349708557,
-0.20459873974323273,
-0.10397352278232574,
-0.02305339090526104,
0.16299037635326385,
-0.20468291640281677,
0.008572705090045929,
0.0011381512740626931,
0.14618618786334991,
0.011216040700674057,
-0.057787757366895676,
0.0031753259245306253,
0.03161236643791199,
0.008814606815576553,
-0.09545887261629105,
0.04749961569905281,
0.010178026743233204,
-0.103001669049263,
-0.04278237372636795,
-0.13682164251804352,
-0.003313570050522685,
0.05129639431834221,
0.0486275739967823,
-0.10915294289588928,
-0.04172883555293083,
-0.07186105102300644,
-0.04404468834400177,
-0.07125580310821533,
0.006508066318929195,
0.1847175806760788,
0.033604301512241364,
0.10152548551559448,
-0.05308322608470917,
-0.08018098026514053,
-0.01422901265323162,
0.017283516004681587,
0.012541438452899456,
0.09524795413017273,
0.03052496537566185,
-0.0651831179857254,
0.07932829111814499,
0.06824961304664612,
-0.048226404935121536,
0.12886644899845123,
-0.04740440100431442,
-0.07363642007112503,
-0.03146335855126381,
-0.004375674296170473,
-0.01599850319325924,
0.13070397078990936,
-0.027752043679356575,
0.012683541513979435,
0.03636379912495613,
0.019139772281050682,
0.025335820391774178,
-0.17185279726982117,
-0.005892484448850155,
0.01842915639281273,
-0.05502387508749962,
-0.03871787339448929,
-0.015289342030882835,
0.0359056256711483,
0.09449296444654465,
0.02349313162267208,
0.0005776691250503063,
0.008544146083295345,
-0.012501824647188187,
-0.07097290456295013,
0.2073363959789276,
-0.09951809793710709,
-0.10038191080093384,
-0.1091856136918068,
0.0444394052028656,
-0.067458376288414,
-0.03901367634534836,
-0.00835083331912756,
-0.08557671308517456,
-0.03436228632926941,
-0.06679030507802963,
0.007234431803226471,
-0.013391632586717606,
-0.004568134434521198,
0.01227013673633337,
-0.0007896085735410452,
0.10074853152036667,
-0.1449194699525833,
0.025158822536468506,
-0.041348882019519806,
-0.11511024832725525,
-0.007740349508821964,
0.0783471092581749,
0.09806563705205917,
0.1094604954123497,
-0.013663951307535172,
0.02028258703649044,
-0.025797652080655098,
0.22602564096450806,
-0.06701713055372238,
0.024088943377137184,
0.12206805497407913,
-0.005623720120638609,
0.05294974520802498,
0.10019919276237488,
0.049255963414907455,
-0.08864864706993103,
0.024134665727615356,
0.10769864916801453,
-0.023838242515921593,
-0.25441768765449524,
-0.026988472789525986,
-0.007096587680280209,
-0.0733940452337265,
0.09888079017400742,
0.026464691385626793,
-0.0194595605134964,
0.05878882482647896,
-0.0028107056859880686,
-0.002585010137408972,
-0.030693432316184044,
0.058431364595890045,
0.07704553753137589,
0.04776271432638168,
0.11639820784330368,
-0.013446141965687275,
-0.014951507560908794,
0.05478600040078163,
0.021308360621333122,
0.24776674807071686,
-0.05113491415977478,
0.11275581270456314,
0.03949800878763199,
0.15224115550518036,
-0.03278699517250061,
0.061726972460746765,
0.0064925141632556915,
-0.018017245456576347,
0.016394110396504402,
-0.0627804696559906,
-0.01017023529857397,
0.010949778370559216,
-0.034643422812223434,
0.03931162878870964,
-0.07454214245080948,
0.042011525481939316,
0.01979692652821541,
0.28166088461875916,
0.026726054027676582,
-0.2640853226184845,
-0.08658444136381149,
-0.007759448140859604,
-0.012646108865737915,
-0.07366399466991425,
-0.008169465698301792,
0.13079914450645447,
-0.1411266326904297,
0.06492333859205246,
-0.0770334005355835,
0.09089270234107971,
-0.026913389563560486,
-0.003535655327141285,
0.08117560297250748,
0.15461918711662292,
-0.017006585374474525,
0.06499705463647842,
-0.18610185384750366,
0.2491147220134735,
0.021517174318432808,
0.09880417585372925,
-0.050802476704120636,
0.016103364527225494,
0.024540208280086517,
0.028401315212249756,
0.10523154586553574,
0.00399006949737668,
-0.051098983734846115,
-0.15948857367038727,
-0.07307204604148865,
0.03824587166309357,
0.14711306989192963,
-0.05857478827238083,
0.08943864703178406,
-0.038548506796360016,
0.006146108265966177,
0.04160875827074051,
-0.06417015939950943,
-0.1565619856119156,
-0.08730895072221756,
0.014542044140398502,
-0.011030824854969978,
-0.01667049154639244,
-0.06637226045131683,
-0.10165993869304657,
-0.01388828456401825,
0.1295832246541977,
-0.03268210217356682,
-0.057167306542396545,
-0.15598291158676147,
0.09974341094493866,
0.1679159551858902,
-0.06702186912298203,
0.015307720750570297,
0.016419516876339912,
0.1252967268228531,
0.03641090914607048,
-0.08773569017648697,
0.06501713395118713,
-0.06921885907649994,
-0.17456530034542084,
-0.03884553164243698,
0.13792501389980316,
0.07797075062990189,
0.046269532293081284,
-0.003427889896556735,
0.044636912643909454,
0.006609043572098017,
-0.10387877374887466,
0.034895092248916626,
-0.023933378979563713,
0.027492163702845573,
0.039038874208927155,
-0.08593589812517166,
0.06575371325016022,
-0.04313292354345322,
0.009518051519989967,
0.11526412516832352,
0.20150426030158997,
-0.11317005753517151,
0.06272008270025253,
0.05865517631173134,
-0.07784879207611084,
-0.1754852533340454,
0.08453657478094101,
0.1519019603729248,
0.0034698627423495054,
0.07971315830945969,
-0.2148018181324005,
0.14955000579357147,
0.12777116894721985,
-0.019980058073997498,
0.0592714287340641,
-0.27779749035835266,
-0.13980700075626373,
0.0770663246512413,
0.09976744651794434,
-0.007549794390797615,
-0.1461760401725769,
-0.04902268946170807,
-0.007828357629477978,
-0.16598498821258545,
0.148215189576149,
-0.08795302361249924,
0.10574191063642502,
-0.007734512444585562,
0.10576455295085907,
0.03155207633972168,
-0.022696686908602715,
0.13001613318920135,
0.0691170021891594,
0.08722849190235138,
-0.03937968611717224,
-0.00515443691983819,
0.08158909529447556,
-0.05419408529996872,
0.034179236739873886,
-0.014109550043940544,
0.06839116662740707,
-0.14964525401592255,
0.010339091531932354,
-0.0998564288020134,
0.0738145187497139,
-0.06704650819301605,
-0.062471624463796616,
-0.02008558064699173,
0.061861906200647354,
0.037474215030670166,
-0.03896349295973778,
0.04666239395737648,
0.0005714298458769917,
0.1375628113746643,
0.10749313980340958,
0.08704029768705368,
-0.0018086830386891961,
-0.1074608638882637,
0.01523521076887846,
0.0036591754760593176,
0.04930354282259941,
-0.10279235243797302,
0.022431036457419395,
0.1374976485967636,
0.08097969740629196,
0.10193558037281036,
0.043683845549821854,
-0.04713615030050278,
-0.006511836778372526,
0.030705278739333153,
-0.11668144911527634,
-0.12637604773044586,
0.003963861148804426,
-0.06478524208068848,
-0.155874565243721,
0.027241481468081474,
0.1119026467204094,
-0.031077323481440544,
-0.014965006150305271,
-0.0066231475211679935,
0.010894686914980412,
-0.018820015713572502,
0.19988729059696198,
0.06922455877065659,
0.07609652727842331,
-0.08972419053316116,
0.11405029892921448,
0.04665127396583557,
-0.0453946627676487,
0.025458676740527153,
0.08795285224914551,
-0.08815868198871613,
-0.0005672572297044098,
0.062341999262571335,
0.09642647206783295,
-0.0911257266998291,
-0.017004545778036118,
-0.08542180061340332,
-0.10612764954566956,
0.028421755880117416,
0.1471976637840271,
0.04721743240952492,
-0.006064987275749445,
-0.03629138320684433,
0.037941351532936096,
-0.12877912819385529,
0.07455185800790787,
0.05643201246857643,
0.08319592475891113,
-0.11758613586425781,
0.15746791660785675,
0.005101328250020742,
0.037392277270555496,
-0.015357454307377338,
0.023581307381391525,
-0.09840670228004456,
-0.034863099455833435,
-0.11399950087070465,
-0.032966651022434235,
-0.007948126643896103,
0.004751034080982208,
-0.026680124923586845,
-0.07544808834791183,
-0.04220149666070938,
0.04622210934758186,
-0.07592248916625977,
-0.06252077966928482,
0.011409346014261246,
0.027168873697519302,
-0.1654893159866333,
-0.001977776875719428,
0.036817941814661026,
-0.09139031916856766,
0.08007370680570602,
0.06813081353902817,
0.0400710366666317,
0.0322231762111187,
-0.12426821142435074,
-0.024287879467010498,
-0.016607066616415977,
0.00969067495316267,
0.07499610632658005,
-0.10802671313285828,
-0.02039310708642006,
-0.05612611025571823,
0.06003721058368683,
0.01323664840310812,
0.08670643717050552,
-0.12272191047668457,
0.00259268912486732,
-0.04731031134724617,
-0.02442273311316967,
-0.06790561228990555,
0.04458849877119064,
0.10985948890447617,
0.0607636421918869,
0.1462865024805069,
-0.06697823107242584,
0.04815573990345001,
-0.2205304503440857,
-0.03272601217031479,
-0.007734114304184914,
-0.04558176174759865,
-0.05428094416856766,
-0.029737500473856926,
0.09131406992673874,
-0.05833590030670166,
0.09044168889522552,
-0.011476119980216026,
0.10398814082145691,
0.04501194506883621,
0.0003907583886757493,
-0.05415671318769455,
0.010993110947310925,
0.16130323708057404,
0.04507318511605263,
-0.00785102043300867,
0.1191633865237236,
0.011958830989897251,
0.02350819855928421,
0.04759740084409714,
0.23190106451511383,
0.13066457211971283,
-0.04783806949853897,
0.05136590078473091,
0.09327326714992523,
-0.10772613435983658,
-0.1277863085269928,
0.11882497370243073,
-0.02620667964220047,
0.0904526561498642,
-0.0659305676817894,
0.16581296920776367,
0.06296145170927048,
-0.19470414519309998,
0.06747851520776749,
-0.04905551299452782,
-0.12008675932884216,
-0.11844466626644135,
-0.007844164967536926,
-0.06589775532484055,
-0.11596190184354782,
0.020443705841898918,
-0.13493698835372925,
0.06172539293766022,
0.13392023742198944,
0.007825786247849464,
0.03698677569627762,
0.18691067397594452,
-0.05245308578014374,
0.002846569288522005,
0.06885949522256851,
0.026132110506296158,
0.009775704704225063,
-0.04257358983159065,
-0.05579932779073715,
0.04461189731955528,
0.03364413604140282,
0.05727502331137657,
-0.06843595206737518,
-0.009141038171947002,
0.012857124209403992,
-0.011612766422331333,
-0.07499091327190399,
0.008148755878210068,
0.0169165451079607,
0.049654584378004074,
0.03979909420013428,
0.04675600305199623,
0.015445143915712833,
-0.05706257000565529,
0.32155629992485046,
-0.06948967278003693,
-0.08191250264644623,
-0.1387905478477478,
0.20614077150821686,
0.015345138497650623,
-0.02112814597785473,
0.06245720386505127,
-0.10015349090099335,
-0.005105011630803347,
0.1525050848722458,
0.14916177093982697,
-0.09440474957227707,
-0.02073255367577076,
-0.012393332086503506,
-0.01224230881780386,
-0.0200791135430336,
0.12024671584367752,
0.08671484142541885,
0.022627772763371468,
-0.07588944584131241,
-0.010126742534339428,
-0.0152195505797863,
-0.048868920654058456,
-0.06743227690458298,
0.0956454649567604,
0.024399923160672188,
0.00541346101090312,
-0.03826330974698067,
0.07986106723546982,
-0.000645206484477967,
-0.2311360239982605,
0.043296944350004196,
-0.173956960439682,
-0.18389996886253357,
-0.046044208109378815,
0.05759849026799202,
-0.0027224274817854166,
0.06392352283000946,
0.004882234614342451,
-0.0062549905851483345,
0.12573358416557312,
-0.0072447387501597404,
-0.027311548590660095,
-0.13838011026382446,
0.10860319435596466,
-0.10796906054019928,
0.2215200960636139,
-0.01993430033326149,
0.056075986474752426,
0.09600316733121872,
0.01798715442419052,
-0.14335474371910095,
0.007495259400457144,
0.07108486443758011,
-0.11898573487997055,
0.01564957946538925,
0.15024298429489136,
-0.03582887351512909,
0.07100319862365723,
0.02654379978775978,
-0.16539674997329712,
0.0022249682806432247,
0.014766406267881393,
-0.032631080597639084,
-0.07524224370718002,
-0.000991908018477261,
-0.06294844299554825,
0.1597958654165268,
0.2411101758480072,
-0.038683295249938965,
0.006502718199044466,
-0.09748691320419312,
0.0007259503472596407,
0.055432695895433426,
0.10383066534996033,
-0.03619963303208351,
-0.2096835821866989,
0.028274375945329666,
0.043626997619867325,
0.007957851514220238,
-0.20732659101486206,
-0.08394096046686172,
0.06127751246094704,
-0.054801035672426224,
-0.01485280878841877,
0.10115214437246323,
0.056893300265073776,
0.0510004460811615,
-0.018766019493341446,
-0.10272336006164551,
-0.03434167802333832,
0.15205876529216766,
-0.17696581780910492,
-0.04136255010962486
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
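
For reference, a minimal sketch of how these settings might map onto Hugging Face `TrainingArguments`. The original training script is not part of this card, so details such as `output_dir` are assumptions for illustration only:

```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments;
# this is a sketch, not the authors' actual training code.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6",  # assumed path
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,   # first 10% of the 200 steps used for linear warmup
    max_steps=200,      # training_steps: 200
)
```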
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09775431454181671,
0.1076914370059967,
-0.0023343523498624563,
0.09570496529340744,
0.12145982682704926,
0.015692301094532013,
0.09881990402936935,
0.13050757348537445,
-0.10478448867797852,
0.06833452731370926,
0.08723822236061096,
0.0348428376019001,
0.04383012279868126,
0.14951620995998383,
-0.006560569163411856,
-0.27339500188827515,
-0.00003564513099263422,
-0.0003071820829063654,
-0.04183138534426689,
0.12079605460166931,
0.08815325796604156,
-0.1105550229549408,
0.07704444229602814,
0.00992493238300085,
-0.15493346750736237,
0.01972031034529209,
-0.032125867903232574,
-0.03575804829597473,
0.12209487706422806,
-0.03740759193897247,
0.10681012272834778,
0.030652156099677086,
0.13701878488063812,
-0.20743972063064575,
0.00664126081392169,
0.07641320675611496,
0.05236897990107536,
0.09991850703954697,
0.04735150560736656,
0.010297991335391998,
0.0893101766705513,
-0.14869171380996704,
0.09259922802448273,
0.029977932572364807,
-0.09142395108938217,
-0.15116767585277557,
-0.092636339366436,
0.030283749103546143,
0.050542399287223816,
0.07244835793972015,
0.0019364000763744116,
0.14710532128810883,
-0.06436875462532043,
0.08402916043996811,
0.2633202075958252,
-0.322406530380249,
-0.06766432523727417,
0.03078266605734825,
0.05691235139966011,
0.06007272005081177,
-0.12031479924917221,
-0.003248621942475438,
0.02028135024011135,
0.02827092818915844,
0.12656141817569733,
-0.016022944822907448,
-0.10867726802825928,
-0.009769868105649948,
-0.1252661645412445,
-0.0019069005502387881,
0.058872051537036896,
0.02629685029387474,
-0.05302419885993004,
-0.10682983696460724,
-0.06758908927440643,
-0.08442754298448563,
-0.021975580602884293,
-0.05541921779513359,
0.051639292389154434,
-0.05539872124791145,
-0.09711288660764694,
-0.039623573422431946,
-0.05789697915315628,
-0.08059701323509216,
-0.007899584248661995,
0.16580724716186523,
0.03592671453952789,
0.020235035568475723,
-0.030529649928212166,
0.11851643025875092,
0.02138669230043888,
-0.13959459960460663,
-0.009460695087909698,
-0.0040710377506911755,
-0.09332054853439331,
-0.04133347421884537,
-0.052871353924274445,
-0.011895190924406052,
0.005048038903623819,
0.16591525077819824,
-0.08172724395990372,
0.07493412494659424,
0.013679150491952896,
-0.02427145466208458,
-0.01311041321605444,
0.1513615846633911,
-0.039469171315431595,
-0.0373363271355629,
-0.01742708869278431,
0.0805177241563797,
0.0030781091190874577,
-0.01850968599319458,
-0.06618949770927429,
-0.02774551697075367,
0.09454318135976791,
0.055923715233802795,
-0.06308376044034958,
0.039060771465301514,
-0.029727397486567497,
-0.026000363752245903,
0.01843223348259926,
-0.11938923597335815,
0.04140418767929077,
-0.003939898684620857,
-0.08188235014677048,
-0.008747749030590057,
-0.0024237867910414934,
-0.009282057173550129,
-0.010523248463869095,
0.09876377880573273,
-0.08739449828863144,
-0.0006020650616846979,
-0.06992492079734802,
-0.08024053275585175,
-0.0005211018724367023,
-0.15631625056266785,
-0.014132223092019558,
-0.05842534825205803,
-0.16590504348278046,
-0.03385196626186371,
0.04332658275961876,
-0.07592187076807022,
-0.012823933735489845,
-0.04424494132399559,
-0.0628243163228035,
0.017081687226891518,
-0.012816151604056358,
0.1931278109550476,
-0.05233263969421387,
0.0813058614730835,
-0.007034297566860914,
0.050577182322740555,
0.02734176628291607,
0.035872362554073334,
-0.1034352108836174,
0.02804054692387581,
-0.13979604840278625,
0.07707106322050095,
-0.08578012138605118,
-0.004654903430491686,
-0.13747616112232208,
-0.10222131758928299,
0.014140564948320389,
-0.020612308755517006,
0.09363175183534622,
0.1343565434217453,
-0.19843602180480957,
-0.018755774945020676,
0.1254129260778427,
-0.07645862549543381,
-0.051792342215776443,
0.059674929827451706,
-0.0615348145365715,
0.04099756479263306,
0.05113156512379646,
0.21238645911216736,
0.055025216192007065,
-0.15613725781440735,
-0.011324000544846058,
0.005162524059414864,
0.04514511302113533,
0.02641613595187664,
0.03759239241480827,
0.0034773522056639194,
0.05802357569336891,
0.015959033742547035,
-0.09008252620697021,
-0.025234300643205643,
-0.09010714292526245,
-0.06616955995559692,
-0.05042973533272743,
-0.07541907578706741,
0.05480872094631195,
0.008177225477993488,
0.04121895506978035,
-0.06534210592508316,
-0.10290716588497162,
0.11260811984539032,
0.09569483995437622,
-0.05117543414235115,
0.0381988063454628,
-0.08070509880781174,
0.012574040330946445,
-0.005856841336935759,
-0.03572627902030945,
-0.21180222928524017,
-0.1169489175081253,
0.051328420639038086,
-0.04440830647945404,
0.025475187227129936,
0.0030358743388205767,
0.0842336043715477,
0.05542340874671936,
-0.051376473158597946,
-0.015805324539542198,
-0.09712255001068115,
0.0014187435153871775,
-0.114490807056427,
-0.19088692963123322,
-0.08545857667922974,
-0.043160442262887955,
0.0920758843421936,
-0.16794928908348083,
-0.005360277835279703,
0.021943887695670128,
0.13782159984111786,
0.027613194659352303,
-0.06746894866228104,
0.0013498567277565598,
0.046385932713747025,
0.013507024385035038,
-0.0963740274310112,
0.054764967411756516,
0.012257908470928669,
-0.10292491316795349,
-0.047467976808547974,
-0.13184252381324768,
-0.018727056682109833,
0.05363158881664276,
0.06105140224099159,
-0.1041472777724266,
-0.05981099605560303,
-0.07288912683725357,
-0.03790127485990524,
-0.07843493670225143,
0.023509962484240532,
0.2122940570116043,
0.038528427481651306,
0.11237959563732147,
-0.06526992470026016,
-0.08418471366167068,
-0.007050399202853441,
0.026651622727513313,
0.022292647510766983,
0.08635273575782776,
0.02426484227180481,
-0.039304617792367935,
0.06871086359024048,
0.10285333544015884,
-0.02589726448059082,
0.13249102234840393,
-0.05600107088685036,
-0.0812416672706604,
-0.03054424375295639,
-0.02082650735974312,
-0.025757046416401863,
0.1293507069349289,
-0.027316266670823097,
0.00040446469211019576,
0.034245628863573074,
0.03864172101020813,
0.011207695119082928,
-0.16918863356113434,
0.0026100068353116512,
0.027373194694519043,
-0.055408578366041183,
-0.04091397300362587,
-0.006171464454382658,
0.021112358197569847,
0.08891446143388748,
0.030065612867474556,
-0.007989565841853619,
0.008502340875566006,
-0.011302508413791656,
-0.058549586683511734,
0.18860457837581635,
-0.09576108306646347,
-0.07948590070009232,
-0.07096948474645615,
0.021029161289334297,
-0.05062054470181465,
-0.038934461772441864,
0.008388957940042019,
-0.09441749006509781,
-0.029503080993890762,
-0.08844419568777084,
-0.02673451602458954,
-0.01964280568063259,
0.019331667572259903,
0.024060098454356194,
-0.016607291996479034,
0.08498451858758926,
-0.13802500069141388,
0.005503326654434204,
-0.049308910965919495,
-0.09551946073770523,
0.008881098590791225,
0.07624734193086624,
0.09189999103546143,
0.08168981969356537,
-0.019116397947072983,
0.02809436246752739,
-0.039068836718797684,
0.23255859315395355,
-0.05351262539625168,
0.012547983787953854,
0.11452983319759369,
-0.010169100016355515,
0.0549515075981617,
0.09287425130605698,
0.03810378909111023,
-0.09068065881729126,
0.024193396791815758,
0.07777591794729233,
-0.0377228781580925,
-0.22693197429180145,
-0.01850196346640587,
-0.0007541078375652432,
-0.0766751766204834,
0.1071443036198616,
0.032379209995269775,
-0.04979957267642021,
0.04445018246769905,
0.023198552429676056,
-0.010120587423443794,
-0.04548987001180649,
0.07454954832792282,
0.0703563466668129,
0.05161972716450691,
0.1063164621591568,
-0.005564903374761343,
-0.026464713737368584,
0.05503617972135544,
0.017367931082844734,
0.25464338064193726,
-0.04365045949816704,
0.1011485680937767,
0.0322340652346611,
0.1539672464132309,
-0.020303290337324142,
0.06722170859575272,
0.0010774260153993964,
-0.0101466104388237,
-0.011054640635848045,
-0.06563033163547516,
-0.023680413141846657,
0.01729843020439148,
-0.04713877663016319,
0.02422456629574299,
-0.0771479606628418,
0.024845026433467865,
0.026651352643966675,
0.29247045516967773,
0.028861772269010544,
-0.25994083285331726,
-0.07286671549081802,
-0.016097143292427063,
-0.04540233314037323,
-0.06102359667420387,
0.007771589793264866,
0.1338430941104889,
-0.13963447511196136,
0.0529630184173584,
-0.07869863510131836,
0.08846183121204376,
-0.04458434507250786,
0.01227550208568573,
0.0467037707567215,
0.1505051553249359,
-0.018720710650086403,
0.05241619423031807,
-0.19925539195537567,
0.25300416350364685,
0.020355602726340294,
0.10675131529569626,
-0.0648033395409584,
0.01085782889276743,
0.019556475803256035,
0.00994318351149559,
0.11252721399068832,
0.002874798374250531,
-0.06662122160196304,
-0.14662882685661316,
-0.09038406610488892,
0.04603823646903038,
0.14171481132507324,
-0.03918446972966194,
0.0896482914686203,
-0.028244780376553535,
0.0116727901622653,
0.030167005956172943,
-0.04167325422167778,
-0.15288735926151276,
-0.07765644043684006,
0.0007307216292247176,
0.013703441247344017,
-0.006906373891979456,
-0.06105382367968559,
-0.10547274351119995,
-0.018252328038215637,
0.1113412007689476,
0.00006283808761509135,
-0.057527992874383926,
-0.15429633855819702,
0.08308181166648865,
0.14175809919834137,
-0.05468286573886871,
0.011983558535575867,
0.0174451544880867,
0.11442701518535614,
0.02922496572136879,
-0.08063679933547974,
0.06494408845901489,
-0.05707390606403351,
-0.18198031187057495,
-0.05563778802752495,
0.12217076867818832,
0.08308536559343338,
0.04953841120004654,
-0.0007766443304717541,
0.05402345582842827,
0.002691591391339898,
-0.09526845067739487,
0.04064371436834335,
0.0012294019106775522,
0.043598588556051254,
0.017559250816702843,
-0.08401715010404587,
0.09534578770399094,
-0.03801833093166351,
0.009829304181039333,
0.1278378665447235,
0.21303562819957733,
-0.10714848339557648,
0.1146843358874321,
0.08603032678365707,
-0.07394059747457504,
-0.16609394550323486,
0.0618150420486927,
0.13065573573112488,
0.008444413542747498,
0.08641283214092255,
-0.21261560916900635,
0.12407371401786804,
0.10221488028764725,
-0.013021995313465595,
0.006926980800926685,
-0.2764972746372223,
-0.13088911771774292,
0.05886385962367058,
0.11234325915575027,
0.03987109661102295,
-0.11433054506778717,
-0.033078256994485855,
-0.006621073465794325,
-0.10067925602197647,
0.11433276534080505,
-0.0727778822183609,
0.11393425613641739,
-0.019824348390102386,
0.11758676916360855,
0.02586887963116169,
-0.03429402410984039,
0.10797643661499023,
0.06243825703859329,
0.08761052787303925,
-0.036921802908182144,
0.006778469309210777,
0.05918342247605324,
-0.05921797454357147,
0.02755899354815483,
-0.042145080864429474,
0.06734944134950638,
-0.14694686233997345,
0.006234204396605492,
-0.0883101373910904,
0.05349801480770111,
-0.04657716304063797,
-0.0718565508723259,
-0.01712770201265812,
0.053831491619348526,
0.0695657879114151,
-0.04133514687418938,
0.02986242063343525,
-0.001773547031916678,
0.10025010257959366,
0.10453224182128906,
0.0821482315659523,
-0.024079373106360435,
-0.08665788173675537,
0.013120617717504501,
0.0032291014213114977,
0.05480360612273216,
-0.09633953124284744,
0.013564685359597206,
0.14178301393985748,
0.06528331339359283,
0.09588593244552612,
0.04657083377242088,
-0.043405383825302124,
0.00475613446906209,
0.013962852768599987,
-0.13239267468452454,
-0.1047205924987793,
0.025173287838697433,
-0.041956376284360886,
-0.15196478366851807,
0.028375068679451942,
0.11943445354700089,
-0.040482327342033386,
-0.021623220294713974,
-0.0053749349899590015,
0.004563795402646065,
-0.012303372845053673,
0.1845642626285553,
0.04499790444970131,
0.06347257643938065,
-0.09062759578227997,
0.11139783263206482,
0.03418373689055443,
-0.05115336924791336,
0.05361945554614067,
0.06696061789989471,
-0.10414858162403107,
0.010273130610585213,
0.08203082531690598,
0.1315595656633377,
-0.054287537932395935,
-0.011175893247127533,
-0.09570195525884628,
-0.08463986963033676,
0.04147370904684067,
0.13650448620319366,
0.055550478398799896,
-0.0023423251695930958,
-0.06456848978996277,
0.03626854345202446,
-0.12008275836706161,
0.06918267905712128,
0.04862750694155693,
0.07601187378168106,
-0.10102816671133041,
0.13265137374401093,
-0.0022647844161838293,
0.027429409325122833,
-0.026714520528912544,
0.013987070880830288,
-0.09962242841720581,
-0.024237144738435745,
-0.10596342384815216,
-0.02389344573020935,
-0.01182500272989273,
-0.0013799945591017604,
-0.02275564707815647,
-0.06924746185541153,
-0.02834361232817173,
0.03941652551293373,
-0.0793771967291832,
-0.04802684485912323,
0.016749538481235504,
0.036195892840623856,
-0.1537725180387497,
0.0032419227063655853,
0.025411726906895638,
-0.08931508660316467,
0.08939727395772934,
0.062051527202129364,
0.012408275157213211,
0.025203866884112358,
-0.11915651708841324,
-0.030160320922732353,
-0.009577061980962753,
0.0038632042706012726,
0.06866683810949326,
-0.09444645792245865,
-0.028785066679120064,
-0.03593182936310768,
0.04116680100560188,
0.019611921161413193,
0.10246321558952332,
-0.12023338675498962,
-0.0038241324946284294,
-0.037825535982847214,
-0.03964005783200264,
-0.06492532044649124,
0.03671132028102875,
0.10676530003547668,
0.05306214466691017,
0.15178239345550537,
-0.076932892203331,
0.055547405034303665,
-0.19916950166225433,
-0.038122352212667465,
0.011855600401759148,
-0.046867307275533676,
-0.08159323036670685,
-0.047023531049489975,
0.08858757466077805,
-0.047693055123090744,
0.11232573539018631,
-0.012616017833352089,
0.10173599421977997,
0.04405728355050087,
-0.010954853147268295,
-0.06474611908197403,
-0.005829334259033203,
0.18136700987815857,
0.05192524567246437,
-0.01787327416241169,
0.1272917240858078,
0.0037292679771780968,
0.02880581095814705,
0.08701644092798233,
0.22108076512813568,
0.1543562114238739,
0.0007731053628958762,
0.06257005035877228,
0.060904957354068756,
-0.06858396530151367,
-0.14816617965698242,
0.12042725831270218,
-0.018859636038541794,
0.10575900971889496,
-0.06609249114990234,
0.18923458456993103,
0.038057852536439896,
-0.1797941029071808,
0.06446538865566254,
-0.0255359448492527,
-0.10895445197820663,
-0.11870136857032776,
-0.027056032791733742,
-0.07106699794530869,
-0.12195707857608795,
0.025325942784547806,
-0.11772984266281128,
0.06141532212495804,
0.1054912656545639,
0.008184041827917099,
0.03828497976064682,
0.1835719645023346,
-0.04679412767291069,
0.013108811341226101,
0.08425630629062653,
0.01855018362402916,
0.00312166940420866,
-0.03966115787625313,
-0.06404328346252441,
0.03651207685470581,
0.03436167910695076,
0.06365141272544861,
-0.04874838888645172,
0.004494741093367338,
0.004145178943872452,
-0.008091290481388569,
-0.07621175795793533,
0.01127624325454235,
0.009557772427797318,
0.05178767815232277,
0.04666953533887863,
0.04795358330011368,
0.006117988843470812,
-0.055266812443733215,
0.2935585081577301,
-0.06950393319129944,
-0.06928805261850357,
-0.12899768352508545,
0.21792219579219818,
0.02444140426814556,
-0.025482622906565666,
0.05573158338665962,
-0.08713133633136749,
-0.013798542320728302,
0.1629212498664856,
0.1380762755870819,
-0.08906785398721695,
-0.015645727515220642,
-0.02403336949646473,
-0.010803325101733208,
-0.013993542641401291,
0.11508694291114807,
0.07505455613136292,
-0.011795572936534882,
-0.07080341875553131,
-0.01157066784799099,
-0.028005894273519516,
-0.056180957704782486,
-0.06815239787101746,
0.06927003711462021,
0.028580917045474052,
-0.009244512766599655,
-0.06467609852552414,
0.06738858669996262,
-0.0002421366807539016,
-0.23320819437503815,
0.04159238561987877,
-0.17384026944637299,
-0.16980096697807312,
-0.01955832727253437,
0.07129224389791489,
0.00490653607994318,
0.05640314891934395,
0.0011986519675701857,
0.0237982627004385,
0.11342720687389374,
-0.013543475419282913,
-0.0027085868641734123,
-0.11584464460611343,
0.11516474932432175,
-0.10242379456758499,
0.20009301602840424,
-0.00691541051492095,
0.057743415236473083,
0.0962427482008934,
0.03755711019039154,
-0.13771235942840576,
0.022429121658205986,
0.06336719542741776,
-0.12722130119800568,
-0.0038164921570569277,
0.14957647025585175,
-0.030333824455738068,
0.06320864707231522,
0.026567237451672554,
-0.1469048112630844,
0.002736880676820874,
0.01726686954498291,
-0.03503181412816048,
-0.06961428374052048,
-0.006599768064916134,
-0.05048026889562607,
0.16483399271965027,
0.21629598736763,
-0.029650242999196053,
0.008214427158236504,
-0.09129627794027328,
0.012287416495382786,
0.04627154394984245,
0.051841337233781815,
-0.04054822772741318,
-0.2056799829006195,
0.015381529927253723,
0.07216877490282059,
-0.005753830075263977,
-0.19655917584896088,
-0.09590291976928711,
0.046959683299064636,
-0.0406106635928154,
-0.04275667294859886,
0.09260381013154984,
0.022102534770965576,
0.0406857468187809,
-0.013435359112918377,
-0.1154397502541542,
-0.021631993353366852,
0.13972701132297516,
-0.17545630037784576,
-0.03263207525014877
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
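
As a quick usage sketch (not part of the original card), the checkpoint can be queried through the `question-answering` pipeline. The repo id is taken from this card; the question/context pair below is invented for illustration:

```python
# Minimal inference example; outputs depend on the checkpoint's actual quality.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="The model is a fine-tuned version of SpanBERT trained on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```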
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-128-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09824010729789734,
0.10852237045764923,
-0.0023614116944372654,
0.0961272120475769,
0.12146648019552231,
0.016354992985725403,
0.09847044944763184,
0.1304052770137787,
-0.10440272837877274,
0.06788332760334015,
0.08757714182138443,
0.03416566550731659,
0.04371022805571556,
0.1487341970205307,
-0.006712010595947504,
-0.2734770178794861,
-0.0005137796397320926,
0.00031895647407509387,
-0.03999117761850357,
0.1202917993068695,
0.08822812139987946,
-0.11062804609537125,
0.07692890614271164,
0.009765959344804287,
-0.1542588621377945,
0.019567538052797318,
-0.03161955624818802,
-0.0362950935959816,
0.12247056514024734,
-0.036699507385492325,
0.10688630491495132,
0.029895098879933357,
0.1370626837015152,
-0.20819012820720673,
0.006441728211939335,
0.07664965093135834,
0.05225919187068939,
0.09997865557670593,
0.046067818999290466,
0.011218865402042866,
0.08829407393932343,
-0.14914953708648682,
0.09286099672317505,
0.029558397829532623,
-0.09123373031616211,
-0.15164680778980255,
-0.09170422703027725,
0.03152851015329361,
0.0499463826417923,
0.07185624539852142,
0.002952634822577238,
0.14883917570114136,
-0.06314825266599655,
0.08444759994745255,
0.2635583281517029,
-0.3221454620361328,
-0.06687910854816437,
0.02977803163230419,
0.0566386803984642,
0.061354123055934906,
-0.11943139135837555,
-0.003081127069890499,
0.01992526464164257,
0.028096148744225502,
0.12706758081912994,
-0.01677887514233589,
-0.11013122648000717,
-0.009878215380012989,
-0.12469736486673355,
-0.00238270522095263,
0.05955313891172409,
0.02650885470211506,
-0.053213778883218765,
-0.10689175128936768,
-0.068299300968647,
-0.0852513313293457,
-0.022465167567133904,
-0.055796656757593155,
0.05103614181280136,
-0.05497314780950546,
-0.09625295549631119,
-0.039651721715927124,
-0.05737995356321335,
-0.07972181588411331,
-0.0076492768712341785,
0.16579416394233704,
0.036086536943912506,
0.02016415446996689,
-0.02972560003399849,
0.11810922622680664,
0.019422447308897972,
-0.13966472446918488,
-0.009003405459225178,
-0.0037779470440000296,
-0.09335912764072418,
-0.04151714965701103,
-0.05318174511194229,
-0.013015502132475376,
0.004646629095077515,
0.16750289499759674,
-0.08137880265712738,
0.07468588650226593,
0.014202606864273548,
-0.023664627224206924,
-0.012685463763773441,
0.15216705203056335,
-0.04011838510632515,
-0.038160767406225204,
-0.016501091420650482,
0.08012768626213074,
0.003974823746830225,
-0.019461339339613914,
-0.06728533655405045,
-0.02843385376036167,
0.09521313011646271,
0.055296849459409714,
-0.06303960829973221,
0.03853297233581543,
-0.029919913038611412,
-0.0262122955173254,
0.019240252673625946,
-0.11921004205942154,
0.041588012129068375,
-0.0039930278435349464,
-0.08154959976673126,
-0.007931300438940525,
-0.0016312351217493415,
-0.008584285154938698,
-0.010862943716347218,
0.09779959172010422,
-0.08655522763729095,
-0.00013919624325353652,
-0.06954233348369598,
-0.08042227476835251,
-0.0002143920719390735,
-0.15624897181987762,
-0.013287866488099098,
-0.05968109145760536,
-0.16604004800319672,
-0.033371422439813614,
0.04360165446996689,
-0.07530829310417175,
-0.012571039609611034,
-0.043172962963581085,
-0.06143655627965927,
0.01693127676844597,
-0.013437839224934578,
0.19097822904586792,
-0.05251527205109596,
0.08088421821594238,
-0.006980387959629297,
0.05023806914687157,
0.026018233969807625,
0.035954758524894714,
-0.10299649089574814,
0.027897607535123825,
-0.1398090422153473,
0.07699061930179596,
-0.08508556336164474,
-0.004544656258076429,
-0.13611921668052673,
-0.10212813317775726,
0.015178250148892403,
-0.019922111183404922,
0.0943581834435463,
0.13359251618385315,
-0.19834677875041962,
-0.018585653975605965,
0.12454827129840851,
-0.07606103271245956,
-0.05198553577065468,
0.0612458698451519,
-0.06174185127019882,
0.04096708446741104,
0.05168279632925987,
0.21233060956001282,
0.056678406894207,
-0.15615534782409668,
-0.010563304647803307,
0.006543274037539959,
0.04583970829844475,
0.025326481088995934,
0.03744100406765938,
0.0038887201808393,
0.058428049087524414,
0.01627550832927227,
-0.08889822661876678,
-0.02443018928170204,
-0.090047188103199,
-0.06661690026521683,
-0.05065886303782463,
-0.07515131682157516,
0.054470378905534744,
0.00890718400478363,
0.04129205644130707,
-0.06513811647891998,
-0.10348621010780334,
0.11418674886226654,
0.09534323215484619,
-0.05156811699271202,
0.03824787586927414,
-0.0803450271487236,
0.013133966363966465,
-0.005095975939184427,
-0.03563631698489189,
-0.21130940318107605,
-0.11575519293546677,
0.05149788409471512,
-0.04422564059495926,
0.024964116513729095,
0.003465467132627964,
0.08354286849498749,
0.05588360130786896,
-0.05082521587610245,
-0.01502190250903368,
-0.09656331688165665,
0.0017183359013870358,
-0.11534050107002258,
-0.18973691761493683,
-0.08563106507062912,
-0.042609915137290955,
0.09261206537485123,
-0.16876618564128876,
-0.0048045869916677475,
0.021470138803124428,
0.13712987303733826,
0.027160191908478737,
-0.06741118431091309,
0.0012868152698501945,
0.04710148274898529,
0.013266452588140965,
-0.09664461761713028,
0.05496962368488312,
0.012892769649624825,
-0.10384267568588257,
-0.048561710864305496,
-0.1322752833366394,
-0.017715901136398315,
0.053405847400426865,
0.0600610226392746,
-0.10406305640935898,
-0.060366515070199966,
-0.07235373556613922,
-0.03770189732313156,
-0.07767811417579651,
0.02266267128288746,
0.2132357507944107,
0.038789164274930954,
0.11337602138519287,
-0.06510482728481293,
-0.08355726301670074,
-0.006973559036850929,
0.02647239714860916,
0.02294851839542389,
0.08544372767210007,
0.02372632548213005,
-0.03701313957571983,
0.06794285029172897,
0.10198944061994553,
-0.026941437274217606,
0.13183671236038208,
-0.055688269436359406,
-0.08077758550643921,
-0.030794139951467514,
-0.021144118160009384,
-0.02552795223891735,
0.12931475043296814,
-0.029011111706495285,
-0.00022694618382956833,
0.03451074659824371,
0.03820760175585747,
0.011363593861460686,
-0.1691731959581375,
0.0024727655109018087,
0.02831152081489563,
-0.05487552285194397,
-0.039980992674827576,
-0.006924461107701063,
0.020688779652118683,
0.0880577340722084,
0.02970636822283268,
-0.008356808684766293,
0.009212225675582886,
-0.010974249802529812,
-0.05884597450494766,
0.1879332959651947,
-0.09575094282627106,
-0.080895334482193,
-0.07231607288122177,
0.02143101766705513,
-0.05074211210012436,
-0.03896178677678108,
0.00877514760941267,
-0.09268976747989655,
-0.02931782230734825,
-0.08883630484342575,
-0.028582952916622162,
-0.019007939845323563,
0.01927812397480011,
0.02451574243605137,
-0.017123185098171234,
0.0855465903878212,
-0.1378442645072937,
0.00523868203163147,
-0.04866153001785278,
-0.09460098296403885,
0.009407691657543182,
0.07601554691791534,
0.09277903288602829,
0.08220183104276657,
-0.01990104280412197,
0.02780437283217907,
-0.03875650838017464,
0.23239193856716156,
-0.05299743637442589,
0.012914996594190598,
0.11489206552505493,
-0.011213831603527069,
0.05514400452375412,
0.09227815270423889,
0.03845929726958275,
-0.0906972736120224,
0.023702019825577736,
0.07669460773468018,
-0.03837593272328377,
-0.22603102028369904,
-0.018405433744192123,
-0.0009054155671037734,
-0.0764288678765297,
0.10740789026021957,
0.03226090222597122,
-0.049241505563259125,
0.04493759199976921,
0.02339206263422966,
-0.008466150611639023,
-0.04629351198673248,
0.07443497329950333,
0.07106408476829529,
0.05159209296107292,
0.10618256777524948,
-0.005214238073676825,
-0.026585964486002922,
0.05530901998281479,
0.016699524596333504,
0.252509206533432,
-0.04384060949087143,
0.10207997262477875,
0.03114939108490944,
0.15465694665908813,
-0.020263876765966415,
0.06740543246269226,
0.0007595119532197714,
-0.010512019507586956,
-0.011281191371381283,
-0.0657908171415329,
-0.02522103860974312,
0.017828319221735,
-0.04771207645535469,
0.02458762191236019,
-0.07739461213350296,
0.025637304410338402,
0.026363486424088478,
0.2919926345348358,
0.02867516502737999,
-0.2602238357067108,
-0.07317081838846207,
-0.01642894372344017,
-0.045219194144010544,
-0.06157715991139412,
0.007720655761659145,
0.1346818059682846,
-0.13923867046833038,
0.05247126892209053,
-0.07796163111925125,
0.08859840035438538,
-0.045710571110248566,
0.012683862820267677,
0.045837756246328354,
0.15041868388652802,
-0.01840984635055065,
0.05272525176405907,
-0.1999647468328476,
0.251355916261673,
0.020699286833405495,
0.10706555843353271,
-0.06521369516849518,
0.011405936442315578,
0.01912611909210682,
0.011728663928806782,
0.1116592064499855,
0.0033119709696620703,
-0.06535285711288452,
-0.14812178909778595,
-0.09069129824638367,
0.04606449604034424,
0.1404189020395279,
-0.03760482743382454,
0.08927645534276962,
-0.02834326960146427,
0.011807802133262157,
0.030474552884697914,
-0.04082547128200531,
-0.15281443297863007,
-0.07843762636184692,
0.00025196492788381875,
0.015001139603555202,
-0.00660703657194972,
-0.06081756576895714,
-0.1050911396741867,
-0.019677693024277687,
0.11156310886144638,
0.0009344826685264707,
-0.05777885764837265,
-0.15429088473320007,
0.08364039659500122,
0.14088228344917297,
-0.05487058684229851,
0.011781272478401661,
0.017415011301636696,
0.11455637961626053,
0.02895430289208889,
-0.07975535839796066,
0.06498999148607254,
-0.05684075132012367,
-0.18101926147937775,
-0.05600403621792793,
0.1212652325630188,
0.08261659741401672,
0.04970153793692589,
-0.00032814181759022176,
0.05371944606304169,
0.0025543055962771177,
-0.09530635923147202,
0.03937770798802376,
0.0018072595121338964,
0.042633362114429474,
0.01741929166018963,
-0.08403463661670685,
0.09635206311941147,
-0.037674833089113235,
0.010036366060376167,
0.12903524935245514,
0.21257877349853516,
-0.10750790685415268,
0.11382166296243668,
0.08687609434127808,
-0.07383059710264206,
-0.16575828194618225,
0.06131347268819809,
0.1300634890794754,
0.00850024912506342,
0.08653620630502701,
-0.21204183995723724,
0.12412106990814209,
0.10325278341770172,
-0.01254283171147108,
0.006590924225747585,
-0.27652743458747864,
-0.13113215565681458,
0.059739600867033005,
0.11231810599565506,
0.04219629615545273,
-0.11492308974266052,
-0.03297875076532364,
-0.007429268676787615,
-0.1020054891705513,
0.11342855542898178,
-0.07241076976060867,
0.11372975260019302,
-0.019636815413832664,
0.1160908192396164,
0.026013216003775597,
-0.03450511395931244,
0.10838295519351959,
0.06311783939599991,
0.08757010102272034,
-0.03717995807528496,
0.00714905047789216,
0.058996133506298065,
-0.05936184152960777,
0.02842225879430771,
-0.0420050285756588,
0.06737764179706573,
-0.14877693355083466,
0.005952158942818642,
-0.08697351813316345,
0.05402502045035362,
-0.04626080393791199,
-0.07222477346658707,
-0.016985241323709488,
0.05269724503159523,
0.06945329904556274,
-0.041180551052093506,
0.030947452411055565,
-0.0013992772437632084,
0.0994822084903717,
0.10614018142223358,
0.08061618357896805,
-0.02744927443563938,
-0.08684127032756805,
0.01312428992241621,
0.003014507470652461,
0.05517769232392311,
-0.09593547880649567,
0.014105824753642082,
0.14195555448532104,
0.06527986377477646,
0.09617653489112854,
0.04556196182966232,
-0.042803309857845306,
0.004837207030504942,
0.013430070132017136,
-0.13220791518688202,
-0.10387367755174637,
0.02458983100950718,
-0.04231617599725723,
-0.15142682194709778,
0.027362745255231857,
0.1200190857052803,
-0.04125307500362396,
-0.02095073089003563,
-0.005657557863742113,
0.0032286434434354305,
-0.012261949479579926,
0.18416300415992737,
0.045630957931280136,
0.0632660910487175,
-0.0905323326587677,
0.11092531681060791,
0.03469163924455643,
-0.05025535821914673,
0.054036322981119156,
0.06645556539297104,
-0.10447248071432114,
0.010078540071845055,
0.08210084587335587,
0.13151121139526367,
-0.05513964965939522,
-0.012275880202651024,
-0.0963023453950882,
-0.08362074941396713,
0.04106469079852104,
0.13524964451789856,
0.055941566824913025,
-0.002695081988349557,
-0.0648917704820633,
0.03548770397901535,
-0.12027870863676071,
0.06862948834896088,
0.04825175553560257,
0.0762975662946701,
-0.10126758366823196,
0.1340554803609848,
-0.0015350370667874813,
0.02737746387720108,
-0.026659080758690834,
0.013514168560504913,
-0.09966377168893814,
-0.02394731529057026,
-0.10742637515068054,
-0.023688672110438347,
-0.01172523945569992,
-0.000929660105612129,
-0.02279786579310894,
-0.06893757730722427,
-0.028337467461824417,
0.03930240124464035,
-0.07865958660840988,
-0.04783293604850769,
0.017138969153165817,
0.035812605172395706,
-0.15327434241771698,
0.0029041022062301636,
0.025044593960046768,
-0.08919725567102432,
0.08973462134599686,
0.061477553099393845,
0.011942467652261257,
0.024812811985611916,
-0.11935862898826599,
-0.03007989376783371,
-0.009557941928505898,
0.0048003848642110825,
0.06869850307703018,
-0.0927712619304657,
-0.027941865846514702,
-0.035475559532642365,
0.04106893017888069,
0.0194905586540699,
0.10293392837047577,
-0.12043556571006775,
-0.0036968619097024202,
-0.037623897194862366,
-0.03970753028988838,
-0.06540181487798691,
0.036680854856967926,
0.10647016018629074,
0.05226289480924606,
0.15174667537212372,
-0.07682304084300995,
0.055373694747686386,
-0.1994360238313675,
-0.03837583214044571,
0.012146315537393093,
-0.046578045934438705,
-0.08140677958726883,
-0.047969236969947815,
0.0882977768778801,
-0.047105953097343445,
0.11414601653814316,
-0.01256234385073185,
0.10230106115341187,
0.043591685593128204,
-0.011626205407083035,
-0.06463932991027832,
-0.006569638382643461,
0.18235360085964203,
0.05311482399702072,
-0.01785755716264248,
0.12642551958560944,
0.00405671214684844,
0.029927588999271393,
0.08640284836292267,
0.21854513883590698,
0.15430505573749542,
0.000002906289182647015,
0.06261012703180313,
0.06066626310348511,
-0.06802976131439209,
-0.14829115569591522,
0.12009426951408386,
-0.01900664158165455,
0.10616936534643173,
-0.06602416187524796,
0.1891307830810547,
0.0380856953561306,
-0.17965683341026306,
0.06417183578014374,
-0.024851230904459953,
-0.10918393731117249,
-0.11885406821966171,
-0.026542074978351593,
-0.0711381807923317,
-0.1220606118440628,
0.02487204037606716,
-0.11716145277023315,
0.060955774039030075,
0.10585851967334747,
0.0077101788483560085,
0.03810068592429161,
0.18292950093746185,
-0.04680835083127022,
0.013512082397937775,
0.08387923985719681,
0.018366500735282898,
0.003465288784354925,
-0.040830351412296295,
-0.06488717347383499,
0.036450471729040146,
0.03444765508174896,
0.06421016156673431,
-0.048814740031957626,
0.006444588303565979,
0.004559635184705257,
-0.00777240376919508,
-0.07669566571712494,
0.011205854825675488,
0.008900020271539688,
0.05149940401315689,
0.04540235549211502,
0.04821287468075752,
0.00624112319201231,
-0.05547311156988144,
0.291853666305542,
-0.06880838423967361,
-0.06998316198587418,
-0.1289091855287552,
0.21688275039196014,
0.02487240545451641,
-0.02518347091972828,
0.055704351514577866,
-0.08706676959991455,
-0.013114796951413155,
0.16317118704319,
0.13742859661579132,
-0.08980634808540344,
-0.01561492495238781,
-0.023670362308621407,
-0.010929194279015064,
-0.015107672661542892,
0.1156994104385376,
0.07523074001073837,
-0.012933430261909962,
-0.07001171261072159,
-0.011739818379282951,
-0.028181470930576324,
-0.05616436153650284,
-0.06943150609731674,
0.06851431727409363,
0.028787653893232346,
-0.008506163023412228,
-0.06423886120319366,
0.0668281763792038,
0.000672591442707926,
-0.2337201088666916,
0.04187135025858879,
-0.1741233915090561,
-0.1696421504020691,
-0.019399819895625114,
0.07156472653150558,
0.005107779521495104,
0.0560234934091568,
0.0012853244552388787,
0.02419327013194561,
0.11419381946325302,
-0.013803116977214813,
-0.0031981454230844975,
-0.1145891398191452,
0.11447469890117645,
-0.10116930305957794,
0.19979128241539001,
-0.0068757254630327225,
0.0584847554564476,
0.09622032195329666,
0.03795112669467926,
-0.1368536651134491,
0.022777311503887177,
0.0630507692694664,
-0.12616237998008728,
-0.00411027017980814,
0.14796218276023865,
-0.03033408895134926,
0.06309903413057327,
0.027046358212828636,
-0.14667309820652008,
0.001997576327994466,
0.016145315021276474,
-0.03503915295004845,
-0.06921783834695816,
-0.008163757622241974,
-0.04985172674059868,
0.1648663729429245,
0.21542026102542877,
-0.029401766136288643,
0.007565552368760109,
-0.09138200432062149,
0.012111099436879158,
0.04683655500411987,
0.05168560519814491,
-0.040783267468214035,
-0.20542652904987335,
0.015892822295427322,
0.07202686369419098,
-0.005544777028262615,
-0.19626641273498535,
-0.09646060317754745,
0.046842873096466064,
-0.04108588770031929,
-0.04246840253472328,
0.09262087941169739,
0.022225182503461838,
0.041027285158634186,
-0.013576498255133629,
-0.11561396718025208,
-0.022155432030558586,
0.13973359763622284,
-0.1754245162010193,
-0.03289424255490303
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-0
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
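
The `few-shot-k-16 ... seed-0` naming suggests the model was trained on only 16 SQuAD examples drawn with a fixed seed. A minimal sketch of such subsampling, assuming a simple shuffle-and-select procedure (the authors' exact sampling code is not shown here):

```python
# Hypothetical few-shot subsampling: draw k=16 training examples with seed 0.
from datasets import load_dataset

k, seed = 16, 0
squad_train = load_dataset("squad", split="train")
few_shot_train = squad_train.shuffle(seed=seed).select(range(k))
print(len(few_shot_train))  # 16
```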
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-0
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09829450398683548,
0.09709472954273224,
-0.0023748641833662987,
0.09156016260385513,
0.12394271045923233,
0.01850278116762638,
0.09246095269918442,
0.13024857640266418,
-0.0999494343996048,
0.06923040002584457,
0.08870573341846466,
0.03368956968188286,
0.04199942946434021,
0.14071868360042572,
-0.006002838723361492,
-0.2772011160850525,
-0.0002596975537016988,
-0.003706175833940506,
-0.05366656929254532,
0.12008771300315857,
0.08902724832296371,
-0.10974354296922684,
0.07481221109628677,
0.008359597064554691,
-0.15194760262966156,
0.016069604083895683,
-0.030972126871347427,
-0.03493021801114082,
0.12324855476617813,
-0.026495778933167458,
0.10713408142328262,
0.028874337673187256,
0.1353640854358673,
-0.21116790175437927,
0.007429519668221474,
0.07898002862930298,
0.05510524660348892,
0.09838856756687164,
0.0480228066444397,
0.010965046472847462,
0.10092318058013916,
-0.14706513285636902,
0.09601375460624695,
0.030661506578326225,
-0.09041936695575714,
-0.15567126870155334,
-0.08863934129476547,
0.027039121836423874,
0.050301216542720795,
0.07618563622236252,
0.0030996643472462893,
0.13339214026927948,
-0.07225249707698822,
0.08580467104911804,
0.2523339092731476,
-0.3111744821071625,
-0.06795576959848404,
0.024350520223379135,
0.05988123267889023,
0.061243072152137756,
-0.1256113052368164,
-0.00219927285797894,
0.016566995531320572,
0.027417384088039398,
0.12401600182056427,
-0.01151309721171856,
-0.10456912964582443,
-0.010505497455596924,
-0.12198521941900253,
-0.0023653602693229914,
0.061065103858709335,
0.028136085718870163,
-0.04884110763669014,
-0.10835133492946625,
-0.06751483678817749,
-0.08129175752401352,
-0.022792557254433632,
-0.050824593752622604,
0.04658236354589462,
-0.05142145976424217,
-0.09785826504230499,
-0.04327040910720825,
-0.059402044862508774,
-0.08057490736246109,
-0.007277751341462135,
0.17022627592086792,
0.03330274298787117,
0.01982766017317772,
-0.031339868903160095,
0.116045281291008,
0.030325734987854958,
-0.1387307196855545,
-0.012037744745612144,
-0.003799339523538947,
-0.09745847433805466,
-0.0396789014339447,
-0.05566885322332382,
-0.006864475551992655,
0.003276390954852104,
0.1661568135023117,
-0.0729995146393776,
0.07436038553714752,
0.013745120726525784,
-0.026483068242669106,
-0.015286273322999477,
0.15417714416980743,
-0.0422273650765419,
-0.04430839791893959,
-0.0158817358314991,
0.08128239959478378,
-0.0026425018440932035,
-0.022143801674246788,
-0.06397894024848938,
-0.02794981189072132,
0.09480508416891098,
0.05536995083093643,
-0.05749930068850517,
0.03821466118097305,
-0.027234885841608047,
-0.02521583065390587,
0.01716623827815056,
-0.11871006339788437,
0.04023775830864906,
0.0019828379154205322,
-0.07648853957653046,
-0.003357397625222802,
0.0014703919878229499,
-0.010906987823545933,
-0.005273409187793732,
0.1015372946858406,
-0.08632826805114746,
-0.005722635425627232,
-0.0672631561756134,
-0.07850954681634903,
-0.0005475805955938995,
-0.14295704662799835,
-0.009658903814852238,
-0.058331217616796494,
-0.16144798696041107,
-0.03709466755390167,
0.0434134267270565,
-0.07516968250274658,
-0.01489628292620182,
-0.04355931654572487,
-0.06277554482221603,
0.022985920310020447,
-0.012216244824230671,
0.19996675848960876,
-0.050778333097696304,
0.08402664214372635,
-0.010646404698491096,
0.04827409237623215,
0.03059162013232708,
0.0391179695725441,
-0.09678445756435394,
0.026123549789190292,
-0.1342313587665558,
0.0845433846116066,
-0.0853850468993187,
-0.005219138693064451,
-0.13642162084579468,
-0.09844427555799484,
0.007242946419864893,
-0.01893770322203636,
0.08920315653085709,
0.13415637612342834,
-0.19564557075500488,
-0.022524941712617874,
0.12490208446979523,
-0.07416684180498123,
-0.04458237439393997,
0.06338287889957428,
-0.06437040120363235,
0.03737964481115341,
0.05381889268755913,
0.20694419741630554,
0.06055759638547897,
-0.15066947042942047,
-0.005676434841006994,
0.013170144520699978,
0.050663817673921585,
0.03132036700844765,
0.042664382606744766,
0.00023601159045938402,
0.0550103634595871,
0.013542641885578632,
-0.09305332601070404,
-0.02130136266350746,
-0.09120544791221619,
-0.06465001404285431,
-0.04996037110686302,
-0.07480007410049438,
0.0543956384062767,
0.010649765841662884,
0.037880294024944305,
-0.05972130224108696,
-0.10653219372034073,
0.11502379924058914,
0.09978044778108597,
-0.055863380432128906,
0.03906576335430145,
-0.0769185945391655,
0.01116734929382801,
-0.0038733009714633226,
-0.03406559303402901,
-0.21272559463977814,
-0.12396269291639328,
0.04721327871084213,
-0.03601572662591934,
0.020830772817134857,
0.013075326569378376,
0.0851435735821724,
0.05634886026382446,
-0.05296391621232033,
-0.0136136244982481,
-0.09887660294771194,
0.0024126488715410233,
-0.11338751763105392,
-0.1922667771577835,
-0.08635653555393219,
-0.04491455480456352,
0.09727583825588226,
-0.1772645264863968,
-0.009081882424652576,
0.02425404265522957,
0.13262638449668884,
0.026919230818748474,
-0.06831099092960358,
-0.0016476957825943828,
0.04284998029470444,
0.011741232126951218,
-0.09664749354124069,
0.05455455929040909,
0.012735635042190552,
-0.10712400078773499,
-0.0439138300716877,
-0.12687638401985168,
-0.017154661938548088,
0.051978304982185364,
0.05918188765645027,
-0.09949769824743271,
-0.05839407071471214,
-0.07463622838258743,
-0.03769697621464729,
-0.07913235574960709,
0.0162209365516901,
0.2130429446697235,
0.04039758816361427,
0.11060227453708649,
-0.06199667230248451,
-0.08216460794210434,
-0.008113238029181957,
0.030373727902770042,
0.02389976568520069,
0.08985218405723572,
0.02248367667198181,
-0.041860297322273254,
0.06725049018859863,
0.10312562435865402,
-0.021209899336099625,
0.13064205646514893,
-0.055199116468429565,
-0.0843358114361763,
-0.029768045991659164,
-0.018706323578953743,
-0.02536638081073761,
0.12479478865861893,
-0.03747769817709923,
0.0017643835162743926,
0.035305462777614594,
0.040971215814352036,
0.011376630514860153,
-0.16791924834251404,
0.0016571221640333533,
0.03130748122930527,
-0.05632440373301506,
-0.04331963509321213,
-0.004136048723012209,
0.019503355026245117,
0.08694709837436676,
0.03139505907893181,
-0.0014077842934057117,
0.006327543407678604,
-0.013668672181665897,
-0.056768905371427536,
0.19086022675037384,
-0.09435328841209412,
-0.07727131992578506,
-0.07203851640224457,
0.017621172592043877,
-0.04449982941150665,
-0.036529023200273514,
0.006742802448570728,
-0.09314153343439102,
-0.029806287959218025,
-0.08150076866149902,
-0.019861403852701187,
-0.027789393439888954,
0.019784731790423393,
0.02311587706208229,
-0.018420955166220665,
0.07948871701955795,
-0.13686521351337433,
0.007279909215867519,
-0.0483705960214138,
-0.09824427962303162,
0.00371549092233181,
0.07512085884809494,
0.09065482765436172,
0.08492626994848251,
-0.013562934473156929,
0.024452820420265198,
-0.039744358509778976,
0.23294122517108917,
-0.05668725445866585,
0.011268511414527893,
0.11689258366823196,
-0.015008814632892609,
0.051934026181697845,
0.0941966250538826,
0.03795351833105087,
-0.09202925115823746,
0.023349199444055557,
0.07979975640773773,
-0.03764800354838371,
-0.2287396937608719,
-0.015486096031963825,
-0.007026387378573418,
-0.08261334896087646,
0.10243735462427139,
0.032253559678792953,
-0.04969998449087143,
0.04002884402871132,
0.0176689513027668,
-0.010077378712594509,
-0.0399315282702446,
0.06896892189979553,
0.07828939706087112,
0.04766857996582985,
0.10841953754425049,
-0.004754399880766869,
-0.01846490614116192,
0.05555367097258568,
0.014663849957287312,
0.2613676190376282,
-0.041596587747335434,
0.10526077449321747,
0.03339599445462227,
0.14913256466388702,
-0.02110118232667446,
0.06296171993017197,
0.000005264575065666577,
-0.009774130769073963,
-0.011987765319645405,
-0.06228538602590561,
-0.030059903860092163,
0.012834048829972744,
-0.04294207692146301,
0.023014049977064133,
-0.08152934908866882,
0.025812720879912376,
0.02255932427942753,
0.2857562303543091,
0.02885838784277439,
-0.25348055362701416,
-0.07792355120182037,
-0.014467259868979454,
-0.05054715648293495,
-0.06046149134635925,
0.008442101068794727,
0.1377985179424286,
-0.14013265073299408,
0.0456932932138443,
-0.078079454600811,
0.08712978661060333,
-0.050483036786317825,
0.011673958040773869,
0.051252007484436035,
0.14913587272167206,
-0.018083129078149796,
0.0539836585521698,
-0.19543537497520447,
0.2549082934856415,
0.017506862059235573,
0.10370160639286041,
-0.06546631455421448,
0.012878094799816608,
0.02287215180695057,
0.01900799572467804,
0.1151588037610054,
0.0024757892824709415,
-0.07071568071842194,
-0.14554360508918762,
-0.09112237393856049,
0.04720153659582138,
0.14134733378887177,
-0.04586198180913925,
0.08946072310209274,
-0.0371873639523983,
0.012788699008524418,
0.03662547469139099,
-0.03497938811779022,
-0.14704135060310364,
-0.08744435012340546,
-0.0002517788379918784,
0.00768678542226553,
-0.006600075867027044,
-0.0617462582886219,
-0.10525048524141312,
-0.010622957721352577,
0.10517159104347229,
0.0022025874350219965,
-0.054598696529865265,
-0.15850727260112762,
0.08790872246026993,
0.14379872381687164,
-0.05814240500330925,
0.01227859128266573,
0.015289084985852242,
0.11266247183084488,
0.03557859733700752,
-0.07858686149120331,
0.06164899095892906,
-0.06184162572026253,
-0.18080227077007294,
-0.055979035794734955,
0.12299884110689163,
0.08232495188713074,
0.04968076944351196,
-0.0012358573731034994,
0.05160876363515854,
0.0001530001754872501,
-0.09616357833147049,
0.03419682756066322,
0.008977126330137253,
0.03563252463936806,
0.016468103975057602,
-0.08681429177522659,
0.09785570204257965,
-0.036231521517038345,
0.009237932972609997,
0.12988293170928955,
0.20820008218288422,
-0.10587359964847565,
0.11397287249565125,
0.08576204627752304,
-0.07349364459514618,
-0.16757242381572723,
0.060031257569789886,
0.13010361790657043,
0.011916718445718288,
0.08351290971040726,
-0.2135182023048401,
0.12254373729228973,
0.10011760145425797,
-0.011204574257135391,
0.010728719644248486,
-0.2787013649940491,
-0.1289641410112381,
0.058358658105134964,
0.10874481499195099,
0.04116152226924896,
-0.11757323890924454,
-0.03598697856068611,
-0.004132296424359083,
-0.09288942068815231,
0.11226142942905426,
-0.07156509160995483,
0.11553532630205154,
-0.016276372596621513,
0.1110759899020195,
0.025928610935807228,
-0.03070804476737976,
0.10910791158676147,
0.059306807816028595,
0.08000384271144867,
-0.03521817550063133,
0.0078065223060548306,
0.054410967975854874,
-0.056064702570438385,
0.014879070222377777,
-0.043573785573244095,
0.0670648142695427,
-0.15026451647281647,
-0.00017584467423148453,
-0.09225495904684067,
0.05082296207547188,
-0.04901030287146568,
-0.07184378057718277,
-0.015213582664728165,
0.05396602302789688,
0.07536239922046661,
-0.039690084755420685,
0.027745697647333145,
-0.007098501548171043,
0.09872179478406906,
0.09533385932445526,
0.08033160120248795,
-0.016871435567736626,
-0.09274943172931671,
0.011811889708042145,
0.004651019815355539,
0.054785970598459244,
-0.10518359392881393,
0.014331430196762085,
0.13745944201946259,
0.06596256792545319,
0.09558582305908203,
0.04761115461587906,
-0.04030551016330719,
0.0031661950051784515,
0.013401472009718418,
-0.1216253861784935,
-0.11296777427196503,
0.024021228775382042,
-0.045097555965185165,
-0.15430141985416412,
0.020701756700873375,
0.11979275941848755,
-0.04021970555186272,
-0.01736251451075077,
-0.007338900119066238,
0.00549387326464057,
-0.013216356746852398,
0.1844644993543625,
0.04561168700456619,
0.06294871866703033,
-0.08802932500839233,
0.10747586935758591,
0.03625423461198807,
-0.05358898639678955,
0.05076829344034195,
0.06321705877780914,
-0.10338647663593292,
0.008664434775710106,
0.07594799995422363,
0.12500403821468353,
-0.048326801508665085,
-0.009704707190394402,
-0.08851394802331924,
-0.08485639840364456,
0.042230378836393356,
0.1326042264699936,
0.05337066203355789,
0.00009620092896511778,
-0.07080680131912231,
0.041788939386606216,
-0.11838442087173462,
0.07159547507762909,
0.044907476752996445,
0.06960581243038177,
-0.09989722818136215,
0.13058273494243622,
-0.00104982138145715,
0.025753304362297058,
-0.026054617017507553,
0.014590050093829632,
-0.09628821909427643,
-0.024093881249427795,
-0.10677645355463028,
-0.024793002754449844,
-0.00908924825489521,
0.0006698721554130316,
-0.022183286026120186,
-0.07486563920974731,
-0.02737783081829548,
0.038325559347867966,
-0.07603303343057632,
-0.05022304132580757,
0.014235859736800194,
0.040472619235515594,
-0.15180857479572296,
0.0018059647409245372,
0.02957422286272049,
-0.09364920854568481,
0.09121572226285934,
0.06314271688461304,
0.014281290583312511,
0.026756778359413147,
-0.11503738909959793,
-0.02834232896566391,
-0.011533851735293865,
0.005454335827380419,
0.06556101888418198,
-0.09799706935882568,
-0.02698635309934616,
-0.0393732413649559,
0.04502980038523674,
0.017895784229040146,
0.09861602634191513,
-0.11793851107358932,
-0.004356529098004103,
-0.03805458918213844,
-0.04148922488093376,
-0.06260739266872406,
0.035604141652584076,
0.10236096382141113,
0.05456843972206116,
0.14917774498462677,
-0.07396338880062103,
0.059147708117961884,
-0.2014572024345398,
-0.03554864227771759,
0.010959709994494915,
-0.043902330100536346,
-0.08310866355895996,
-0.051653724163770676,
0.0894385278224945,
-0.04475979506969452,
0.1055169478058815,
-0.01977592706680298,
0.11036589741706848,
0.043405961245298386,
-0.010404085740447044,
-0.06077183037996292,
-0.006653591524809599,
0.18825259804725647,
0.05872873216867447,
-0.01642971858382225,
0.13021700084209442,
-0.0009114417480304837,
0.029481718316674232,
0.08620315790176392,
0.22541333734989166,
0.16145873069763184,
0.0018557613948360085,
0.06350503861904144,
0.06005939096212387,
-0.07304099947214127,
-0.15223416686058044,
0.11752483248710632,
-0.020217349752783775,
0.10296063125133514,
-0.06743332743644714,
0.19007304310798645,
0.040206748992204666,
-0.1836308091878891,
0.06469570845365524,
-0.02595364861190319,
-0.11042958498001099,
-0.12247905135154724,
-0.024875793606042862,
-0.06884387135505676,
-0.12045957148075104,
0.023577073588967323,
-0.11714135110378265,
0.06331629306077957,
0.10144171118736267,
0.009307043626904488,
0.03833242505788803,
0.18466633558273315,
-0.045757874846458435,
0.010029518976807594,
0.0831124410033226,
0.020250804722309113,
0.0058133648708462715,
-0.044967085123062134,
-0.06668685376644135,
0.035016581416130066,
0.03314737230539322,
0.06302117556333542,
-0.0508153960108757,
-0.000256241241004318,
0.00927953515201807,
-0.007686547003686428,
-0.07739683240652084,
0.010796301066875458,
0.009850588627159595,
0.05398297682404518,
0.052293162792921066,
0.04580262303352356,
0.006443222053349018,
-0.053517088294029236,
0.2983665466308594,
-0.06966891139745712,
-0.0696927160024643,
-0.12986880540847778,
0.2064570188522339,
0.02238611876964569,
-0.02239811420440674,
0.054006192833185196,
-0.08429910987615585,
-0.01551196537911892,
0.17259633541107178,
0.1326870620250702,
-0.09337177127599716,
-0.015707965940237045,
-0.014484062790870667,
-0.009782547131180763,
-0.013836028054356575,
0.11654354631900787,
0.07643810659646988,
-0.009699365124106407,
-0.068518728017807,
-0.018624652177095413,
-0.02139252796769142,
-0.056236062198877335,
-0.06187541037797928,
0.07042807340621948,
0.025008324533700943,
-0.007475970778614283,
-0.0627015009522438,
0.0689878910779953,
-0.002292902674525976,
-0.24411946535110474,
0.04318203404545784,
-0.17273321747779846,
-0.16982179880142212,
-0.026618897914886475,
0.07267650216817856,
0.005819874815642834,
0.05734208971261978,
0.0024809781461954117,
0.019324084743857384,
0.12289246916770935,
-0.012346675619482994,
-0.0030295527540147305,
-0.1102660521864891,
0.11780577152967453,
-0.08429649472236633,
0.198884055018425,
-0.006909064017236233,
0.052365370094776154,
0.09687831997871399,
0.04159923270344734,
-0.13862600922584534,
0.017034703865647316,
0.06583819538354874,
-0.13056083023548126,
-0.0025255181826651096,
0.14883096516132355,
-0.03315158933401108,
0.0626278892159462,
0.026946749538183212,
-0.15295055508613586,
0.007992030121386051,
0.015190577134490013,
-0.038065824657678604,
-0.06666766107082367,
-0.008258827961981297,
-0.051190972328186035,
0.16701337695121765,
0.21903640031814575,
-0.029040953144431114,
0.004551206715404987,
-0.08952679485082626,
0.01005101203918457,
0.04498061537742615,
0.06344200670719147,
-0.042804013937711716,
-0.20480483770370483,
0.011168856173753738,
0.06455040723085403,
-0.004792249761521816,
-0.19365116953849792,
-0.09895007312297821,
0.053268298506736755,
-0.040050212293863297,
-0.0418836772441864,
0.09559036046266556,
0.019308244809508324,
0.036889854818582535,
-0.011973610147833824,
-0.11847561597824097,
-0.020931990817189217,
0.139374241232872,
-0.1775072067975998,
-0.028783900663256645
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
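
### Usage example

The checkpoint can be queried with the standard `question-answering` pipeline. A minimal sketch, assuming the model is available on the Hub under the repo id shown on this card:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; the repo id matches this model card.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10",
)

# SQuAD-style extractive QA: the answer is a span copied out of the context.
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of SpanBERT trained on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```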
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09905190020799637,
0.09672187268733978,
-0.002308567287400365,
0.09113916009664536,
0.12371189892292023,
0.018570708110928535,
0.09283178299665451,
0.12922386825084686,
-0.0988721027970314,
0.0679488331079483,
0.08751877397298813,
0.034546103328466415,
0.04232078790664673,
0.14203961193561554,
-0.005110401660203934,
-0.27809467911720276,
-0.0005122079164721072,
-0.0030658382456749678,
-0.05198236182332039,
0.11984632909297943,
0.08941325545310974,
-0.10977775603532791,
0.07391153275966644,
0.008298934437334538,
-0.15137608349323273,
0.01652933843433857,
-0.03124571032822132,
-0.03457918018102646,
0.12322663515806198,
-0.028072739019989967,
0.10667021572589874,
0.028655147179961205,
0.13725388050079346,
-0.2101542353630066,
0.007412288337945938,
0.07800254970788956,
0.05490003526210785,
0.09796737134456635,
0.04760441929101944,
0.011842355132102966,
0.10034201294183731,
-0.1480027735233307,
0.09666553139686584,
0.02983228862285614,
-0.08995208144187927,
-0.15367138385772705,
-0.08860158920288086,
0.02787916734814644,
0.05260833725333214,
0.07553872466087341,
0.0026691285893321037,
0.1334342062473297,
-0.07284170389175415,
0.08582080155611038,
0.2520219385623932,
-0.3120526373386383,
-0.06781028211116791,
0.026172081008553505,
0.061671823263168335,
0.06160947307944298,
-0.12586238980293274,
-0.003071038518100977,
0.017084436491131783,
0.02669009007513523,
0.12488999217748642,
-0.012257309630513191,
-0.10446517914533615,
-0.010738803073763847,
-0.12304435670375824,
-0.0014001323143020272,
0.06139446049928665,
0.02889692783355713,
-0.04893844202160835,
-0.10944347828626633,
-0.06743834912776947,
-0.08191707730293274,
-0.023815909400582314,
-0.05194130167365074,
0.04646169766783714,
-0.052019547671079636,
-0.09720813482999802,
-0.043628912419080734,
-0.05878791958093643,
-0.08109930902719498,
-0.005524755921214819,
0.16873784363269806,
0.03373780846595764,
0.018869252875447273,
-0.030773242935538292,
0.11568216979503632,
0.027933329343795776,
-0.13828961551189423,
-0.01107034832239151,
-0.0038772127591073513,
-0.0988752692937851,
-0.04080329090356827,
-0.055968087166547775,
-0.006543343886733055,
0.0020261278841644526,
0.16685150563716888,
-0.07023906707763672,
0.07432443648576736,
0.01565323956310749,
-0.027283716946840286,
-0.014998194761574268,
0.15413036942481995,
-0.043001946061849594,
-0.04642404243350029,
-0.016083428636193275,
0.08212073147296906,
-0.0029915824998170137,
-0.02084578014910221,
-0.06478533893823624,
-0.028520843014121056,
0.09511737525463104,
0.05528028681874275,
-0.05897048860788345,
0.03816872090101242,
-0.026313381269574165,
-0.025109387934207916,
0.018382374197244644,
-0.11935427784919739,
0.04058467224240303,
0.0011103266151621938,
-0.07754604518413544,
-0.0044628954492509365,
0.0001635975349927321,
-0.009920782409608364,
-0.005378659814596176,
0.10033511370420456,
-0.08604622632265091,
-0.005968366749584675,
-0.06765396893024445,
-0.07907713949680328,
-0.0002601974119897932,
-0.14620114862918854,
-0.00835571438074112,
-0.0574115514755249,
-0.16472482681274414,
-0.03779354318976402,
0.0423995703458786,
-0.07430437207221985,
-0.015439524315297604,
-0.04457539692521095,
-0.06257332861423492,
0.02121044136583805,
-0.011560874991118908,
0.20070543885231018,
-0.049727268517017365,
0.08288751542568207,
-0.010634818114340305,
0.04904282093048096,
0.030763698741793633,
0.03940589725971222,
-0.0963360145688057,
0.025973303243517876,
-0.1334296613931656,
0.08444372564554214,
-0.08566176891326904,
-0.0034011623356491327,
-0.13613161444664001,
-0.09855511784553528,
0.0057029202580451965,
-0.018852483481168747,
0.08887152373790741,
0.13413459062576294,
-0.19654090702533722,
-0.021441657096147537,
0.12618131935596466,
-0.07370045781135559,
-0.04385778680443764,
0.062271229922771454,
-0.06470878422260284,
0.03868439793586731,
0.05543334409594536,
0.20643772184848785,
0.06290427595376968,
-0.14994601905345917,
-0.006269671022891998,
0.013197156600654125,
0.05049832910299301,
0.02936328761279583,
0.04273976758122444,
0.001690227654762566,
0.05618786811828613,
0.013688723556697369,
-0.09232575446367264,
-0.021419325843453407,
-0.09042590111494064,
-0.06506277620792389,
-0.04956863075494766,
-0.07571369409561157,
0.05482928827404976,
0.011100457981228828,
0.03823015093803406,
-0.05985606461763382,
-0.1062893271446228,
0.1157502681016922,
0.10038284212350845,
-0.055714741349220276,
0.03779250383377075,
-0.07677533477544785,
0.010669605806469917,
-0.004774056375026703,
-0.0341065376996994,
-0.21256454288959503,
-0.12239545583724976,
0.04754810035228729,
-0.03653966262936592,
0.02081344462931156,
0.014930110424757004,
0.08573820441961288,
0.05600736290216446,
-0.05261945351958275,
-0.01410958543419838,
-0.09884103387594223,
0.0022882993798702955,
-0.11442643404006958,
-0.19070853292942047,
-0.08732330054044724,
-0.045429814606904984,
0.09796604514122009,
-0.17837268114089966,
-0.008159150369465351,
0.02232561819255352,
0.1321365237236023,
0.02611670084297657,
-0.06840628385543823,
-0.0006705896812491119,
0.04208756610751152,
0.012520440854132175,
-0.09690681099891663,
0.054271336644887924,
0.011492564342916012,
-0.10674450546503067,
-0.04557975009083748,
-0.12800739705562592,
-0.018751226365566254,
0.05123131722211838,
0.061266325414180756,
-0.09919378906488419,
-0.0588282011449337,
-0.07446194440126419,
-0.03708307817578316,
-0.07799698412418365,
0.01562667079269886,
0.2120368629693985,
0.039464764297008514,
0.10998643189668655,
-0.062018830329179764,
-0.08319641649723053,
-0.00816697720438242,
0.0318160355091095,
0.024551674723625183,
0.089558444917202,
0.022509831935167313,
-0.04192829504609108,
0.066655732691288,
0.10417022556066513,
-0.020829761400818825,
0.12990233302116394,
-0.055325306951999664,
-0.08507811278104782,
-0.029460135847330093,
-0.019092092290520668,
-0.026438774541020393,
0.12447045743465424,
-0.03777140751481056,
0.00006754631613148376,
0.03485535457730293,
0.039643026888370514,
0.011381191201508045,
-0.16822078824043274,
0.0019185889977961779,
0.03124500811100006,
-0.05543772503733635,
-0.04483092203736305,
-0.0050491453148424625,
0.018339211121201515,
0.08629685640335083,
0.030697578564286232,
-0.002034191507846117,
0.0057583944872021675,
-0.01337359193712473,
-0.05628112703561783,
0.190881609916687,
-0.09331563860177994,
-0.07598163932561874,
-0.07138344645500183,
0.018126387149095535,
-0.04267722740769386,
-0.03660906106233597,
0.005838521756231785,
-0.0930553525686264,
-0.029315708205103874,
-0.08096598088741302,
-0.02071364037692547,
-0.027730733156204224,
0.019152510911226273,
0.024467570707201958,
-0.01787017472088337,
0.07775680720806122,
-0.13691601157188416,
0.007932775653898716,
-0.048832569271326065,
-0.09787725657224655,
0.003136038314551115,
0.0742737203836441,
0.09099826961755753,
0.08525702357292175,
-0.014191159047186375,
0.024502119049429893,
-0.04028560593724251,
0.2321888953447342,
-0.057094864547252655,
0.012766197323799133,
0.1167280524969101,
-0.014516984112560749,
0.05171697959303856,
0.09481847286224365,
0.03724338114261627,
-0.09170126914978027,
0.02294207736849785,
0.0791039913892746,
-0.037151213735342026,
-0.22941620647907257,
-0.014876016415655613,
-0.006293329875916243,
-0.0839228704571724,
0.10306593030691147,
0.03196805715560913,
-0.04778992757201195,
0.04175179451704025,
0.017496243119239807,
-0.008604521863162518,
-0.03879190608859062,
0.0686781033873558,
0.07615304738283157,
0.046899497509002686,
0.10851555317640305,
-0.004981684032827616,
-0.019551098346710205,
0.054116055369377136,
0.015448777005076408,
0.2620692849159241,
-0.040438130497932434,
0.10504212230443954,
0.03244926035404205,
0.14849744737148285,
-0.021872978657484055,
0.06531880795955658,
0.000862426939420402,
-0.010072259232401848,
-0.011919713579118252,
-0.062019526958465576,
-0.028643561527132988,
0.013359662145376205,
-0.04257826507091522,
0.02263507805764675,
-0.0808996632695198,
0.025328613817691803,
0.021968262270092964,
0.28446272015571594,
0.030253950506448746,
-0.25434356927871704,
-0.0772809237241745,
-0.01416116114705801,
-0.05158686637878418,
-0.06013346090912819,
0.008076678030192852,
0.13658173382282257,
-0.13978666067123413,
0.046515461057424545,
-0.0781528502702713,
0.08730430901050568,
-0.04958898574113846,
0.011871067807078362,
0.05005718767642975,
0.1489803045988083,
-0.017802704125642776,
0.05514686927199364,
-0.19550658762454987,
0.25360798835754395,
0.017554165795445442,
0.10487637668848038,
-0.06642717868089676,
0.012915405444800854,
0.02286188304424286,
0.018426328897476196,
0.11632589995861053,
0.0021777169313281775,
-0.07072801142930984,
-0.14609883725643158,
-0.09133245795965195,
0.047144677489995956,
0.14158709347248077,
-0.045425258576869965,
0.09001480787992477,
-0.03617994859814644,
0.011750038713216782,
0.03679827228188515,
-0.03641660884022713,
-0.14828583598136902,
-0.08704125136137009,
-0.0005847944994457066,
0.006654342170804739,
-0.006677471566945314,
-0.061197370290756226,
-0.10518432408571243,
-0.01281527616083622,
0.10365201532840729,
0.0033903378061950207,
-0.05470350757241249,
-0.1583169847726822,
0.0892893522977829,
0.14422070980072021,
-0.05786735191941261,
0.012981127016246319,
0.016663027927279472,
0.11325114965438843,
0.035028401762247086,
-0.07825271785259247,
0.06091758981347084,
-0.06189608573913574,
-0.17965103685855865,
-0.055012304335832596,
0.12417779117822647,
0.08268878608942032,
0.04980418458580971,
0.000221052163396962,
0.05090276896953583,
0.000007093598469509743,
-0.09591152518987656,
0.03373072296380997,
0.008624764159321785,
0.03514542058110237,
0.016664257273077965,
-0.08786129206418991,
0.09648751467466354,
-0.03683273121714592,
0.011294793337583542,
0.13005536794662476,
0.2052704244852066,
-0.10565926879644394,
0.1132005825638771,
0.0856204554438591,
-0.073916956782341,
-0.16712647676467896,
0.06058277562260628,
0.13002005219459534,
0.012460633181035519,
0.08394139260053635,
-0.21319080889225006,
0.12280736118555069,
0.09973977506160736,
-0.01062037330120802,
0.009707430377602577,
-0.27869805693626404,
-0.1284739077091217,
0.059446629136800766,
0.10899554193019867,
0.039004381746053696,
-0.1170150563120842,
-0.03586496040225029,
-0.004679826553910971,
-0.092837855219841,
0.11180508136749268,
-0.07267426699399948,
0.11470580101013184,
-0.015711644664406776,
0.1102064847946167,
0.025844993069767952,
-0.030806338414549828,
0.10744432359933853,
0.06114459037780762,
0.08012449741363525,
-0.034865815192461014,
0.006751248147338629,
0.05667708441615105,
-0.05572142079472542,
0.01668565534055233,
-0.04248623922467232,
0.06667105853557587,
-0.15010255575180054,
-0.0007267981418408453,
-0.09171424061059952,
0.05067215487360954,
-0.04863933101296425,
-0.07203927636146545,
-0.014626316726207733,
0.05382027104496956,
0.07495024055242538,
-0.03974276781082153,
0.025727884843945503,
-0.005726304370909929,
0.09792207926511765,
0.09162913262844086,
0.08136750757694244,
-0.015934964641928673,
-0.09185373038053513,
0.011708681471645832,
0.004758972208946943,
0.05420459434390068,
-0.10575717687606812,
0.013728884980082512,
0.13749098777770996,
0.06643309444189072,
0.09574884921312332,
0.04701332002878189,
-0.04045119881629944,
0.002623134758323431,
0.012792456895112991,
-0.11947020143270493,
-0.11411251872777939,
0.023795103654265404,
-0.0460355319082737,
-0.15472574532032013,
0.022143667563796043,
0.11790020018815994,
-0.040989458560943604,
-0.01840546354651451,
-0.008459481410682201,
0.005168788135051727,
-0.012770931236445904,
0.18580836057662964,
0.0463857427239418,
0.06310942769050598,
-0.08844897896051407,
0.10699297487735748,
0.03652242571115494,
-0.05397559702396393,
0.050387684255838394,
0.06317564100027084,
-0.10404932498931885,
0.008137676864862442,
0.0770280584692955,
0.12527704238891602,
-0.04580064117908478,
-0.00995266530662775,
-0.08834725618362427,
-0.08474064618349075,
0.042477644979953766,
0.13233840465545654,
0.05362354964017868,
-0.0013492349535226822,
-0.07026789337396622,
0.04220564663410187,
-0.11885581910610199,
0.07158105820417404,
0.04500380903482437,
0.06963962316513062,
-0.10017141699790955,
0.13080808520317078,
-0.0009681586525402963,
0.026360251009464264,
-0.026013990864157677,
0.015572191216051579,
-0.09556484967470169,
-0.02458721399307251,
-0.10527082532644272,
-0.026151832193136215,
-0.010467088781297207,
0.000749311177060008,
-0.022846393287181854,
-0.07434248179197311,
-0.027056977152824402,
0.03796011954545975,
-0.07590967416763306,
-0.050466567277908325,
0.013740815222263336,
0.039538055658340454,
-0.15095621347427368,
0.001794531592167914,
0.02876191772520542,
-0.09302374720573425,
0.09050130099058151,
0.06246037781238556,
0.015139018185436726,
0.0274569783359766,
-0.11445498466491699,
-0.027686424553394318,
-0.010635764338076115,
0.00575343007221818,
0.06511826068162918,
-0.09758742898702621,
-0.026492241770029068,
-0.0391431488096714,
0.046133581548929214,
0.017046380788087845,
0.09590905904769897,
-0.11706913262605667,
-0.005326858256012201,
-0.03935597836971283,
-0.04107404872775078,
-0.06301794201135635,
0.03597906976938248,
0.10180219262838364,
0.05306757986545563,
0.14964501559734344,
-0.07228148728609085,
0.05912122502923012,
-0.20204107463359833,
-0.03608199581503868,
0.010167472064495087,
-0.043991126120090485,
-0.0828094333410263,
-0.05225260183215141,
0.08947991579771042,
-0.04442485421895981,
0.10649401694536209,
-0.019645914435386658,
0.11139438301324844,
0.04252558946609497,
-0.007423103787004948,
-0.06051633507013321,
-0.006061384454369545,
0.18859274685382843,
0.05887666344642639,
-0.016914915293455124,
0.12924323976039886,
-0.00021257383923511952,
0.029780777171254158,
0.08464707434177399,
0.2228250652551651,
0.16155843436717987,
0.00103598868008703,
0.0634494423866272,
0.06084176525473595,
-0.07326152175664902,
-0.15119169652462006,
0.11851844191551208,
-0.020690811797976494,
0.10117433220148087,
-0.06725634634494781,
0.19233669340610504,
0.039768658578395844,
-0.18351665139198303,
0.06525677442550659,
-0.025101354345679283,
-0.1112140491604805,
-0.12126775085926056,
-0.02637629024684429,
-0.06911548972129822,
-0.11924780905246735,
0.02382974699139595,
-0.11715563386678696,
0.06183723360300064,
0.10163106769323349,
0.009146388620138168,
0.0376179963350296,
0.18593940138816833,
-0.046456653624773026,
0.01032247394323349,
0.08317695558071136,
0.01987299509346485,
0.006356673780828714,
-0.045012038201093674,
-0.06578796356916428,
0.036107271909713745,
0.03237413242459297,
0.06364588439464569,
-0.05302391201257706,
-0.0007733333623036742,
0.009122326970100403,
-0.0069920518435537815,
-0.07681462913751602,
0.010949330404400826,
0.009610598906874657,
0.053987979888916016,
0.05126318335533142,
0.046026211231946945,
0.005815473385155201,
-0.05394791066646576,
0.29701367020606995,
-0.0696331337094307,
-0.0704881101846695,
-0.1299072802066803,
0.20403119921684265,
0.023532526567578316,
-0.022497203201055527,
0.05470840632915497,
-0.08471249788999557,
-0.013161222450435162,
0.17337250709533691,
0.13278770446777344,
-0.09102210402488708,
-0.01622902974486351,
-0.014157959260046482,
-0.010064804926514626,
-0.014417540282011032,
0.11602114140987396,
0.07695157080888748,
-0.011874649673700333,
-0.06829604506492615,
-0.017875809222459793,
-0.019988730549812317,
-0.05725551396608353,
-0.062072787433862686,
0.0701625719666481,
0.02595706656575203,
-0.007879190146923065,
-0.060827549546957016,
0.070585697889328,
-0.000136919945362024,
-0.2440619319677353,
0.04194194823503494,
-0.17167872190475464,
-0.16984888911247253,
-0.02717496082186699,
0.07253733277320862,
0.007549970876425505,
0.057206351310014725,
0.0027079638093709946,
0.019592124968767166,
0.12222655117511749,
-0.01163344644010067,
-0.0037626565899699926,
-0.11031034588813782,
0.11790911108255386,
-0.08566615730524063,
0.19783663749694824,
-0.007092002779245377,
0.05332167074084282,
0.09676486998796463,
0.03991116210818291,
-0.13851645588874817,
0.017577780410647392,
0.06591065227985382,
-0.12877033650875092,
-0.0009135424043051898,
0.14941063523292542,
-0.03291615843772888,
0.060360297560691833,
0.025975840166211128,
-0.15319111943244934,
0.008262790739536285,
0.015546678565442562,
-0.03750016912817955,
-0.0674193799495697,
-0.005841284990310669,
-0.05052795261144638,
0.16774800419807434,
0.21883414685726166,
-0.029470285400748253,
0.005071335472166538,
-0.09007856994867325,
0.009423218667507172,
0.045294877141714096,
0.06348373740911484,
-0.042685702443122864,
-0.20429302752017975,
0.010050295852124691,
0.06224707141518593,
-0.004155866336077452,
-0.19273461401462555,
-0.09783826023340225,
0.05211857706308365,
-0.041315168142318726,
-0.042065221816301346,
0.09472043067216873,
0.021038629114627838,
0.03681444004178047,
-0.011712606996297836,
-0.11901866644620895,
-0.021224448457360268,
0.13941213488578796,
-0.17845295369625092,
-0.02798205055296421
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-2
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
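
### Reproducing the configuration

The hyperparameters above map directly onto `TrainingArguments`. The exact training script is not shown on this card, so treat the following as a hedged reconstruction rather than the author's code; `output_dir` is a hypothetical placeholder:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in this card.
args = TrainingArguments(
    output_dir="spanbert-squad-k16-seed-2",  # hypothetical path, not from the card
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,   # 10% of steps spent in linear warmup
    max_steps=200,      # training_steps: 200
)
```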
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-2
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09767517447471619,
0.09746559709310532,
-0.002414519200101495,
0.09147490561008453,
0.1232576072216034,
0.019031671807169914,
0.09287071973085403,
0.12994502484798431,
-0.09946189075708389,
0.06850350648164749,
0.08814138919115067,
0.03304193913936615,
0.04187311604619026,
0.14106465876102448,
-0.005284883081912994,
-0.27831655740737915,
-0.0006054692785255611,
-0.004678011871874332,
-0.05404416844248772,
0.11982809752225876,
0.0902014970779419,
-0.10876557230949402,
0.07441965490579605,
0.008786770515143871,
-0.15217870473861694,
0.016359390690922737,
-0.030234873294830322,
-0.033843379467725754,
0.1231190487742424,
-0.027251090854406357,
0.1072205901145935,
0.02978697046637535,
0.13703957200050354,
-0.2108667641878128,
0.007069655694067478,
0.07729741185903549,
0.055350176990032196,
0.09837166219949722,
0.04830604046583176,
0.011894596740603447,
0.09992241114377975,
-0.14726592600345612,
0.09566176682710648,
0.030082330107688904,
-0.09000758826732635,
-0.15466170012950897,
-0.0893256813287735,
0.027981124818325043,
0.05261367931962013,
0.07427547872066498,
0.003609908977523446,
0.13203908503055573,
-0.07232417911291122,
0.08540047705173492,
0.25202423334121704,
-0.3129979968070984,
-0.06810466945171356,
0.024748871102929115,
0.06038178130984306,
0.06247555464506149,
-0.12409325689077377,
-0.002302836859598756,
0.017017319798469543,
0.026967978104948997,
0.12379835546016693,
-0.011802795343101025,
-0.1038798838853836,
-0.010674149729311466,
-0.12305869162082672,
-0.0033283482771366835,
0.06158335134387016,
0.028281059116125107,
-0.05004773661494255,
-0.10739240050315857,
-0.06809449195861816,
-0.07956640422344208,
-0.02224547602236271,
-0.05261867493391037,
0.0467626117169857,
-0.050578970462083817,
-0.09693349897861481,
-0.045871712267398834,
-0.060202017426490784,
-0.08138945698738098,
-0.006545725744217634,
0.17045725882053375,
0.03370213508605957,
0.020507821813225746,
-0.030452163890004158,
0.11705376207828522,
0.029727991670370102,
-0.1380399763584137,
-0.011523211374878883,
-0.0033561289310455322,
-0.09801459312438965,
-0.04026927426457405,
-0.05587223544716835,
-0.005890472326427698,
0.002801676047965884,
0.1657188981771469,
-0.07246245443820953,
0.07341940701007843,
0.0139244319871068,
-0.026052208617329597,
-0.01495354063808918,
0.15410858392715454,
-0.04095812141895294,
-0.043253421783447266,
-0.015745051205158234,
0.08137736469507217,
-0.0020122486166656017,
-0.021154070273041725,
-0.06486879289150238,
-0.029238924384117126,
0.09577637165784836,
0.056631896644830704,
-0.05856839567422867,
0.03602583706378937,
-0.028014689683914185,
-0.025205427780747414,
0.01666426472365856,
-0.1194302961230278,
0.0403890386223793,
0.0016125717666000128,
-0.07598321884870529,
-0.004768162965774536,
0.001042460440658033,
-0.009583603590726852,
-0.004248383920639753,
0.09946845471858978,
-0.08475526422262192,
-0.004817180801182985,
-0.06623230129480362,
-0.07707583159208298,
-0.0003756083606276661,
-0.1430344581604004,
-0.009403804317116737,
-0.058138273656368256,
-0.16216090321540833,
-0.03566624969244003,
0.04344026744365692,
-0.07601352035999298,
-0.017853664234280586,
-0.043468523770570755,
-0.06161192059516907,
0.022498808801174164,
-0.012272704392671585,
0.20031332969665527,
-0.04994348809123039,
0.08408749848604202,
-0.010440167970955372,
0.04906930774450302,
0.03057022951543331,
0.03835300728678703,
-0.09525474160909653,
0.02718832530081272,
-0.1344156712293625,
0.08492841571569443,
-0.08441213518381119,
-0.006750765256583691,
-0.13745325803756714,
-0.09803637117147446,
0.0075060417875647545,
-0.019804906100034714,
0.08821175247430801,
0.13392284512519836,
-0.19528646767139435,
-0.021105164662003517,
0.12469983845949173,
-0.07542919367551804,
-0.04376882314682007,
0.06371378898620605,
-0.06460591405630112,
0.04041450843214989,
0.05316004529595375,
0.2066390961408615,
0.061900969594717026,
-0.1504402905702591,
-0.0038519068621098995,
0.01479131169617176,
0.049930550158023834,
0.031415458768606186,
0.04456731677055359,
0.0001227383327204734,
0.05371101573109627,
0.013220744207501411,
-0.0947403758764267,
-0.021621044725179672,
-0.09137845784425735,
-0.06583380699157715,
-0.049964871257543564,
-0.07518991827964783,
0.056083209812641144,
0.008705985732376575,
0.038602571934461594,
-0.05917385220527649,
-0.1052604392170906,
0.11484326422214508,
0.10070924460887909,
-0.05534353107213974,
0.03945938125252724,
-0.07725948840379715,
0.01069133635610342,
-0.0056258090771734715,
-0.03486586734652519,
-0.21217244863510132,
-0.12351903319358826,
0.048366714268922806,
-0.036855246871709824,
0.020763272419571877,
0.015229681506752968,
0.08479376882314682,
0.0563860647380352,
-0.05283806100487709,
-0.013970814645290375,
-0.0991721823811531,
0.0018562624463811517,
-0.11388025432825089,
-0.1917458027601242,
-0.08698432147502899,
-0.04529913514852524,
0.09830459952354431,
-0.1774325966835022,
-0.009364615194499493,
0.023697692900896072,
0.1323537975549698,
0.02652127668261528,
-0.06778556853532791,
-0.0022329255007207394,
0.04102623462677002,
0.011763060465455055,
-0.09614937752485275,
0.05434372276067734,
0.012769201770424843,
-0.1077253669500351,
-0.0434412881731987,
-0.1257452517747879,
-0.01790526509284973,
0.0499362088739872,
0.0601787343621254,
-0.09920895099639893,
-0.05916869640350342,
-0.07480815798044205,
-0.037964072078466415,
-0.07922609895467758,
0.015712810680270195,
0.21199199557304382,
0.03921155259013176,
0.1104087084531784,
-0.062411606311798096,
-0.08249294012784958,
-0.008454094640910625,
0.029348697513341904,
0.023384351283311844,
0.08911649882793427,
0.022396672517061234,
-0.04364091157913208,
0.06567355245351791,
0.10475748777389526,
-0.02104610577225685,
0.12904979288578033,
-0.05533178150653839,
-0.08475575596094131,
-0.0312055591493845,
-0.01664886809885502,
-0.02564341574907303,
0.12390795350074768,
-0.03646773844957352,
0.002265848917886615,
0.0349728949368,
0.04093068465590477,
0.011187366209924221,
-0.1692870855331421,
0.0018208069959655404,
0.031462784856557846,
-0.05748879164457321,
-0.042143531143665314,
-0.005058724898844957,
0.019352605566382408,
0.08686964958906174,
0.0315542072057724,
-0.0018260431243106723,
0.007590583059936762,
-0.01370408944785595,
-0.057461224496364594,
0.18980170786380768,
-0.09326229244470596,
-0.07732626795768738,
-0.07326111942529678,
0.017854776233434677,
-0.04330899566411972,
-0.03619774803519249,
0.006642848253250122,
-0.09191552549600601,
-0.02900133840739727,
-0.0813523605465889,
-0.01932356506586075,
-0.02896801196038723,
0.020343439653515816,
0.025350375100970268,
-0.01798841916024685,
0.08044349402189255,
-0.13616858422756195,
0.007478958461433649,
-0.048439230769872665,
-0.09923508763313293,
0.0038379787001758814,
0.07512275129556656,
0.0903155580163002,
0.08445531129837036,
-0.013256911188364029,
0.024363238364458084,
-0.03933253139257431,
0.23324468731880188,
-0.055834896862506866,
0.0116911381483078,
0.11738152801990509,
-0.01655956730246544,
0.052573222666978836,
0.09450887143611908,
0.03689496964216232,
-0.09174003452062607,
0.023319777101278305,
0.07855229824781418,
-0.03770880028605461,
-0.22917470335960388,
-0.015610589645802975,
-0.005672270432114601,
-0.08346566557884216,
0.10254346579313278,
0.032031379640102386,
-0.0517800934612751,
0.04027615487575531,
0.018242737278342247,
-0.00980332400649786,
-0.03914034366607666,
0.06881441175937653,
0.07771365344524384,
0.047014910727739334,
0.10807755589485168,
-0.004525963682681322,
-0.019538942724466324,
0.05589298903942108,
0.016141610220074654,
0.26148512959480286,
-0.04058017209172249,
0.10424721240997314,
0.03311774134635925,
0.1500452607870102,
-0.021644746884703636,
0.06317207962274551,
0.0010605431161820889,
-0.009349608793854713,
-0.012643855065107346,
-0.062099043279886246,
-0.02947157993912697,
0.014241301454603672,
-0.041753821074962616,
0.02285785786807537,
-0.08201282471418381,
0.02795291692018509,
0.02181049808859825,
0.28607386350631714,
0.030156666412949562,
-0.2521117925643921,
-0.07676403224468231,
-0.013274069875478745,
-0.05134258046746254,
-0.0596257708966732,
0.008099845610558987,
0.13845352828502655,
-0.1408270001411438,
0.045010168105363846,
-0.07790496200323105,
0.08640354126691818,
-0.050328101962804794,
0.011218471452593803,
0.049425818026065826,
0.1478804647922516,
-0.01664663664996624,
0.05531163886189461,
-0.19360150396823883,
0.2542085647583008,
0.017339108511805534,
0.10302307456731796,
-0.06473252922296524,
0.013232012279331684,
0.022071395069360733,
0.0186212919652462,
0.11653679609298706,
0.0031092579010874033,
-0.07201338559389114,
-0.14569827914237976,
-0.09270196408033371,
0.046775758266448975,
0.14234189689159393,
-0.04693707823753357,
0.08956265449523926,
-0.03712388500571251,
0.01271600741893053,
0.03632868453860283,
-0.03435339406132698,
-0.14732587337493896,
-0.08640291541814804,
-0.00021242169896140695,
0.005442412104457617,
-0.0072487820871174335,
-0.06245797500014305,
-0.1053382158279419,
-0.009170006029307842,
0.10596908628940582,
0.0027898848056793213,
-0.05495688319206238,
-0.1575707495212555,
0.08873937278985977,
0.14365500211715698,
-0.05881080776453018,
0.01197521761059761,
0.016157234087586403,
0.11357785016298294,
0.03530099615454674,
-0.07840648293495178,
0.061020515859127045,
-0.06158503144979477,
-0.18167665600776672,
-0.05528264865279198,
0.1249701976776123,
0.08209406584501266,
0.05012805014848709,
0.00008133075607474893,
0.05069879814982414,
0.0013251908821985126,
-0.0957605242729187,
0.035103537142276764,
0.009365266188979149,
0.03464630991220474,
0.01675669476389885,
-0.08749911934137344,
0.09948499500751495,
-0.03654244914650917,
0.009326622821390629,
0.13186600804328918,
0.20979924499988556,
-0.10629290342330933,
0.11529436707496643,
0.08506366610527039,
-0.07431749999523163,
-0.16699320077896118,
0.05847395211458206,
0.1316998451948166,
0.011409093625843525,
0.0856715738773346,
-0.2133738398551941,
0.12217055261135101,
0.0996171087026596,
-0.012029821053147316,
0.008600123226642609,
-0.2798539102077484,
-0.1286894530057907,
0.05785120651125908,
0.10867099463939667,
0.042660053819417953,
-0.11655481159687042,
-0.03689087927341461,
-0.003466773545369506,
-0.09335026890039444,
0.11137363314628601,
-0.07062895596027374,
0.11506800353527069,
-0.015903789550065994,
0.11117231845855713,
0.02607172355055809,
-0.030047809705138206,
0.1101183295249939,
0.059468239545822144,
0.07845607399940491,
-0.03468313813209534,
0.007317188195884228,
0.054570358246564865,
-0.056537926197052,
0.015427074395120144,
-0.041977833956480026,
0.06721397489309311,
-0.15066921710968018,
-0.0008089604671113193,
-0.09124428033828735,
0.050153348594903946,
-0.049526944756507874,
-0.07172189652919769,
-0.014873581938445568,
0.05330809950828552,
0.07534712553024292,
-0.03938833251595497,
0.025559071451425552,
-0.004947112873196602,
0.09628228843212128,
0.09619705379009247,
0.08033112436532974,
-0.014312883839011192,
-0.0922495648264885,
0.010254347696900368,
0.004376656841486692,
0.054233744740486145,
-0.10472693294286728,
0.015192892402410507,
0.1365729719400406,
0.06529174745082855,
0.09595822542905807,
0.04707418754696846,
-0.04124007374048233,
0.003911754582077265,
0.012889103032648563,
-0.12128154933452606,
-0.11510009318590164,
0.023207878693938255,
-0.0451493114233017,
-0.15516875684261322,
0.020134156569838524,
0.12085485458374023,
-0.03982219099998474,
-0.01813233457505703,
-0.008041119202971458,
0.006123912986367941,
-0.013553675264120102,
0.1838310807943344,
0.04517478495836258,
0.06349656730890274,
-0.08701585233211517,
0.10706585645675659,
0.03651930019259453,
-0.05220508947968483,
0.05002432316541672,
0.06254434585571289,
-0.10321350395679474,
0.00906454585492611,
0.07614775747060776,
0.12451379746198654,
-0.04855605587363243,
-0.00939863920211792,
-0.08833841234445572,
-0.08494626730680466,
0.04151534289121628,
0.13101133704185486,
0.05437002703547478,
-0.00023591023636981845,
-0.07022947072982788,
0.042075008153915405,
-0.11751288175582886,
0.07188892364501953,
0.04572947695851326,
0.06981498003005981,
-0.10118606686592102,
0.13080771267414093,
-0.0016127537237480283,
0.028068413957953453,
-0.026283282786607742,
0.014564690180122852,
-0.09536822885274887,
-0.024646325036883354,
-0.10724428296089172,
-0.023970995098352432,
-0.008556724525988102,
0.0005988482153043151,
-0.021744053810834885,
-0.07545915246009827,
-0.02704945020377636,
0.038878824561834335,
-0.07583737373352051,
-0.050453219562768936,
0.01309671625494957,
0.04019881412386894,
-0.15094207227230072,
0.0009465691982768476,
0.030208786949515343,
-0.09396444261074066,
0.09214918315410614,
0.06314525008201599,
0.015281811356544495,
0.026986217126250267,
-0.11302629113197327,
-0.028077587485313416,
-0.010507244616746902,
0.005631625186651945,
0.06479823589324951,
-0.09910949319601059,
-0.027796415612101555,
-0.0388321578502655,
0.045595962554216385,
0.017420941963791847,
0.09934215992689133,
-0.11784880608320236,
-0.004746938124299049,
-0.03943708539009094,
-0.04268168285489082,
-0.062767393887043,
0.034888915717601776,
0.10102199018001556,
0.05548173561692238,
0.14941473305225372,
-0.07376062870025635,
0.05950849875807762,
-0.20145194232463837,
-0.03555154800415039,
0.0101970499381423,
-0.04194125533103943,
-0.08353061228990555,
-0.052260156720876694,
0.08819730579853058,
-0.04457792267203331,
0.10534536838531494,
-0.020105499774217606,
0.10955306142568588,
0.042912956327199936,
-0.009828321635723114,
-0.058828648179769516,
-0.006429313216358423,
0.18743619322776794,
0.058711372315883636,
-0.016251027584075928,
0.12943702936172485,
-0.0021340183448046446,
0.030377401039004326,
0.084166020154953,
0.22502051293849945,
0.16137537360191345,
0.0013945306418463588,
0.06356174498796463,
0.06072181463241577,
-0.07242655754089355,
-0.15318815410137177,
0.1172000840306282,
-0.019946111366152763,
0.10145141184329987,
-0.06601681560277939,
0.19040578603744507,
0.040506839752197266,
-0.1842053085565567,
0.06347029656171799,
-0.025184230878949165,
-0.1110842153429985,
-0.1229851245880127,
-0.024924684315919876,
-0.07001162320375443,
-0.11990845203399658,
0.02357054315507412,
-0.1169709712266922,
0.06310713291168213,
0.10109860450029373,
0.008138549514114857,
0.038486234843730927,
0.18304234743118286,
-0.04587595537304878,
0.01039949432015419,
0.0820607990026474,
0.01995987258851528,
0.007182114291936159,
-0.04305307939648628,
-0.06680411100387573,
0.03537696972489357,
0.03458967059850693,
0.0628502294421196,
-0.05138158053159714,
0.000583076209295541,
0.008025672286748886,
-0.008090375922620296,
-0.077300526201725,
0.010651013813912868,
0.00964298564940691,
0.05392655357718468,
0.05017358809709549,
0.04634876921772957,
0.006260526832193136,
-0.05334712937474251,
0.29842409491539,
-0.06941768527030945,
-0.06999623030424118,
-0.12971076369285583,
0.20618799328804016,
0.021450847387313843,
-0.02158779837191105,
0.05554642155766487,
-0.0844004899263382,
-0.015246614813804626,
0.17136558890342712,
0.1320212036371231,
-0.09395060688257217,
-0.015957409515976906,
-0.014562267810106277,
-0.009986472316086292,
-0.013402231968939304,
0.11708424240350723,
0.07627732306718826,
-0.010227222926914692,
-0.06922844052314758,
-0.018637871369719505,
-0.021765179932117462,
-0.05587154999375343,
-0.06288269907236099,
0.07073207944631577,
0.02457572892308235,
-0.006323776673525572,
-0.06235681101679802,
0.06923556327819824,
-0.00014837112394161522,
-0.24337884783744812,
0.04199947789311409,
-0.17139555513858795,
-0.17019467055797577,
-0.02579478733241558,
0.07352332770824432,
0.00539009366184473,
0.05724696069955826,
0.0011385729303583503,
0.019150136038661003,
0.12311558425426483,
-0.012141366489231586,
-0.003974608611315489,
-0.10855229943990707,
0.1188347190618515,
-0.08556955307722092,
0.19906598329544067,
-0.005836084950715303,
0.05399252101778984,
0.09609593451023102,
0.0407891608774662,
-0.13929703831672668,
0.01624813675880432,
0.06590525060892105,
-0.12902694940567017,
-0.001944621792063117,
0.15012994408607483,
-0.033129170536994934,
0.06192530319094658,
0.027859404683113098,
-0.15289784967899323,
0.006457508075982332,
0.01667219027876854,
-0.0379934161901474,
-0.06650234013795853,
-0.008667290210723877,
-0.052486028522253036,
0.16645514965057373,
0.21788832545280457,
-0.029863998293876648,
0.005629756022244692,
-0.08915390074253082,
0.010179087519645691,
0.04530775174498558,
0.06526117026805878,
-0.04156235605478287,
-0.20462563633918762,
0.009946283884346485,
0.06378138810396194,
-0.003881397657096386,
-0.1937512904405594,
-0.10014920681715012,
0.05245188996195793,
-0.039843954145908356,
-0.042146019637584686,
0.0961293950676918,
0.019649790599942207,
0.03612140566110611,
-0.0112460283562541,
-0.12013301253318787,
-0.021862942725419998,
0.13875122368335724,
-0.17787672579288483,
-0.028308726847171783
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-4
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
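
### Few-shot subset

The "few-shot-k-16 ... seed-4" naming suggests training on only 16 SQuAD examples drawn with a fixed seed. One plausible way to build such a subset with the `datasets` library — an assumption about the sampling procedure, not the card author's exact recipe:

```python
from datasets import load_dataset

# Full SQuAD train split, then a seeded 16-example few-shot subset.
squad = load_dataset("squad", split="train")
few_shot = squad.shuffle(seed=4).select(range(16))  # seed value is assumed from the model name

print(len(few_shot))            # 16
print(few_shot[0]["question"])  # one of the sampled training questions
```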
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-4
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09827876091003418,
0.09642748534679413,
-0.0023708809167146683,
0.09231962263584137,
0.12443501502275467,
0.019458238035440445,
0.09363048523664474,
0.12916314601898193,
-0.0996934175491333,
0.06778281182050705,
0.08820310235023499,
0.03274570032954216,
0.041096944361925125,
0.14080674946308136,
-0.005020953249186277,
-0.2783583998680115,
-0.0007915299502201378,
-0.004006769508123398,
-0.05318006873130798,
0.12002873420715332,
0.08893363177776337,
-0.10938476771116257,
0.07485168427228928,
0.008370551280677319,
-0.15293504297733307,
0.01694858819246292,
-0.03121659718453884,
-0.033826425671577454,
0.12328343093395233,
-0.027232136577367783,
0.10736020654439926,
0.029552219435572624,
0.1369258314371109,
-0.20950092375278473,
0.007304368074983358,
0.07719111442565918,
0.05461917817592621,
0.0976158082485199,
0.0478798933327198,
0.011572235263884068,
0.09900350868701935,
-0.14727121591567993,
0.09578876942396164,
0.029814541339874268,
-0.09014252573251724,
-0.15526655316352844,
-0.08838751167058945,
0.026827583089470863,
0.05209667608141899,
0.07550887018442154,
0.0028352641966193914,
0.1313733607530594,
-0.07304158806800842,
0.08570608496665955,
0.2494143843650818,
-0.31428417563438416,
-0.06852702796459198,
0.02504454180598259,
0.06075552478432655,
0.06254451721906662,
-0.12504272162914276,
-0.0016054121078923345,
0.017175480723381042,
0.028053011745214462,
0.1242043524980545,
-0.01233324222266674,
-0.10376333445310593,
-0.010089787654578686,
-0.12302205711603165,
-0.0015952239045873284,
0.062369782477617264,
0.028521429747343063,
-0.049863629043102264,
-0.10824967175722122,
-0.06665205955505371,
-0.08098024874925613,
-0.022752495482563972,
-0.0517125278711319,
0.046736110001802444,
-0.05188754200935364,
-0.09761817753314972,
-0.043622881174087524,
-0.06006883829832077,
-0.08030775189399719,
-0.006096188444644213,
0.16894680261611938,
0.03361160308122635,
0.020164083689451218,
-0.029919851571321487,
0.11647403985261917,
0.029375210404396057,
-0.13749071955680847,
-0.01041423063725233,
-0.004228103440254927,
-0.09805342555046082,
-0.04025433212518692,
-0.05651635676622391,
-0.004100745543837547,
0.0029511095490306616,
0.16594862937927246,
-0.07270155847072601,
0.07427316904067993,
0.015332055278122425,
-0.02702937461435795,
-0.0147023256868124,
0.15249228477478027,
-0.042692117393016815,
-0.04514016583561897,
-0.016306115314364433,
0.08192010223865509,
-0.003102220594882965,
-0.0212552472949028,
-0.06494755297899246,
-0.027797695249319077,
0.09475881606340408,
0.05599551647901535,
-0.05978957563638687,
0.03812876716256142,
-0.026745231822133064,
-0.024905964732170105,
0.017050359398126602,
-0.11908738315105438,
0.04012942314147949,
0.0016283411532640457,
-0.07634992897510529,
-0.0040852683596313,
0.00019233096099924296,
-0.010627350769937038,
-0.0048813596367836,
0.09977196156978607,
-0.08566906303167343,
-0.0057214731350541115,
-0.06660804897546768,
-0.07834798842668533,
-0.0009579550824128091,
-0.1427476853132248,
-0.008314200676977634,
-0.05769288167357445,
-0.16276927292346954,
-0.03741912171244621,
0.04315849393606186,
-0.07591799646615982,
-0.01659221388399601,
-0.043691325932741165,
-0.06255098432302475,
0.021953066810965538,
-0.012371782213449478,
0.20085996389389038,
-0.050415489822626114,
0.0828307494521141,
-0.00906430371105671,
0.048863984644412994,
0.030217614024877548,
0.03844809904694557,
-0.0951063260436058,
0.026370877400040627,
-0.1344110667705536,
0.08423583954572678,
-0.08517108112573624,
-0.004845356103032827,
-0.13607850670814514,
-0.09946999698877335,
0.008429031819105148,
-0.018532855436205864,
0.0876002386212349,
0.13379040360450745,
-0.19493639469146729,
-0.021424444392323494,
0.1239054724574089,
-0.074318528175354,
-0.04390726238489151,
0.06351595371961594,
-0.06471169739961624,
0.03894076868891716,
0.053854916244745255,
0.20666153728961945,
0.06220263987779617,
-0.14967741072177887,
-0.005505470559000969,
0.013611825183033943,
0.0504748560488224,
0.03052753582596779,
0.04301489517092705,
0.001811324036680162,
0.054192621260881424,
0.014029383659362793,
-0.09360581636428833,
-0.021759791299700737,
-0.09086661785840988,
-0.06536687165498734,
-0.04889021813869476,
-0.07511469721794128,
0.0556783527135849,
0.0096057103946805,
0.038521986454725266,
-0.06045237556099892,
-0.10608821362257004,
0.11437561362981796,
0.10036742687225342,
-0.056005630642175674,
0.037812814116477966,
-0.07767239958047867,
0.011384587734937668,
-0.005740626249462366,
-0.034226249903440475,
-0.21166791021823883,
-0.12488853186368942,
0.04710470885038376,
-0.034006133675575256,
0.020767267793416977,
0.015768392011523247,
0.08565892279148102,
0.05682571977376938,
-0.05335530638694763,
-0.014748159795999527,
-0.09801676124334335,
0.0023393852170556784,
-0.1134447455406189,
-0.19220072031021118,
-0.08767148107290268,
-0.045161210000514984,
0.09892618656158447,
-0.17902390658855438,
-0.00882444716989994,
0.024098610505461693,
0.13127760589122772,
0.025940924882888794,
-0.06722567230463028,
-0.0010255061788484454,
0.042655568569898605,
0.012760214507579803,
-0.09589394181966782,
0.05517067760229111,
0.012463145889341831,
-0.10649299621582031,
-0.04480340704321861,
-0.12624938786029816,
-0.01620735228061676,
0.05108753964304924,
0.059294380247592926,
-0.09966139495372772,
-0.05893228203058243,
-0.0746084451675415,
-0.03798115253448486,
-0.07739309221506119,
0.01566699892282486,
0.21304820477962494,
0.038828056305646896,
0.1099015548825264,
-0.061247311532497406,
-0.08204824477434158,
-0.00830768421292305,
0.030950749292969704,
0.02449633926153183,
0.08901863545179367,
0.02114054746925831,
-0.04139762371778488,
0.06607550382614136,
0.10351276397705078,
-0.021492542698979378,
0.13013353943824768,
-0.055486395955085754,
-0.08417504280805588,
-0.030440764501690865,
-0.017780162394046783,
-0.026231152936816216,
0.12486408650875092,
-0.03701755404472351,
0.0006818310357630253,
0.034611184149980545,
0.04010067135095596,
0.011542602442204952,
-0.1682329773902893,
0.0017588749760761857,
0.030585909262299538,
-0.05631035193800926,
-0.04349425435066223,
-0.004697263240814209,
0.018606625497341156,
0.08633644878864288,
0.03124941885471344,
-0.003372384700924158,
0.007452191784977913,
-0.013481386005878448,
-0.05676178261637688,
0.19045594334602356,
-0.09330904483795166,
-0.0762324258685112,
-0.07285814732313156,
0.01627061888575554,
-0.04442724958062172,
-0.03677960857748985,
0.006353290751576424,
-0.09357832372188568,
-0.029146824032068253,
-0.08113447576761246,
-0.020606625825166702,
-0.028608132153749466,
0.020118482410907745,
0.02420702949166298,
-0.018260229378938675,
0.07973475754261017,
-0.13572466373443604,
0.007796037010848522,
-0.048783715814352036,
-0.09927718341350555,
0.004696310497820377,
0.07544052600860596,
0.09078975021839142,
0.08434329926967621,
-0.012991621159017086,
0.02448194846510887,
-0.039744097739458084,
0.2338494509458542,
-0.055948760360479355,
0.01187323871999979,
0.1170920878648758,
-0.01548363734036684,
0.05123639479279518,
0.09459110349416733,
0.03791603818535805,
-0.09215117990970612,
0.023070702329277992,
0.07911881059408188,
-0.03715407848358154,
-0.22888846695423126,
-0.014973762445151806,
-0.0060506174340844154,
-0.08298701792955399,
0.1023608148097992,
0.03196730464696884,
-0.05113714933395386,
0.041067466139793396,
0.01924535445868969,
-0.008882613852620125,
-0.03930028900504112,
0.06828058511018753,
0.07915499806404114,
0.04640021547675133,
0.10923562943935394,
-0.004680476151406765,
-0.019933633506298065,
0.05494404211640358,
0.016320956870913506,
0.26219454407691956,
-0.04098714888095856,
0.10379429161548615,
0.033742837607860565,
0.1492242068052292,
-0.021152306348085403,
0.06416713446378708,
0.0005335986497811973,
-0.010166268795728683,
-0.012289393693208694,
-0.06191153824329376,
-0.028044864535331726,
0.013365623541176319,
-0.042702145874500275,
0.02213764563202858,
-0.08230581879615784,
0.02628183923661709,
0.022111590951681137,
0.2856285572052002,
0.030314847826957703,
-0.2550255358219147,
-0.07784847170114517,
-0.014154264703392982,
-0.05088697373867035,
-0.05907733365893364,
0.008272520266473293,
0.137958362698555,
-0.1400662064552307,
0.045907262712717056,
-0.07802904397249222,
0.0859159454703331,
-0.05039634928107262,
0.012303341180086136,
0.05164458602666855,
0.1487903892993927,
-0.017546553164720535,
0.05501502752304077,
-0.1939641386270523,
0.25300487875938416,
0.017458517104387283,
0.103948213160038,
-0.06503105908632278,
0.013059845194220543,
0.0229391660541296,
0.020054997876286507,
0.11627943813800812,
0.002380837919190526,
-0.07188159972429276,
-0.1457459181547165,
-0.09157312661409378,
0.04817112535238266,
0.14134550094604492,
-0.0457320474088192,
0.09060373902320862,
-0.03598947077989578,
0.012216886505484581,
0.03587735444307327,
-0.03538183867931366,
-0.14769293367862701,
-0.08698965609073639,
-0.0008521609124727547,
0.006891167256981134,
-0.006909375544637442,
-0.0616980716586113,
-0.10593877732753754,
-0.010686296038329601,
0.10538313537836075,
0.0042944299057126045,
-0.05516764149069786,
-0.15828822553157806,
0.08820774406194687,
0.14380203187465668,
-0.057537779211997986,
0.011793948709964752,
0.01707013137638569,
0.11258271336555481,
0.0363447405397892,
-0.07803785800933838,
0.061049796640872955,
-0.0624893493950367,
-0.18012909591197968,
-0.05526383966207504,
0.1239340603351593,
0.08174122124910355,
0.04935722425580025,
-0.00025006511714309454,
0.05049259588122368,
0.0006114306743256748,
-0.09645448625087738,
0.03555793687701225,
0.007031651213765144,
0.03561940789222717,
0.01636281982064247,
-0.08833122998476028,
0.09923551231622696,
-0.03583374619483948,
0.010425158776342869,
0.1297798603773117,
0.20673620700836182,
-0.10549099743366241,
0.11300477385520935,
0.08561336994171143,
-0.07378892600536346,
-0.16668559610843658,
0.05957816168665886,
0.13034968078136444,
0.012163368985056877,
0.0840192437171936,
-0.21421535313129425,
0.12337552011013031,
0.09885916113853455,
-0.01078480388969183,
0.010657422244548798,
-0.2768269181251526,
-0.12778593599796295,
0.05841407924890518,
0.10948745161294937,
0.043320950120687485,
-0.11700307577848434,
-0.03613909333944321,
-0.0035661226138472557,
-0.09260818362236023,
0.11043307185173035,
-0.07373224198818207,
0.11526785045862198,
-0.016225820407271385,
0.11161153018474579,
0.025234343484044075,
-0.03015800565481186,
0.10894794017076492,
0.06045922636985779,
0.07960021495819092,
-0.03483232855796814,
0.008177489042282104,
0.05466720089316368,
-0.055847253650426865,
0.015625957399606705,
-0.043196290731430054,
0.06652373820543289,
-0.15113778412342072,
-0.000754424836486578,
-0.09230608493089676,
0.049975860863924026,
-0.04922277480363846,
-0.07158349454402924,
-0.013701660558581352,
0.05377641320228577,
0.07399095594882965,
-0.03952663019299507,
0.023916160687804222,
-0.00607318663969636,
0.09676393866539001,
0.09512533992528915,
0.08113756775856018,
-0.01565157063305378,
-0.09321665018796921,
0.011465421877801418,
0.003943110816180706,
0.054147105664014816,
-0.10417565703392029,
0.013638590462505817,
0.13748499751091003,
0.06460938602685928,
0.09571946412324905,
0.04803653433918953,
-0.04022843763232231,
0.003332277061417699,
0.014024247415363789,
-0.12151969969272614,
-0.1139446571469307,
0.02341918647289276,
-0.04898080229759216,
-0.15484464168548584,
0.02156483754515648,
0.12081553786993027,
-0.039813484996557236,
-0.017911924049258232,
-0.008052987977862358,
0.005417780019342899,
-0.013799551874399185,
0.18547673523426056,
0.045184604823589325,
0.06323171406984329,
-0.08819389343261719,
0.10641894489526749,
0.03632884845137596,
-0.053024858236312866,
0.050149109214544296,
0.06344915926456451,
-0.104289710521698,
0.007788961753249168,
0.07523457705974579,
0.12589912116527557,
-0.047196581959724426,
-0.01049889624118805,
-0.08984542638063431,
-0.08542963117361069,
0.041585978120565414,
0.1296033412218094,
0.05403994023799896,
-0.001217724522575736,
-0.07046329230070114,
0.04149672016501427,
-0.11833004653453827,
0.07152862846851349,
0.04486164078116417,
0.07009802013635635,
-0.1013602763414383,
0.1301708221435547,
-0.0016707982867956161,
0.027370546013116837,
-0.026301879435777664,
0.015164089389145374,
-0.09629282355308533,
-0.024602238088846207,
-0.10741610080003738,
-0.02520676702260971,
-0.00897442176938057,
0.0011172274826094508,
-0.022058043628931046,
-0.07390253245830536,
-0.02783339098095894,
0.03891646862030029,
-0.07621920853853226,
-0.049804575741291046,
0.015052986331284046,
0.040925033390522,
-0.15030662715435028,
0.0013295008102431893,
0.028757547959685326,
-0.0938393697142601,
0.09194387495517731,
0.06308291852474213,
0.01493336446583271,
0.027370011433959007,
-0.11117285490036011,
-0.02873934805393219,
-0.010989971458911896,
0.0047808485105633736,
0.06551429629325867,
-0.09839054942131042,
-0.027131086215376854,
-0.038980551064014435,
0.045858047902584076,
0.017790919169783592,
0.09827572852373123,
-0.11683650314807892,
-0.004519691690802574,
-0.03872036933898926,
-0.041350677609443665,
-0.06323053687810898,
0.035521674901247025,
0.10179146379232407,
0.054700370877981186,
0.14998841285705566,
-0.0738021731376648,
0.05868390575051308,
-0.20173180103302002,
-0.03603731468319893,
0.010055329650640488,
-0.043363817036151886,
-0.08245581388473511,
-0.05214862525463104,
0.08907755464315414,
-0.04530417546629906,
0.10679788142442703,
-0.020392246544361115,
0.11036289483308792,
0.04228626936674118,
-0.009734852239489555,
-0.05998658388853073,
-0.005672157276421785,
0.1869731843471527,
0.05820072442293167,
-0.017117684707045555,
0.12913775444030762,
-0.0006917508435435593,
0.02944820187985897,
0.08587817847728729,
0.22309629619121552,
0.16019926965236664,
0.0022992321755737066,
0.06393295526504517,
0.061533015221357346,
-0.07303024083375931,
-0.15183503925800323,
0.11816881597042084,
-0.019678939133882523,
0.10239306092262268,
-0.06714993715286255,
0.18914784491062164,
0.03977075591683388,
-0.1831825077533722,
0.06394906342029572,
-0.026227887719869614,
-0.11137016862630844,
-0.1215139701962471,
-0.024182887747883797,
-0.06916795670986176,
-0.12066538631916046,
0.023845359683036804,
-0.1175089105963707,
0.06152186915278435,
0.10216102749109268,
0.00922676082700491,
0.03803358972072601,
0.18485164642333984,
-0.04456633701920509,
0.011376209557056427,
0.08259360492229462,
0.019245650619268417,
0.006618539337068796,
-0.0427977629005909,
-0.06567554175853729,
0.03485899791121483,
0.03281065821647644,
0.06233832240104675,
-0.0524294488132,
-0.0003520048048812896,
0.008941731415688992,
-0.0071167610585689545,
-0.07689324021339417,
0.010378846898674965,
0.010013660416007042,
0.05374902859330177,
0.05133245140314102,
0.04587779939174652,
0.00516129657626152,
-0.053510040044784546,
0.2971133589744568,
-0.06950928270816803,
-0.0687503069639206,
-0.130589097738266,
0.20601187646389008,
0.021256281062960625,
-0.02220275066792965,
0.05450988560914993,
-0.08346974849700928,
-0.013889813795685768,
0.17270876467227936,
0.13424788415431976,
-0.09365533292293549,
-0.016362445428967476,
-0.014053240418434143,
-0.00998383853584528,
-0.013757252134382725,
0.11729688942432404,
0.07626354694366455,
-0.010682172141969204,
-0.06932131201028824,
-0.018464379012584686,
-0.021038739010691643,
-0.0559709258377552,
-0.062396638095378876,
0.07024165987968445,
0.026074647903442383,
-0.006705723237246275,
-0.061545561999082565,
0.06953474879264832,
-0.000793942715972662,
-0.2431575059890747,
0.0430632121860981,
-0.170730859041214,
-0.17062947154045105,
-0.027022728696465492,
0.07283005118370056,
0.006261197850108147,
0.05827765911817551,
0.0011160257272422314,
0.01883736252784729,
0.12365557253360748,
-0.012460751459002495,
-0.003044714918360114,
-0.1094968244433403,
0.11885575205087662,
-0.08522844314575195,
0.1976613849401474,
-0.006721440702676773,
0.05409299582242966,
0.09677435457706451,
0.04027194529771805,
-0.1391255408525467,
0.017081864178180695,
0.06510402262210846,
-0.1295672208070755,
-0.0011891849571838975,
0.14944779872894287,
-0.03306723013520241,
0.061182573437690735,
0.026392582803964615,
-0.1520407646894455,
0.007428625598549843,
0.016888000071048737,
-0.03767954930663109,
-0.06647038459777832,
-0.008253255859017372,
-0.051973383873701096,
0.16712869703769684,
0.21955354511737823,
-0.02972530573606491,
0.0054349941201508045,
-0.08921439945697784,
0.010452395305037498,
0.04586643725633621,
0.0638914406299591,
-0.0423421747982502,
-0.20455315709114075,
0.010251834988594055,
0.06430257111787796,
-0.00438170088455081,
-0.19518454372882843,
-0.0988045483827591,
0.05269736051559448,
-0.039742931723594666,
-0.04235117509961128,
0.09610281884670258,
0.020953666418790817,
0.037360209971666336,
-0.01199491135776043,
-0.11775678396224976,
-0.021837761625647545,
0.1385280340909958,
-0.17820870876312256,
-0.029093977063894272
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-42
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
- exact_match: 4.541154210028382
- f1: 10.04181288563879
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
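The exact_match and f1 figures above are the standard SQuAD metrics. As an illustration of how such numbers are computed (the prediction/reference pair below is hypothetical, and the post-processing that turns start/end logits into answer strings is omitted):
```python
# Metric-computation sketch using the "squad" metric bundled with datasets 1.x.
from datasets import load_metric

squad_metric = load_metric("squad")

# Hypothetical prediction/reference pair, for illustration only.
predictions = [{"id": "example-0", "prediction_text": "Denver Broncos"}]
references = [
    {
        "id": "example-0",
        "answers": {"text": ["Denver Broncos"], "answer_start": [177]},
    }
]

# Returns a dict shaped like the reported results, e.g.
# {'exact_match': 100.0, 'f1': 100.0} for this perfect match.
print(squad_metric.compute(predictions=predictions, references=references))
```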
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-42", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-42
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-42
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
- exact_match: 4.541154210028382
- f1: 10.04181288563879
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
|
[
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 4.541154210028382, 'f1': 10.04181288563879}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results\n\n{'exact_match': 4.541154210028382, 'f1': 10.04181288563879}",
"### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
46,
55,
6,
12,
8,
3,
104,
33,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-42\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results\n\n{'exact_match': 4.541154210028382, 'f1': 10.04181288563879}### Framework versions\n\n- Transformers 4.16.2\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] |
[
-0.11189243942499161,
0.08955613523721695,
-0.002014629542827606,
0.09633529931306839,
0.1398695707321167,
0.03289299085736275,
0.1007523238658905,
0.11851631850004196,
-0.09231875091791153,
0.0703907459974289,
0.06741346418857574,
0.04188104346394539,
0.048698294907808304,
0.11773545295000076,
-0.02498985454440117,
-0.2732337713241577,
-0.012148628011345863,
0.0031990332063287497,
-0.12357587367296219,
0.11287162452936172,
0.11121991276741028,
-0.09411292523145676,
0.07839492708444595,
0.014398854225873947,
-0.16238845884799957,
0.024168387055397034,
-0.0160051379352808,
-0.02819034643471241,
0.11663706600666046,
0.006062178406864405,
0.11811500787734985,
0.02294127643108368,
0.13595980405807495,
-0.20058631896972656,
0.007111791055649519,
0.08696731925010681,
0.04373161494731903,
0.10142233222723007,
0.06873521953821182,
-0.01559353806078434,
0.09539084136486053,
-0.14495989680290222,
0.07956483215093613,
0.04222025349736214,
-0.10135786980390549,
-0.1657862514257431,
-0.10116621851921082,
0.05501694977283478,
0.06524166464805603,
0.07604654878377914,
0.006018676795065403,
0.11834511905908585,
-0.051230233162641525,
0.08448273688554764,
0.25831687450408936,
-0.3071476221084595,
-0.08485609292984009,
0.04830831289291382,
0.06098465994000435,
0.042920514941215515,
-0.1194380447268486,
0.001210939371958375,
0.030234839767217636,
0.03081960417330265,
0.09851808100938797,
-0.023661311715841293,
-0.13880930840969086,
-0.01058198418468237,
-0.1317468136548996,
0.01577880047261715,
0.08931951969861984,
0.041517630219459534,
-0.05174470692873001,
-0.06847342103719711,
-0.06063171103596687,
-0.09185304492712021,
-0.015470834448933601,
-0.052233755588531494,
0.04340340942144394,
-0.05910949036478996,
-0.07893680781126022,
-0.06042978912591934,
-0.06487223505973816,
-0.08465978503227234,
-0.017677882686257362,
0.1754710078239441,
0.02454879693686962,
0.022389281541109085,
-0.028759906068444252,
0.1210237443447113,
0.009091099724173546,
-0.13461658358573914,
0.0003745590802282095,
-0.0011246675858274102,
-0.11097276210784912,
-0.04313958063721657,
-0.05828377231955528,
0.007622301112860441,
-0.004190813284367323,
0.16282667219638824,
-0.0978202149271965,
0.07332970947027206,
0.029399123042821884,
-0.013904213905334473,
-0.03715484216809273,
0.14217022061347961,
-0.07668887823820114,
-0.042467210441827774,
-0.0188875924795866,
0.08224230259656906,
-0.0038406180683523417,
-0.01242179237306118,
-0.06051161140203476,
-0.04475146532058716,
0.06749166548252106,
0.055470433086156845,
-0.03758314996957779,
0.045554764568805695,
-0.023180682212114334,
-0.03914366662502289,
0.019746212288737297,
-0.12159498035907745,
0.03266771510243416,
0.0024052439257502556,
-0.10988439619541168,
-0.03439659997820854,
0.016738606616854668,
0.00397613225504756,
-0.00862080417573452,
0.09180880337953568,
-0.0911451205611229,
0.0021456123795360327,
-0.07772332429885864,
-0.0931006446480751,
-0.0016351310769096017,
-0.13128513097763062,
-0.015048461966216564,
-0.04493080452084541,
-0.167831689119339,
-0.0486568883061409,
0.04144815355539322,
-0.06958422809839249,
-0.02698688954114914,
-0.02808382920920849,
-0.07099851220846176,
0.015820620581507683,
-0.006365410052239895,
0.20867492258548737,
-0.04512782394886017,
0.07128757238388062,
-0.0012672868324443698,
0.04614095389842987,
0.003652583807706833,
0.0353739894926548,
-0.08448576927185059,
0.03177052363753319,
-0.12990611791610718,
0.07962863147258759,
-0.09892264008522034,
0.002157814335078001,
-0.13366657495498657,
-0.09798163175582886,
0.015716800466179848,
-0.009477164596319199,
0.09158729016780853,
0.1231277659535408,
-0.20642782747745514,
-0.013501422479748726,
0.11823183298110962,
-0.07172717154026031,
-0.0808347687125206,
0.05406758189201355,
-0.046409785747528076,
0.04445924609899521,
0.039417918771505356,
0.17193178832530975,
0.10502877831459045,
-0.14695337414741516,
-0.014502485282719135,
0.013970605097711086,
0.048567451536655426,
0.03303360566496849,
0.04155271127820015,
-0.0057388050481677055,
0.04379529505968094,
0.010601668618619442,
-0.11724641919136047,
-0.030727792531251907,
-0.09933778643608093,
-0.06959591805934906,
-0.05159591883420944,
-0.0940716341137886,
0.04997202381491661,
0.0225833747535944,
0.03537346422672272,
-0.06266645342111588,
-0.11082341521978378,
0.11684741824865341,
0.11293818801641464,
-0.04633055999875069,
0.026561498641967773,
-0.07580742239952087,
-0.011892331764101982,
0.020546818152070045,
-0.03866852447390556,
-0.21655169129371643,
-0.1574033498764038,
0.02312343195080757,
-0.054808035492897034,
0.034619029611349106,
0.011139560490846634,
0.08664064854383469,
0.05351334810256958,
-0.053788721561431885,
-0.0037706862203776836,
-0.07896039634943008,
-0.00835033692419529,
-0.10318750888109207,
-0.2076280564069748,
-0.10177638381719589,
-0.02302447520196438,
0.15537749230861664,
-0.20724977552890778,
0.009670942090451717,
0.002446914790198207,
0.1438840627670288,
0.014264252968132496,
-0.05527032911777496,
0.0014402226079255342,
0.04223619028925896,
0.007754577323794365,
-0.09465523809194565,
0.049283720552921295,
0.011144177056849003,
-0.10381966084241867,
-0.03827335685491562,
-0.14569735527038574,
0.006224137730896473,
0.05469915643334389,
0.03890179470181465,
-0.10807473212480545,
-0.042548615485429764,
-0.07286626845598221,
-0.04448120668530464,
-0.06710310280323029,
0.011628339998424053,
0.18574972450733185,
0.03046259470283985,
0.10613652318716049,
-0.052454739809036255,
-0.07693005353212357,
-0.01305188238620758,
0.01577168144285679,
0.016556525602936745,
0.09708260744810104,
0.02737423963844776,
-0.04651472717523575,
0.07199050486087799,
0.07177937030792236,
-0.05355919897556305,
0.12485479563474655,
-0.046436019241809845,
-0.07312022149562836,
-0.028978068381547928,
-0.008064709603786469,
-0.013326141983270645,
0.12669143080711365,
-0.029321717098355293,
0.0103595657274127,
0.033692970871925354,
0.01977180317044258,
0.028260741382837296,
-0.17650176584720612,
-0.00449266517534852,
0.017562774941325188,
-0.05320007726550102,
-0.04143051430583,
-0.014173905365169048,
0.035619232803583145,
0.09604691714048386,
0.022824730724096298,
-0.00004846624506171793,
0.00515001779422164,
-0.012721981853246689,
-0.07283730804920197,
0.20749588310718536,
-0.09931834042072296,
-0.09552815556526184,
-0.10173800587654114,
0.04248866066336632,
-0.0643782764673233,
-0.037664979696273804,
-0.0030245925299823284,
-0.08845458924770355,
-0.03231251239776611,
-0.07001450657844543,
0.0036637766752392054,
-0.009568456560373306,
0.0016562857199460268,
0.017219360917806625,
0.00018727034330368042,
0.10047992318868637,
-0.1450304538011551,
0.021093420684337616,
-0.048743970692157745,
-0.11625059694051743,
-0.005208162125200033,
0.07930713146924973,
0.09886962175369263,
0.10900995880365372,
-0.01642758771777153,
0.019920969381928444,
-0.03079131431877613,
0.22801223397254944,
-0.061090029776096344,
0.01629614643752575,
0.12944772839546204,
-0.008726722560822964,
0.052113402634859085,
0.0929127112030983,
0.04753227159380913,
-0.09224136173725128,
0.027344178408384323,
0.1027417853474617,
-0.0263628251850605,
-0.2539743483066559,
-0.027737077325582504,
-0.00413119699805975,
-0.07418974488973618,
0.09880250692367554,
0.02152058109641075,
-0.023889485746622086,
0.05954907834529877,
-0.0015634705778211355,
-0.0012647839030250907,
-0.03520193696022034,
0.0585089810192585,
0.08058258891105652,
0.04830281063914299,
0.11974310874938965,
-0.016979718580842018,
-0.014490341767668724,
0.05318195000290871,
0.01587795838713646,
0.24906150996685028,
-0.05400130897760391,
0.10077883303165436,
0.04270002618432045,
0.1456872522830963,
-0.03111073188483715,
0.06245137378573418,
0.006260763853788376,
-0.021503670141100883,
0.015940284356474876,
-0.0672728568315506,
-0.01555607095360756,
0.010471790097653866,
-0.04707933962345123,
0.04050346091389656,
-0.08354561775922775,
0.041546158492565155,
0.018711013719439507,
0.2843042314052582,
0.025495411828160286,
-0.2666451930999756,
-0.08746767789125443,
-0.011713234707713127,
-0.005537665449082851,
-0.06918378174304962,
-0.006764846853911877,
0.13058510422706604,
-0.136027991771698,
0.06555719673633575,
-0.07171732187271118,
0.09458897262811661,
-0.027216177433729172,
0.0022300290875136852,
0.07795578241348267,
0.1571822464466095,
-0.020062223076820374,
0.05728236213326454,
-0.1947917342185974,
0.25223344564437866,
0.02139703370630741,
0.10280631482601166,
-0.04916878417134285,
0.013643463142216206,
0.023337556049227715,
0.030622966587543488,
0.0957193523645401,
0.005107393022626638,
-0.05890654772520065,
-0.16029955446720123,
-0.06518819183111191,
0.04263971000909805,
0.1497129201889038,
-0.048094164580106735,
0.08852609246969223,
-0.0346984900534153,
0.006173968780785799,
0.042559266090393066,
-0.06492848694324493,
-0.15581177175045013,
-0.07941460609436035,
0.011593729257583618,
-0.004606354050338268,
-0.016706787049770355,
-0.06919515132904053,
-0.10310136526823044,
-0.018649982288479805,
0.12683627009391785,
-0.0316564179956913,
-0.054922595620155334,
-0.15101954340934753,
0.09417311102151871,
0.16375377774238586,
-0.06451495736837387,
0.014136167243123055,
0.018510017544031143,
0.11788208782672882,
0.03594651445746422,
-0.08854330331087112,
0.06570450216531754,
-0.06992091983556747,
-0.17512641847133636,
-0.04058675840497017,
0.1350424438714981,
0.07575712352991104,
0.046163491904735565,
-0.006572005804628134,
0.04258319362998009,
0.008406594395637512,
-0.10142945498228073,
0.03325923532247543,
-0.019104618579149246,
0.02732032723724842,
0.03847796842455864,
-0.08774901181459427,
0.07001851499080658,
-0.04195742681622505,
0.006685934029519558,
0.1124405637383461,
0.20891131460666656,
-0.11164330691099167,
0.059441495686769485,
0.058030981570482254,
-0.07920275628566742,
-0.17464374005794525,
0.09446209669113159,
0.15292426943778992,
0.001680431654676795,
0.07490792125463486,
-0.21470622718334198,
0.15133610367774963,
0.12945307791233063,
-0.016173498705029488,
0.06500548869371414,
-0.27295783162117004,
-0.14068838953971863,
0.07788312435150146,
0.10249283164739609,
0.0085137989372015,
-0.1449919193983078,
-0.045263759791851044,
-0.014007192105054855,
-0.1623792201280594,
0.14637409150600433,
-0.09097755700349808,
0.10239675641059875,
-0.006963987834751606,
0.09666440635919571,
0.030556203797459602,
-0.022880418226122856,
0.12927228212356567,
0.06563065201044083,
0.09183937311172485,
-0.037158191204071045,
-0.0016770141664892435,
0.07833893597126007,
-0.055283818393945694,
0.027516508474946022,
-0.01410824153572321,
0.06810146570205688,
-0.14272992312908173,
0.013131649233400822,
-0.09617253392934799,
0.07576680183410645,
-0.06459899246692657,
-0.0613037645816803,
-0.017469149082899094,
0.05678410455584526,
0.03605000674724579,
-0.036446746438741684,
0.04706037789583206,
-0.0035580950789153576,
0.14033423364162445,
0.10499647259712219,
0.08473702520132065,
-0.010243529453873634,
-0.09060803055763245,
0.021527215838432312,
0.0030963828321546316,
0.05175239220261574,
-0.10336166620254517,
0.020840059965848923,
0.14154288172721863,
0.08060933649539948,
0.09710237383842468,
0.04665987193584442,
-0.04053255915641785,
-0.005316286813467741,
0.030439093708992004,
-0.11748936772346497,
-0.10816434025764465,
0.004681974183768034,
-0.06444156169891357,
-0.15419888496398926,
0.03610726445913315,
0.11804339289665222,
-0.03108491748571396,
-0.011449414305388927,
-0.009062433615326881,
0.006579871289432049,
-0.021184099838137627,
0.20235402882099152,
0.06895424425601959,
0.07554053515195847,
-0.09006989747285843,
0.10830628871917725,
0.04209135100245476,
-0.05187378078699112,
0.03041878715157509,
0.08038444072008133,
-0.08903194963932037,
0.0027514745015650988,
0.054560694843530655,
0.0983581468462944,
-0.09437131881713867,
-0.0235612653195858,
-0.08996538072824478,
-0.10418594628572464,
0.03149285912513733,
0.16324235498905182,
0.051373306661844254,
-0.004546866752207279,
-0.0324864462018013,
0.03897881135344505,
-0.12905356287956238,
0.0667204037308693,
0.053534138947725296,
0.08823218196630478,
-0.11363741010427475,
0.165944904088974,
0.0067969453521072865,
0.03869287669658661,
-0.015742698684334755,
0.020233191549777985,
-0.10628747195005417,
-0.031875550746917725,
-0.12395034730434418,
-0.03212695196270943,
-0.009069622494280338,
0.00356001197360456,
-0.025014806538820267,
-0.07378815114498138,
-0.04813813418149948,
0.0464748740196228,
-0.0752493292093277,
-0.05802115425467491,
0.012203985825181007,
0.026195675134658813,
-0.16537761688232422,
-0.0013056554598733783,
0.03269211947917938,
-0.09057150036096573,
0.0759897530078888,
0.06901048123836517,
0.03795057162642479,
0.031451404094696045,
-0.13195878267288208,
-0.026549849659204483,
-0.01323745772242546,
0.008166378363966942,
0.07569508999586105,
-0.10034686326980591,
-0.021780023351311684,
-0.052631545811891556,
0.06299803406000137,
0.014581222087144852,
0.09386834502220154,
-0.12255235761404037,
0.004230365622788668,
-0.04162759333848953,
-0.024561533704400063,
-0.06352030485868454,
0.04321981221437454,
0.10734683275222778,
0.0601450651884079,
0.1410265564918518,
-0.06943883746862411,
0.04544806852936745,
-0.21762694418430328,
-0.03503071144223213,
-0.01035361923277378,
-0.045707494020462036,
-0.05032776668667793,
-0.02961338870227337,
0.09461862593889236,
-0.05805119499564171,
0.09715740382671356,
-0.012717502191662788,
0.1104733943939209,
0.04398103803396225,
-0.006059996318072081,
-0.057152364403009415,
0.012625486589968204,
0.16955339908599854,
0.04778427630662918,
-0.0074532488361001015,
0.11716189235448837,
0.019520346075296402,
0.02841092459857464,
0.05118759348988533,
0.2335287630558014,
0.13316188752651215,
-0.045998018234968185,
0.05463045835494995,
0.08845771849155426,
-0.10452680289745331,
-0.12189985811710358,
0.11370687931776047,
-0.022897975519299507,
0.0939398780465126,
-0.0644489973783493,
0.16761739552021027,
0.061378318816423416,
-0.18858401477336884,
0.06310856342315674,
-0.04651522636413574,
-0.11441056430339813,
-0.12040302902460098,
0.00022992015874478966,
-0.06547275930643082,
-0.11947004497051239,
0.01871461048722267,
-0.13616642355918884,
0.0586433969438076,
0.13588696718215942,
0.006674023345112801,
0.04129154607653618,
0.18466182053089142,
-0.05314880982041359,
0.005967769771814346,
0.06717915832996368,
0.02410026080906391,
0.012166605331003666,
-0.03968905285000801,
-0.06073115020990372,
0.05041318014264107,
0.0330229252576828,
0.05483728647232056,
-0.067299984395504,
-0.007640013936907053,
0.014126230962574482,
-0.007769895251840353,
-0.07296628504991531,
0.009620076045393944,
0.019799111410975456,
0.05129905045032501,
0.03709465637803078,
0.04909893497824669,
0.015840156003832817,
-0.0558449886739254,
0.32270246744155884,
-0.07273497432470322,
-0.08448733389377594,
-0.1360490620136261,
0.21891561150550842,
0.022352971136569977,
-0.019820673391222954,
0.05880847945809364,
-0.09585902839899063,
-0.008214125409722328,
0.154434472322464,
0.15462954342365265,
-0.09578827023506165,
-0.019320890307426453,
-0.013736597262322903,
-0.010989515110850334,
-0.022332729771733284,
0.1247938796877861,
0.09366538375616074,
0.02188521809875965,
-0.07344579696655273,
-0.017465218901634216,
-0.0161441583186388,
-0.04431740939617157,
-0.06934059411287308,
0.08550263941287994,
0.02745995856821537,
0.0046033295802772045,
-0.034245025366544724,
0.07711616903543472,
0.001149054616689682,
-0.22780397534370422,
0.039212197065353394,
-0.1814146190881729,
-0.18217796087265015,
-0.04331362247467041,
0.05645471066236496,
-0.0008238903246819973,
0.0656762421131134,
0.002048302674666047,
-0.007112317252904177,
0.12083832919597626,
-0.009937980212271214,
-0.02245904505252838,
-0.14040477573871613,
0.10326877236366272,
-0.11171731352806091,
0.2132495492696762,
-0.021939394995570183,
0.0594530813395977,
0.09699691832065582,
0.02494780905544758,
-0.1401182860136032,
0.00987246073782444,
0.06638650596141815,
-0.12853305041790009,
0.013042271137237549,
0.14008617401123047,
-0.03925309330224991,
0.07071176171302795,
0.02859680913388729,
-0.1668129414319992,
0.0019026553491130471,
0.017849385738372803,
-0.03225169703364372,
-0.07174555957317352,
-0.008433455601334572,
-0.0630607157945633,
0.1583297848701477,
0.23994259536266327,
-0.037156812846660614,
0.01050233468413353,
-0.0945608839392662,
-0.0007238178513944149,
0.05608179047703743,
0.10068777948617935,
-0.03817865997552872,
-0.21140669286251068,
0.032961275428533554,
0.052814092487096786,
0.003075031563639641,
-0.20922712981700897,
-0.08055390417575836,
0.05732153728604317,
-0.055740367621183395,
-0.014960139989852905,
0.09592175483703613,
0.04821265861392021,
0.05326459929347038,
-0.018456188961863518,
-0.10788347572088242,
-0.030976640060544014,
0.15671761333942413,
-0.1759825497865677,
-0.04819740355014801
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
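A short usage sketch for the published checkpoint via the question-answering pipeline; the question/context pair is illustrative only.
```python
from transformers import pipeline

# Loads the checkpoint this card describes.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of SpanBERT on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```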
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09838918596506119,
0.09639986604452133,
-0.0023313064593821764,
0.09198450297117233,
0.12460868060588837,
0.018893128260970116,
0.09315813332796097,
0.1297142058610916,
-0.0989002212882042,
0.06844276189804077,
0.08736669272184372,
0.03379644826054573,
0.04162416607141495,
0.14076079428195953,
-0.005011874251067638,
-0.27772092819213867,
-0.0006284979754127562,
-0.0038221131544560194,
-0.053111642599105835,
0.11995968222618103,
0.08894788473844528,
-0.10963613539934158,
0.07391397655010223,
0.007762627676129341,
-0.15308071672916412,
0.017145512625575066,
-0.03147023171186447,
-0.03348227217793465,
0.12312845885753632,
-0.027762265875935555,
0.10680510848760605,
0.029642924666404724,
0.136808380484581,
-0.21031421422958374,
0.00736346747726202,
0.07785288989543915,
0.05478013679385185,
0.09782353788614273,
0.048845574259757996,
0.011966858990490437,
0.10082019120454788,
-0.14717929065227509,
0.09561404585838318,
0.030426882207393646,
-0.08997838944196701,
-0.15345044434070587,
-0.0890350267291069,
0.026249386370182037,
0.05274204537272453,
0.07615738362073898,
0.0024533849209547043,
0.132381409406662,
-0.07330472022294998,
0.086006298661232,
0.2514004111289978,
-0.3125437796115875,
-0.06845524162054062,
0.026057424023747444,
0.060702480375766754,
0.06157232075929642,
-0.12605759501457214,
-0.002596507081761956,
0.01729586347937584,
0.027593988925218582,
0.12312297523021698,
-0.011861920356750488,
-0.10527073591947556,
-0.01066287886351347,
-0.12333354353904724,
-0.001811234513297677,
0.06052032485604286,
0.02866683341562748,
-0.04921155050396919,
-0.10806972533464432,
-0.0673680379986763,
-0.08050257712602615,
-0.02267313376069069,
-0.051577817648649216,
0.04696803539991379,
-0.051937036216259,
-0.096852608025074,
-0.043551959097385406,
-0.059720151126384735,
-0.08146686106920242,
-0.005758700892329216,
0.16869542002677917,
0.03362603113055229,
0.019843611866235733,
-0.030755238607525826,
0.11616741865873337,
0.029665378853678703,
-0.13781732320785522,
-0.011611654423177242,
-0.003334595588967204,
-0.09843821823596954,
-0.040556617081165314,
-0.05584701523184776,
-0.005855298135429621,
0.002708416199311614,
0.16435754299163818,
-0.07279502600431442,
0.07482349127531052,
0.014210781082510948,
-0.027074351906776428,
-0.01544556487351656,
0.1526995599269867,
-0.041590720415115356,
-0.04381290078163147,
-0.016767021268606186,
0.0818934217095375,
-0.003545480314642191,
-0.02086576074361801,
-0.06419364362955093,
-0.02802455797791481,
0.09538853913545609,
0.05601957440376282,
-0.05899392068386078,
0.03759189695119858,
-0.026995042338967323,
-0.02501017414033413,
0.017340214923024178,
-0.11906522512435913,
0.04030001536011696,
0.0011591975344344974,
-0.07662011682987213,
-0.005199115257710218,
0.00033110755612142384,
-0.010951099917292595,
-0.0052116974256932735,
0.10038462281227112,
-0.08594580739736557,
-0.005513898562639952,
-0.06743499636650085,
-0.07809324562549591,
-0.00031171910814009607,
-0.14450635015964508,
-0.008716489188373089,
-0.056966789066791534,
-0.16317512094974518,
-0.03768118843436241,
0.042569514364004135,
-0.07583608478307724,
-0.016042323783040047,
-0.044178418815135956,
-0.06316202133893967,
0.022307055070996284,
-0.011919434182345867,
0.2018919140100479,
-0.05012922361493111,
0.0835479125380516,
-0.009450395591557026,
0.04855339601635933,
0.030941292643547058,
0.03911767527461052,
-0.09601413458585739,
0.02610122039914131,
-0.13384000957012177,
0.08435946702957153,
-0.08618447184562683,
-0.004814962390810251,
-0.13721349835395813,
-0.0985935777425766,
0.00717557268217206,
-0.01919872686266899,
0.08846015483140945,
0.1345789134502411,
-0.19535204768180847,
-0.02102595381438732,
0.12450194358825684,
-0.07515084743499756,
-0.04397900030016899,
0.06246034428477287,
-0.06439132988452911,
0.0384511724114418,
0.053425341844558716,
0.20689360797405243,
0.061455219984054565,
-0.1493559181690216,
-0.006473795510828495,
0.012973794713616371,
0.05104605853557587,
0.030075041577219963,
0.042786408215761185,
0.001588039449416101,
0.05532842129468918,
0.013660907745361328,
-0.0934443473815918,
-0.021983621641993523,
-0.0909954383969307,
-0.06472131609916687,
-0.049535807222127914,
-0.0751810222864151,
0.05483829602599144,
0.01086130179464817,
0.0381157211959362,
-0.05997277796268463,
-0.10533502697944641,
0.11435168236494064,
0.10034681111574173,
-0.055391859263181686,
0.038230929523706436,
-0.07690874487161636,
0.010239599272608757,
-0.00606179516762495,
-0.03408074378967285,
-0.213186115026474,
-0.12507681548595428,
0.04765264317393303,
-0.03490273654460907,
0.0208127461373806,
0.01509709469974041,
0.08542338758707047,
0.05591990426182747,
-0.05337673798203468,
-0.01467309519648552,
-0.09880834072828293,
0.0018827184103429317,
-0.11411619931459427,
-0.19234460592269897,
-0.08766935765743256,
-0.045941367745399475,
0.09645691514015198,
-0.17843148112297058,
-0.009106346406042576,
0.023995600640773773,
0.13192817568778992,
0.026132605969905853,
-0.06787233054637909,
-0.001007933751679957,
0.04234001040458679,
0.01237903255969286,
-0.09633217751979828,
0.05491442605853081,
0.01157031673938036,
-0.10598745942115784,
-0.044887810945510864,
-0.12658895552158356,
-0.01806296594440937,
0.05074033886194229,
0.06084108725190163,
-0.09995270520448685,
-0.05900062248110771,
-0.07455521821975708,
-0.03819512575864792,
-0.07880086451768875,
0.01662115380167961,
0.2124696522951126,
0.03920051455497742,
0.10949341207742691,
-0.061493173241615295,
-0.08238019049167633,
-0.007690455298870802,
0.03128436952829361,
0.024147115647792816,
0.09017179906368256,
0.02308923937380314,
-0.043340958654880524,
0.06697256118059158,
0.10413510352373123,
-0.020676715299487114,
0.13030153512954712,
-0.05572951212525368,
-0.08461421728134155,
-0.02903119847178459,
-0.017249416559934616,
-0.026064272969961166,
0.12449245154857635,
-0.03678414225578308,
0.001080123009160161,
0.03446263074874878,
0.04052947834134102,
0.01142768282443285,
-0.1683868169784546,
0.0018007426988333464,
0.030434325337409973,
-0.056193865835666656,
-0.04399622604250908,
-0.004737582989037037,
0.018646234646439552,
0.08674117177724838,
0.030904322862625122,
-0.0028874874114990234,
0.006719171069562435,
-0.013535736128687859,
-0.0568414069712162,
0.19135548174381256,
-0.0929306149482727,
-0.07516243308782578,
-0.07132559269666672,
0.01724849082529545,
-0.04388369992375374,
-0.03708362951874733,
0.0066334884613752365,
-0.09447098523378372,
-0.029183639213442802,
-0.08102991431951523,
-0.020375126972794533,
-0.028368329629302025,
0.01924549974501133,
0.02361694909632206,
-0.01794939860701561,
0.0790197104215622,
-0.13660889863967896,
0.008003775961697102,
-0.04920471832156181,
-0.09954071044921875,
0.004212986212223768,
0.07520979642868042,
0.09063379466533661,
0.08451594412326813,
-0.013371768407523632,
0.024403750896453857,
-0.03980037197470665,
0.23261968791484833,
-0.056548379361629486,
0.011438420042395592,
0.11693695932626724,
-0.01430984865874052,
0.051830828189849854,
0.09481924772262573,
0.037167083472013474,
-0.09219378978013992,
0.02333798259496689,
0.07984504848718643,
-0.03747778758406639,
-0.23014222085475922,
-0.015082921832799911,
-0.006269741803407669,
-0.08320446312427521,
0.10252389311790466,
0.03242575377225876,
-0.05117712914943695,
0.0408603735268116,
0.01837816834449768,
-0.0102074034512043,
-0.039278168231248856,
0.06880326569080353,
0.07693703472614288,
0.04724378138780594,
0.1088317409157753,
-0.004694676958024502,
-0.0191084872931242,
0.05430811643600464,
0.016245748847723007,
0.2632567882537842,
-0.04119715094566345,
0.10373038053512573,
0.0332619734108448,
0.1489609181880951,
-0.021744851022958755,
0.06448238343000412,
0.00038683286402374506,
-0.010163369588553905,
-0.012084070593118668,
-0.06169737130403519,
-0.02829892188310623,
0.01334697287529707,
-0.043561168015003204,
0.02232222817838192,
-0.08211132138967514,
0.02652807906270027,
0.021548323333263397,
0.28635290265083313,
0.030307063832879066,
-0.2536201477050781,
-0.07694584876298904,
-0.014423253946006298,
-0.05144896358251572,
-0.05963714420795441,
0.008130524307489395,
0.13753929734230042,
-0.13987234234809875,
0.04564684256911278,
-0.07838781177997589,
0.08704580366611481,
-0.04905899241566658,
0.011592354625463486,
0.050864722579717636,
0.14887037873268127,
-0.017520088702440262,
0.05536910519003868,
-0.19515161216259003,
0.25506773591041565,
0.01751115173101425,
0.1038893461227417,
-0.06533139944076538,
0.012948589399456978,
0.022459914907813072,
0.0182126984000206,
0.11674312502145767,
0.002384138060733676,
-0.07271113991737366,
-0.14554353058338165,
-0.09100218862295151,
0.0478258915245533,
0.1421830654144287,
-0.04596741870045662,
0.08989059180021286,
-0.03594840690493584,
0.012230313383042812,
0.03672266751527786,
-0.035610001534223557,
-0.14809535443782806,
-0.08684085309505463,
-0.0005085903103463352,
0.006131309550255537,
-0.007548199500888586,
-0.061382681131362915,
-0.10570470988750458,
-0.009682177565991879,
0.104959636926651,
0.004111692775040865,
-0.05453537032008171,
-0.15832021832466125,
0.08911964297294617,
0.14395882189273834,
-0.058085981756448746,
0.011969881132245064,
0.016677353531122208,
0.11215080320835114,
0.035370174795389175,
-0.0785369798541069,
0.06202084943652153,
-0.06253685057163239,
-0.18080838024616241,
-0.05517752841114998,
0.12353026866912842,
0.0823168009519577,
0.049776166677474976,
-0.0003783093416132033,
0.05091095715761185,
0.0008890284807421267,
-0.09620124101638794,
0.035683974623680115,
0.007538207806646824,
0.03561266139149666,
0.01676645688712597,
-0.08779551088809967,
0.09785543382167816,
-0.03655954822897911,
0.009754979982972145,
0.1296490877866745,
0.2078988254070282,
-0.10544224083423615,
0.11348098516464233,
0.08603310585021973,
-0.07435199618339539,
-0.16732320189476013,
0.060454998165369034,
0.13076753914356232,
0.011490067467093468,
0.08487933874130249,
-0.21407192945480347,
0.1233515813946724,
0.09888333082199097,
-0.011185236275196075,
0.009832603856921196,
-0.2778167426586151,
-0.127937451004982,
0.0582427904009819,
0.1094115674495697,
0.04102541133761406,
-0.11713376641273499,
-0.03609967231750488,
-0.0031393994577229023,
-0.09210588037967682,
0.11114823073148727,
-0.0721588209271431,
0.11540618538856506,
-0.016566870734095573,
0.11142680794000626,
0.025442376732826233,
-0.030724845826625824,
0.10752785950899124,
0.06069277599453926,
0.07978585362434387,
-0.03460108861327171,
0.00800241343677044,
0.054627109318971634,
-0.05594782531261444,
0.015521599911153316,
-0.04319874569773674,
0.06697314977645874,
-0.1499212682247162,
-0.00032162622665055096,
-0.09279219061136246,
0.05057232826948166,
-0.04924871772527695,
-0.07143409550189972,
-0.013839603401720524,
0.054162535816431046,
0.07466664165258408,
-0.03997396305203438,
0.02551034465432167,
-0.005921399686485529,
0.09864988923072815,
0.09515352547168732,
0.08147773891687393,
-0.015284217894077301,
-0.09227820485830307,
0.0108242928981781,
0.004714335314929485,
0.05421517416834831,
-0.1052526980638504,
0.01352702360600233,
0.13756966590881348,
0.06598994880914688,
0.09552686661481857,
0.04836248233914375,
-0.04076889157295227,
0.0033822129480540752,
0.013033383525907993,
-0.12063353508710861,
-0.11476677656173706,
0.023792903870344162,
-0.04513058066368103,
-0.15517622232437134,
0.021951479837298393,
0.11926163733005524,
-0.04082197695970535,
-0.01802939549088478,
-0.007867044769227505,
0.006024311296641827,
-0.012968341819941998,
0.1861274540424347,
0.04536086693406105,
0.06391312181949615,
-0.08803708106279373,
0.10681752115488052,
0.035869233310222626,
-0.05423538014292717,
0.05040016770362854,
0.06382765620946884,
-0.10350096970796585,
0.007785893511027098,
0.07701274007558823,
0.12479303032159805,
-0.04724931716918945,
-0.009001052938401699,
-0.08864301443099976,
-0.08502722531557083,
0.04188208654522896,
0.13161960244178772,
0.05387358367443085,
-0.001286987797357142,
-0.07009157538414001,
0.041984494775533676,
-0.11886410415172577,
0.0718965083360672,
0.04458668455481529,
0.07035411894321442,
-0.1008734405040741,
0.12889620661735535,
-0.0014156077522784472,
0.027499353513121605,
-0.02618018165230751,
0.015560859814286232,
-0.0960320234298706,
-0.024620912969112396,
-0.105268694460392,
-0.025219665840268135,
-0.009171930141746998,
0.00057419907534495,
-0.022651076316833496,
-0.07438798993825912,
-0.02686288394033909,
0.03899999335408211,
-0.07640115171670914,
-0.05049235746264458,
0.014275413937866688,
0.04048432782292366,
-0.15071460604667664,
0.001440435997210443,
0.028902383521199226,
-0.093331478536129,
0.09091182053089142,
0.06228776276111603,
0.015042795799672604,
0.0275831688195467,
-0.11382352560758591,
-0.0282987579703331,
-0.01071737427264452,
0.004887616261839867,
0.06530285626649857,
-0.09773819893598557,
-0.027377672493457794,
-0.039256662130355835,
0.045960456132888794,
0.017495477572083473,
0.09745827317237854,
-0.11749525368213654,
-0.0053282990120351315,
-0.04016849398612976,
-0.04207766801118851,
-0.06262467056512833,
0.03584001213312149,
0.10213373601436615,
0.054794423282146454,
0.14940886199474335,
-0.07351765781641006,
0.05953628942370415,
-0.20172682404518127,
-0.03597846254706383,
0.010258211754262447,
-0.04324157536029816,
-0.08313178271055222,
-0.05131981521844864,
0.08947182446718216,
-0.04525591433048248,
0.10450088977813721,
-0.020061058923602104,
0.11110924184322357,
0.042752400040626526,
-0.00951308198273182,
-0.060401398688554764,
-0.005868091247975826,
0.18668141961097717,
0.05805082246661186,
-0.01650058850646019,
0.130834698677063,
-0.00039829115848988295,
0.0291602686047554,
0.08711864799261093,
0.22522301971912384,
0.1616208404302597,
0.0013072679284960032,
0.06395559012889862,
0.061047762632369995,
-0.07374239712953568,
-0.15167006850242615,
0.11812741309404373,
-0.01931259036064148,
0.10227879881858826,
-0.06753704696893692,
0.18978743255138397,
0.039641764014959335,
-0.1829543262720108,
0.06458018720149994,
-0.026581058278679848,
-0.11112559586763382,
-0.1216837614774704,
-0.0239051952958107,
-0.0691627785563469,
-0.12031764537096024,
0.024191254749894142,
-0.11772765964269638,
0.06243725121021271,
0.10237441211938858,
0.009160619229078293,
0.03821227699518204,
0.1853107064962387,
-0.044227562844753265,
0.011201243847608566,
0.08308226615190506,
0.019551202654838562,
0.0063659329898655415,
-0.043247513473033905,
-0.06549090892076492,
0.036005232483148575,
0.032787591218948364,
0.06249316781759262,
-0.05216774344444275,
-0.0013217562809586525,
0.008843068033456802,
-0.007252118084579706,
-0.07710210978984833,
0.010643625631928444,
0.010290345177054405,
0.053926289081573486,
0.05213295295834541,
0.04583664983510971,
0.005787109024822712,
-0.05358852818608284,
0.29893648624420166,
-0.06979342550039291,
-0.06857907772064209,
-0.12959207594394684,
0.2066716104745865,
0.02247431129217148,
-0.022249598056077957,
0.05446821451187134,
-0.0843958631157875,
-0.014133348129689693,
0.17116519808769226,
0.132171630859375,
-0.0924597680568695,
-0.016009218990802765,
-0.014347083866596222,
-0.009869220666587353,
-0.013517705723643303,
0.11701670289039612,
0.07663612812757492,
-0.01007302850484848,
-0.06950876861810684,
-0.01829909160733223,
-0.021034184843301773,
-0.0568973645567894,
-0.06164044141769409,
0.0703057199716568,
0.02631935477256775,
-0.007185295689851046,
-0.06166786700487137,
0.06989067792892456,
-0.0009617937030270696,
-0.24299004673957825,
0.042272765189409256,
-0.17185404896736145,
-0.17022055387496948,
-0.026947928592562675,
0.07275231927633286,
0.0065215639770030975,
0.057709693908691406,
0.0015345755964517593,
0.019541539251804352,
0.12166135758161545,
-0.012355785816907883,
-0.0032916655763983727,
-0.11084424704313278,
0.11811627447605133,
-0.08595547080039978,
0.19823750853538513,
-0.00684838742017746,
0.05305914953351021,
0.09674685448408127,
0.04051864147186279,
-0.13981333374977112,
0.016743648797273636,
0.06552521139383316,
-0.13031071424484253,
-0.0012668659910559654,
0.15053154528141022,
-0.033030860126018524,
0.0614938959479332,
0.026110678911209106,
-0.15363486111164093,
0.007781572639942169,
0.015750521793961525,
-0.03716609254479408,
-0.06715506315231323,
-0.006380816455930471,
-0.051687318831682205,
0.1672428697347641,
0.21967481076717377,
-0.029492760077118874,
0.005183402448892593,
-0.08969541639089584,
0.010043359361588955,
0.04472146928310394,
0.06473338603973389,
-0.04218319058418274,
-0.20473617315292358,
0.010722261853516102,
0.06394492089748383,
-0.004471865016967058,
-0.1943143904209137,
-0.09850770235061646,
0.0529039092361927,
-0.04075957089662552,
-0.04244934767484665,
0.09547333419322968,
0.02085127867758274,
0.03654183819890022,
-0.011978868395090103,
-0.12019786238670349,
-0.021509112790226936,
0.1392178237438202,
-0.17835578322410583,
-0.028658106923103333
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-8
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
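The step-based schedule above maps directly onto `transformers.TrainingArguments`. The snippet below is a minimal sketch, not the exact training script used for this checkpoint; `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments.
```python
from transformers import TrainingArguments

# Sketch of the configuration described above (assumed, not the original script).
args = TrainingArguments(
    output_dir="out",             # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",   # linear decay after warmup
    warmup_ratio=0.1,             # 10% of the 200 steps as warmup
    max_steps=200,
)
```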
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-8
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-16-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09894105792045593,
0.09723123908042908,
-0.002355189761146903,
0.09236311167478561,
0.12456710636615753,
0.01952957920730114,
0.09263291209936142,
0.12964272499084473,
-0.09850653260946274,
0.06795328110456467,
0.08770941197872162,
0.033081259578466415,
0.041551291942596436,
0.13992412388324738,
-0.005153443198651075,
-0.27777063846588135,
-0.0011231901589781046,
-0.0032241009175777435,
-0.051239583641290665,
0.11947889626026154,
0.08901669085025787,
-0.10966302454471588,
0.0738266110420227,
0.0075375353917479515,
-0.15241459012031555,
0.016998762264847755,
-0.030950374901294708,
-0.03407140448689461,
0.12341894209384918,
-0.027046198025345802,
0.10684213042259216,
0.02888362668454647,
0.13690349459648132,
-0.21107937395572662,
0.007112462539225817,
0.0781061127781868,
0.0547059066593647,
0.09790081530809402,
0.04754716530442238,
0.012760059908032417,
0.09982486069202423,
-0.14769062399864197,
0.09592874348163605,
0.030055923387408257,
-0.08973146229982376,
-0.15394286811351776,
-0.08810518682003021,
0.027474476024508476,
0.052231721580028534,
0.07558105885982513,
0.0034819231368601322,
0.1340741664171219,
-0.07210100442171097,
0.0863889530301094,
0.25155743956565857,
-0.31222712993621826,
-0.06767212599515915,
0.02505720965564251,
0.06034889072179794,
0.06283747404813766,
-0.1251887083053589,
-0.0024083685129880905,
0.01696726679801941,
0.02739088609814644,
0.12363148480653763,
-0.012635638937354088,
-0.10676135122776031,
-0.01073154341429472,
-0.12279379367828369,
-0.0022578355856239796,
0.061351459473371506,
0.02890097349882126,
-0.04942229762673378,
-0.10808248817920685,
-0.0679529458284378,
-0.08127347379922867,
-0.02310258336365223,
-0.05196402594447136,
0.04635642096400261,
-0.05154343694448471,
-0.09602829813957214,
-0.04358264058828354,
-0.059189628809690475,
-0.08073347806930542,
-0.005547996144741774,
0.16877302527427673,
0.033768150955438614,
0.01977100782096386,
-0.029925094917416573,
0.11572293192148209,
0.027719179168343544,
-0.13790546357631683,
-0.011145598255097866,
-0.0030736338812857866,
-0.09857213497161865,
-0.04069060459733009,
-0.056163493543863297,
-0.0070309992879629135,
0.0022060966584831476,
0.16579191386699677,
-0.07246663421392441,
0.0745621994137764,
0.014766788110136986,
-0.026465291157364845,
-0.014999018050730228,
0.1535874903202057,
-0.04230997711420059,
-0.044526174664497375,
-0.015908824279904366,
0.08145076781511307,
-0.0026864793617278337,
-0.021776361390948296,
-0.06537161767482758,
-0.028821803629398346,
0.09597936272621155,
0.05539810284972191,
-0.05895240604877472,
0.03702739253640175,
-0.027197428047657013,
-0.02522173710167408,
0.01811951771378517,
-0.11893834918737411,
0.04052244499325752,
0.0010810302337631583,
-0.0763823464512825,
-0.004458056762814522,
0.001180913532152772,
-0.010344365611672401,
-0.005580799654126167,
0.09930243343114853,
-0.08510787039995193,
-0.005019776057451963,
-0.06710170954465866,
-0.07824181020259857,
-0.000039270100387511775,
-0.14442656934261322,
-0.007814223878085613,
-0.05814239755272865,
-0.16335663199424744,
-0.03714781627058983,
0.042898744344711304,
-0.07529591768980026,
-0.01575065404176712,
-0.04318384826183319,
-0.061743397265672684,
0.02214048244059086,
-0.012536287307739258,
0.1997756063938141,
-0.05029371753334999,
0.08320640027523041,
-0.009365183301270008,
0.04829147085547447,
0.029661357402801514,
0.039212483912706375,
-0.09557610750198364,
0.02596166543662548,
-0.13387364149093628,
0.084391288459301,
-0.0855015441775322,
-0.004769599996507168,
-0.13585805892944336,
-0.09852824360132217,
0.008120022714138031,
-0.018582167103886604,
0.08923356980085373,
0.13378597795963287,
-0.19516383111476898,
-0.02080497518181801,
0.12367171049118042,
-0.07471691817045212,
-0.04418976604938507,
0.06395858526229858,
-0.06456980854272842,
0.03847692906856537,
0.05396924167871475,
0.20689699053764343,
0.06323157250881195,
-0.1494757980108261,
-0.0057604131288826466,
0.01447305642068386,
0.05169776827096939,
0.02897939831018448,
0.04266281798481941,
0.0020291805267333984,
0.05558051913976669,
0.013977346941828728,
-0.09229546785354614,
-0.02115701325237751,
-0.09094469994306564,
-0.06519082933664322,
-0.04978805407881737,
-0.07497133314609528,
0.05456259474158287,
0.011632389388978481,
0.03813439980149269,
-0.059704117476940155,
-0.10597650706768036,
0.1159248873591423,
0.10004165768623352,
-0.055734019726514816,
0.038229476660490036,
-0.07655449956655502,
0.010785329155623913,
-0.005455952603369951,
-0.03399830684065819,
-0.2127159833908081,
-0.12388026714324951,
0.047817934304475784,
-0.034629374742507935,
0.020323457196354866,
0.01535322330892086,
0.08475523442029953,
0.05636162310838699,
-0.05282258614897728,
-0.013904405757784843,
-0.09831982105970383,
0.002134934300556779,
-0.11491116881370544,
-0.19114744663238525,
-0.08781404048204422,
-0.04536639153957367,
0.09699180722236633,
-0.17917510867118835,
-0.008609922602772713,
0.02352963574230671,
0.1312103420495987,
0.025622788816690445,
-0.06782164424657822,
-0.0010223388671875,
0.042964592576026917,
0.01217744592577219,
-0.09662739932537079,
0.05508529394865036,
0.01220696046948433,
-0.10697523504495621,
-0.045907359570264816,
-0.126937136054039,
-0.01714731566607952,
0.05048341676592827,
0.05991325527429581,
-0.09990686923265457,
-0.0595746785402298,
-0.07401398569345474,
-0.038006704300642014,
-0.07804427295923233,
0.01586787775158882,
0.21350160241127014,
0.039420973509550095,
0.11039938032627106,
-0.0613679513335228,
-0.08179808408021927,
-0.007557053118944168,
0.031082669273018837,
0.024793418124318123,
0.08929247409105301,
0.02259274199604988,
-0.041092321276664734,
0.06620338559150696,
0.10327430069446564,
-0.02168574556708336,
0.12962453067302704,
-0.05540072172880173,
-0.08422135561704636,
-0.029305854812264442,
-0.017548291012644768,
-0.025890333577990532,
0.12450657039880753,
-0.0383291095495224,
0.0004449633997865021,
0.03470812737941742,
0.04008171334862709,
0.011594832874834538,
-0.1683613657951355,
0.0016586276469752192,
0.03142679110169411,
-0.05566299706697464,
-0.0431552492082119,
-0.005604229401797056,
0.018191296607255936,
0.08585688471794128,
0.030581718310713768,
-0.003228292567655444,
0.007406657561659813,
-0.013180343434214592,
-0.057151827961206436,
0.1907035857439041,
-0.0928945541381836,
-0.0765952616930008,
-0.0727655366063118,
0.017751941457390785,
-0.043950773775577545,
-0.0370996929705143,
0.006902385037392378,
-0.09279367327690125,
-0.028944429010152817,
-0.08136904239654541,
-0.022167984396219254,
-0.027760570868849754,
0.019216176122426987,
0.024029606953263283,
-0.01838167943060398,
0.07954984158277512,
-0.1363925337791443,
0.00774107500910759,
-0.04853895679116249,
-0.0986320748925209,
0.0046554990112781525,
0.07484930753707886,
0.0915357768535614,
0.08503468334674835,
-0.014254082925617695,
0.0242014080286026,
-0.0395340770483017,
0.23231548070907593,
-0.05606517195701599,
0.01175647135823965,
0.1172252893447876,
-0.015338459052145481,
0.051970481872558594,
0.09419099986553192,
0.037459392100572586,
-0.09220991283655167,
0.02287290059030056,
0.07873785495758057,
-0.03816566988825798,
-0.22926682233810425,
-0.01502018328756094,
-0.006353309378027916,
-0.08293656259775162,
0.10274143517017365,
0.032252971082925797,
-0.05060984566807747,
0.041392307728528976,
0.018645670264959335,
-0.008591797202825546,
-0.04007590562105179,
0.06864698231220245,
0.07748803496360779,
0.047234855592250824,
0.10865111649036407,
-0.0043703289702534676,
-0.019252164289355278,
0.05458423122763634,
0.015596846118569374,
0.26120278239250183,
-0.0414741225540638,
0.1046038493514061,
0.032042406499385834,
0.1496841311454773,
-0.021789662539958954,
0.0647052749991417,
0.00008215284469770268,
-0.010499761439859867,
-0.012248926796019077,
-0.061894990503787994,
-0.029920874163508415,
0.013921953737735748,
-0.04404430836439133,
0.0227334164083004,
-0.08239344507455826,
0.027324886992573738,
0.02122051827609539,
0.2858733534812927,
0.030126864090561867,
-0.2540241479873657,
-0.07724165916442871,
-0.014810982160270214,
-0.05122248828411102,
-0.0602448545396328,
0.00806504487991333,
0.13846805691719055,
-0.1393948495388031,
0.04515935108065605,
-0.07751910388469696,
0.08714870363473892,
-0.05014670640230179,
0.011888762935996056,
0.04991912096738815,
0.1487300544977188,
-0.017168182879686356,
0.055793456733226776,
-0.19573061168193817,
0.2534012794494629,
0.01791793294250965,
0.1042553186416626,
-0.06569787114858627,
0.013471605256199837,
0.021967072039842606,
0.02008655108511448,
0.11603385210037231,
0.00281723914667964,
-0.0712740421295166,
-0.14706405997276306,
-0.09136231988668442,
0.04785457253456116,
0.14093641936779022,
-0.0444365069270134,
0.08953306823968887,
-0.036054570227861404,
0.012338051572442055,
0.036991678178310394,
-0.034930117428302765,
-0.14795733988285065,
-0.08755264431238174,
-0.0010758813004940748,
0.00735871959477663,
-0.007270497735589743,
-0.06110035628080368,
-0.10527059435844421,
-0.010934695601463318,
0.10520800948143005,
0.004937910940498114,
-0.05478907376527786,
-0.1583040952682495,
0.08967800438404083,
0.14310340583324432,
-0.058245476335287094,
0.011676570400595665,
0.016648804768919945,
0.11230231821537018,
0.035147927701473236,
-0.07772473990917206,
0.0620453767478466,
-0.06222246214747429,
-0.1796710342168808,
-0.05553295090794563,
0.12268160283565521,
0.081880584359169,
0.049995169043540955,
0.00009159970068139955,
0.05061594769358635,
0.000765131670050323,
-0.09623443335294724,
0.03446795791387558,
0.008084708824753761,
0.03471766412258148,
0.01671689935028553,
-0.0878298282623291,
0.09886081516742706,
-0.03628149256110191,
0.010006539523601532,
0.13080333173274994,
0.20730091631412506,
-0.10580452531576157,
0.11263766884803772,
0.08691943436861038,
-0.07425311952829361,
-0.16707764565944672,
0.06001760810613632,
0.13018624484539032,
0.011623536236584187,
0.08494177460670471,
-0.21352463960647583,
0.12335830181837082,
0.09984926879405975,
-0.010723834857344627,
0.009503412060439587,
-0.27779272198677063,
-0.12811748683452606,
0.05916128680109978,
0.10934282094240189,
0.0431600920855999,
-0.11763250827789307,
-0.03599619120359421,
-0.003960160072892904,
-0.09338343143463135,
0.11025669425725937,
-0.07186206430196762,
0.11519985646009445,
-0.016289539635181427,
0.10996514558792114,
0.025636348873376846,
-0.030903423205018044,
0.1079735979437828,
0.06128600239753723,
0.07975910604000092,
-0.0348571352660656,
0.008431079797446728,
0.05438391491770744,
-0.05612995848059654,
0.01661355048418045,
-0.042997125536203384,
0.06707921624183655,
-0.15173189342021942,
-0.000686849292833358,
-0.09143270552158356,
0.05112887918949127,
-0.04885254427790642,
-0.07186444103717804,
-0.013750048354268074,
0.0530376099050045,
0.07449464499950409,
-0.039849575608968735,
0.026464667171239853,
-0.005537963472306728,
0.09777985513210297,
0.09681672602891922,
0.08004585653543472,
-0.018532341346144676,
-0.09243535995483398,
0.010858905501663685,
0.0044853645376861095,
0.05463702231645584,
-0.1049627810716629,
0.014021906070411205,
0.13776777684688568,
0.06597796827554703,
0.09577295929193497,
0.047319840639829636,
-0.040180038660764694,
0.003402617061510682,
0.012537332251667976,
-0.12044604867696762,
-0.11391962319612503,
0.02315724454820156,
-0.045495372265577316,
-0.15463495254516602,
0.020913688465952873,
0.11978743225336075,
-0.04167905077338219,
-0.017336400225758553,
-0.008158833719789982,
0.004730673041194677,
-0.012986515648663044,
0.18576431274414062,
0.045997731387615204,
0.06368881464004517,
-0.08790803700685501,
0.10643690079450607,
0.036407582461833954,
-0.05342816561460495,
0.05090213567018509,
0.06325472146272659,
-0.10385167598724365,
0.007611595559865236,
0.07708225399255753,
0.12472265958786011,
-0.04805966839194298,
-0.010154594667255878,
-0.08920112997293472,
-0.08397317677736282,
0.041466161608695984,
0.1303914338350296,
0.05429137498140335,
-0.0016213607741519809,
-0.07029691338539124,
0.04120754823088646,
-0.11910545080900192,
0.07137279212474823,
0.04408548027276993,
0.07055649161338806,
-0.10114304721355438,
0.13022784888744354,
-0.0006387680186890066,
0.027579884976148605,
-0.02619689330458641,
0.015058579854667187,
-0.09603933990001678,
-0.024273335933685303,
-0.10667124390602112,
-0.02499525062739849,
-0.00902076531201601,
0.0010579974623396993,
-0.022661643102765083,
-0.074154332280159,
-0.026854438707232475,
0.03890277445316315,
-0.07568936049938202,
-0.05028946325182915,
0.014626368880271912,
0.04008140042424202,
-0.15027694404125214,
0.001060917042195797,
0.02857840247452259,
-0.09315013140439987,
0.09119265526533127,
0.061766937375068665,
0.014668113552033901,
0.02724575623869896,
-0.11395099759101868,
-0.028123943135142326,
-0.010673223994672298,
0.0058119408786296844,
0.0652933120727539,
-0.0961846187710762,
-0.026526428759098053,
-0.038774263113737106,
0.04582354426383972,
0.01738300919532776,
0.09793222695589066,
-0.11772798746824265,
-0.00521849887445569,
-0.04006574675440788,
-0.04214448109269142,
-0.06309386342763901,
0.035911381244659424,
0.101850725710392,
0.05405886098742485,
0.14928317070007324,
-0.07335328310728073,
0.05935065075755119,
-0.20203897356987,
-0.03624437004327774,
0.01052274089306593,
-0.04282228276133537,
-0.08284713327884674,
-0.0522582121193409,
0.08919264376163483,
-0.04469704255461693,
0.10623835772275925,
-0.019901584833860397,
0.1117250919342041,
0.04227413609623909,
-0.010144231840968132,
-0.06028490141034126,
-0.006590850651264191,
0.1875971555709839,
0.05927534028887749,
-0.01648654416203499,
0.130029559135437,
-0.00008159768185578287,
0.030267123132944107,
0.08652106672525406,
0.2225494235754013,
0.1614730954170227,
0.0005243022460490465,
0.06400444358587265,
0.06088050827383995,
-0.07319948822259903,
-0.1518172025680542,
0.11778085678815842,
-0.0194243174046278,
0.10256099700927734,
-0.06738993525505066,
0.18978416919708252,
0.03975936770439148,
-0.18283385038375854,
0.06434519588947296,
-0.025823399424552917,
-0.111399807035923,
-0.12185985594987869,
-0.023397047072649002,
-0.06920591741800308,
-0.12043291330337524,
0.02374386601150036,
-0.11709970980882645,
0.06193520501255989,
0.10272630304098129,
0.008688570000231266,
0.03802153095602989,
0.18477894365787506,
-0.04427243396639824,
0.011622540652751923,
0.08272451162338257,
0.019356323406100273,
0.006749111693352461,
-0.044420599937438965,
-0.06628817319869995,
0.036099810153245926,
0.03290153667330742,
0.06313260644674301,
-0.05223676189780235,
0.0006007922347635031,
0.00926295481622219,
-0.006889106705784798,
-0.07754816859960556,
0.01053713634610176,
0.009666122496128082,
0.05357031524181366,
0.05073637142777443,
0.046135056763887405,
0.005897964350879192,
-0.05382450297474861,
0.2972237169742584,
-0.06918913871049881,
-0.06922858208417892,
-0.12950871884822845,
0.20573623478412628,
0.022913921624422073,
-0.021888229995965958,
0.054444119334220886,
-0.08439681679010391,
-0.013490252196788788,
0.17139868438243866,
0.1314297318458557,
-0.09310282766819,
-0.016025107353925705,
-0.01403393130749464,
-0.009972193278372288,
-0.014670057222247124,
0.11759091913700104,
0.07681915909051895,
-0.01115326676517725,
-0.06880100071430206,
-0.018465349450707436,
-0.02126765437424183,
-0.05691201239824295,
-0.06280943006277084,
0.06950905174016953,
0.026474950835108757,
-0.006415003910660744,
-0.061260245740413666,
0.06936372816562653,
-0.00004320088555687107,
-0.24356156587600708,
0.04258216544985771,
-0.1720910370349884,
-0.16999952495098114,
-0.026798587292432785,
0.07298164069652557,
0.006618769373744726,
0.05737950652837753,
0.0015947073698043823,
0.019924696534872055,
0.12238777428865433,
-0.01257668249309063,
-0.0039123669266700745,
-0.10955021530389786,
0.11739682406187057,
-0.0846085473895073,
0.19794835150241852,
-0.006836653687059879,
0.053886257112026215,
0.09666524827480316,
0.040849994868040085,
-0.1389726847410202,
0.017152579501271248,
0.06522156298160553,
-0.1291395127773285,
-0.001505573745816946,
0.14902526140213013,
-0.03310904651880264,
0.06141487881541252,
0.026594869792461395,
-0.15333916246891022,
0.00700880354270339,
0.01449801865965128,
-0.03713136538863182,
-0.06675875931978226,
-0.007962332107126713,
-0.051055606454610825,
0.16729736328125,
0.21876654028892517,
-0.02922227419912815,
0.004569387529045343,
-0.08988143503665924,
0.009869735687971115,
0.045263078063726425,
0.06456246227025986,
-0.04233761131763458,
-0.20435670018196106,
0.01117687113583088,
0.06381885707378387,
-0.004271874204277992,
-0.19413837790489197,
-0.09901805222034454,
0.05286240205168724,
-0.04125669226050377,
-0.04204986244440079,
0.09549898654222488,
0.020963257178664207,
0.036917366087436676,
-0.012191231362521648,
-0.12041520327329636,
-0.021949883550405502,
0.13911408185958862,
-0.17830224335193634,
-0.028945138677954674
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
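Pending a fuller description, here is a hedged usage sketch for extractive question answering; it assumes the checkpoint is published under the repository id taken from this card's name.
```python
from transformers import pipeline

# Usage sketch; the repository id below is assumed from this card's name.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0",
)
out = qa(
    question="What dataset was this model fine-tuned on?",
    context="This checkpoint was fine-tuned on the SQuAD dataset.",
)
print(out["answer"], out["score"])
```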
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
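In `TrainingArguments` terms this is the same sketch as the step-based few-shot runs earlier in this dump, with the step budget replaced by an epoch count; again an assumption about how the settings map onto the API, not the original script.
```python
from transformers import TrainingArguments

# Sketch only: identical to the step-based configuration shown earlier,
# except max_steps=200 is replaced by num_train_epochs=10.0.
args = TrainingArguments(
    output_dir="out",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```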
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09585482627153397,
0.11574802547693253,
-0.0023232675157487392,
0.09164950996637344,
0.11940724402666092,
0.02194148115813732,
0.1003817692399025,
0.12840045988559723,
-0.09804999083280563,
0.08735804259777069,
0.08824559301137924,
0.03891035169363022,
0.04693329334259033,
0.14471150934696198,
-0.019851794466376305,
-0.25978803634643555,
0.010819676332175732,
-0.0035582613199949265,
-0.03468380123376846,
0.11191434413194656,
0.08480123430490494,
-0.11075669527053833,
0.08644692599773407,
0.014785396866500378,
-0.1541179120540619,
0.0191914364695549,
-0.037324097007513046,
-0.035407230257987976,
0.11346729099750519,
-0.03228607401251793,
0.10853424668312073,
0.02519115060567856,
0.13295391201972961,
-0.21043100953102112,
0.004883426241576672,
0.07441530376672745,
0.04619016870856285,
0.10057712346315384,
0.05103877931833267,
0.015057035721838474,
0.09053568542003632,
-0.15366116166114807,
0.09273097664117813,
0.029710445553064346,
-0.09107404947280884,
-0.1306881457567215,
-0.0953712984919548,
0.024288004264235497,
0.051006488502025604,
0.06939321011304855,
0.0014297080924734473,
0.15234588086605072,
-0.05985381826758385,
0.07882615178823471,
0.26684918999671936,
-0.32615965604782104,
-0.06411067396402359,
0.031629182398319244,
0.05918838456273079,
0.05231211334466934,
-0.12299878150224686,
-0.0061492593958973885,
0.02673209458589554,
0.030025823041796684,
0.11858604848384857,
-0.01670728251338005,
-0.11313708126544952,
-0.013051186688244343,
-0.12778112292289734,
-0.0009471868979744613,
0.0706915631890297,
0.03543437272310257,
-0.05196481943130493,
-0.09425334632396698,
-0.07526201754808426,
-0.09277278929948807,
-0.025321442633867264,
-0.06377796083688736,
0.05664108321070671,
-0.05521264672279358,
-0.08221978694200516,
-0.035773586481809616,
-0.056746210902929306,
-0.07576826214790344,
-0.01926831714808941,
0.15887317061424255,
0.04002712294459343,
0.02039349265396595,
-0.03257617726922035,
0.10835811495780945,
0.003420233493670821,
-0.142034649848938,
-0.01615915820002556,
-0.0012464536121115088,
-0.09645719081163406,
-0.04637749493122101,
-0.05116526782512665,
-0.018215574324131012,
0.010867050848901272,
0.17665870487689972,
-0.08145212382078171,
0.07586947083473206,
0.009633307345211506,
-0.029361313208937645,
-0.00688878633081913,
0.14861877262592316,
-0.04383424296975136,
-0.04590560868382454,
-0.010499994270503521,
0.07339520007371902,
0.0031646178103983402,
-0.015182620845735073,
-0.06461455672979355,
-0.026951860636472702,
0.10215016454458237,
0.04526449739933014,
-0.05916410684585571,
0.040513891726732254,
-0.023002296686172485,
-0.02849333919584751,
0.017047669738531113,
-0.11494369804859161,
0.04415552690625191,
-0.0018468365306034684,
-0.08463974297046661,
-0.0006676645134575665,
0.000762382464017719,
-0.00572221027687192,
-0.008291170932352543,
0.11162086576223373,
-0.0998891219496727,
-0.0020102374255657196,
-0.06390112638473511,
-0.08323792368173599,
0.00890190526843071,
-0.15563851594924927,
-0.016376199200749397,
-0.05795866996049881,
-0.16890151798725128,
-0.031339503824710846,
0.03725988045334816,
-0.07394540309906006,
-0.008134701289236546,
-0.04833504930138588,
-0.0652252733707428,
0.025109322741627693,
-0.014649261720478535,
0.1729235202074051,
-0.054293520748615265,
0.07282973825931549,
-0.0007421004702337086,
0.045556940138339996,
0.014587026089429855,
0.035964030772447586,
-0.10564084351062775,
0.024883002042770386,
-0.13819310069084167,
0.06924803555011749,
-0.08462658524513245,
-0.002803192939609289,
-0.13331709802150726,
-0.09845825284719467,
0.010089945048093796,
-0.02187279984354973,
0.09120915085077286,
0.13876031339168549,
-0.19320239126682281,
-0.01896631345152855,
0.12708009779453278,
-0.07471200823783875,
-0.06415227055549622,
0.06259337812662125,
-0.060860175639390945,
0.029179930686950684,
0.051779668778181076,
0.2111426144838333,
0.03905785456299782,
-0.16652986407279968,
-0.03315328434109688,
-0.006066650152206421,
0.04122345149517059,
0.027050649747252464,
0.03975030779838562,
0.004006355069577694,
0.06361688673496246,
0.014659716747701168,
-0.07608246058225632,
-0.03220272809267044,
-0.09163583070039749,
-0.06478875875473022,
-0.054931141436100006,
-0.07188321650028229,
0.04111519455909729,
0.0035578112583607435,
0.042180128395557404,
-0.06490558385848999,
-0.10199814289808273,
0.1198849305510521,
0.09637132287025452,
-0.04842076823115349,
0.037448033690452576,
-0.0791599377989769,
0.019636960700154305,
-0.019837742671370506,
-0.03885011374950409,
-0.20660202205181122,
-0.13042904436588287,
0.0522303432226181,
-0.057205457240343094,
0.03435857966542244,
0.004900083411484957,
0.08088211715221405,
0.061586108058691025,
-0.04347430169582367,
-0.011435892432928085,
-0.09351922571659088,
0.0030791054014116526,
-0.11718863248825073,
-0.18898944556713104,
-0.07731851190328598,
-0.03974906727671623,
0.0936296358704567,
-0.17312650382518768,
-0.007297433912754059,
0.015933359041810036,
0.1441730111837387,
0.027552761137485504,
-0.06818035989999771,
-0.00332432403229177,
0.038502346724271774,
0.002051887335255742,
-0.09538659453392029,
0.04471360146999359,
0.008542035706341267,
-0.09362947940826416,
-0.06234649941325188,
-0.13531142473220825,
-0.010454432107508183,
0.060764238238334656,
0.0512109138071537,
-0.09677666425704956,
-0.04512317478656769,
-0.07071776688098907,
-0.040667574852705,
-0.07611922919750214,
0.013495689257979393,
0.20184524357318878,
0.03566104918718338,
0.11316510289907455,
-0.06735606491565704,
-0.0771314948797226,
-0.003113025799393654,
0.021992497146129608,
0.012231576256453991,
0.0770697295665741,
0.04098444804549217,
-0.05296354368329048,
0.0743933692574501,
0.09899268299341202,
-0.023422598838806152,
0.12454879283905029,
-0.046861618757247925,
-0.08376988768577576,
-0.03350363299250603,
-0.02470269799232483,
-0.02832777425646782,
0.12379475682973862,
-0.039979711174964905,
0.005754569545388222,
0.0365271121263504,
0.04537516087293625,
0.01730678789317608,
-0.16208386421203613,
0.008220548741519451,
0.021990405395627022,
-0.05367446318268776,
-0.036784470081329346,
-0.0005722372443415225,
0.027362411841750145,
0.09240055084228516,
0.03155839815735817,
-0.01327238604426384,
0.0027766458224505186,
-0.011700754053890705,
-0.061605364084243774,
0.18548622727394104,
-0.0989380031824112,
-0.08540133386850357,
-0.07529138773679733,
0.005323108285665512,
-0.06064022704958916,
-0.03635239228606224,
0.0164004098623991,
-0.08791891485452652,
-0.03941773623228073,
-0.08770519495010376,
-0.0169553030282259,
-0.017857786267995834,
0.02094818837940693,
0.03054238110780716,
-0.023025857284665108,
0.08106573671102524,
-0.1390978842973709,
0.0013165463460609317,
-0.051638174802064896,
-0.09147122502326965,
-0.0003421034198254347,
0.07510428130626678,
0.09851600229740143,
0.07954733073711395,
-0.017133019864559174,
0.029570283368229866,
-0.03446783125400543,
0.24221085011959076,
-0.04564368724822998,
0.01010632049292326,
0.10386435687541962,
-0.013174955733120441,
0.056505125015974045,
0.09469945728778839,
0.03795049339532852,
-0.0937889963388443,
0.021135183051228523,
0.08351055532693863,
-0.02918616682291031,
-0.22856773436069489,
-0.02610323205590248,
-0.004967757500708103,
-0.07888385653495789,
0.10605825483798981,
0.03187987580895424,
-0.03771362453699112,
0.044901832938194275,
0.020819298923015594,
0.0017469384474679828,
-0.05665234848856926,
0.08174960315227509,
0.07649021595716476,
0.05775488540530205,
0.10012821108102798,
-0.009036052972078323,
-0.028102613985538483,
0.06229547783732414,
0.0072301235049963,
0.24674925208091736,
-0.0258337389677763,
0.10106561332941055,
0.033159978687763214,
0.15194815397262573,
-0.026617668569087982,
0.0638866126537323,
0.00330393947660923,
-0.009477194398641586,
-0.01463441550731659,
-0.06732112914323807,
-0.02556636743247509,
0.022630715742707253,
-0.04661936312913895,
0.02978852391242981,
-0.08183468133211136,
0.025427838787436485,
0.028379788622260094,
0.2803414463996887,
0.03360859304666519,
-0.27414050698280334,
-0.06676716357469559,
-0.013458723202347755,
-0.041563574224710464,
-0.06368754804134369,
0.006657265592366457,
0.12031333148479462,
-0.13240040838718414,
0.0650133416056633,
-0.07668276876211166,
0.08972678333520889,
-0.03896161541342735,
0.011395144276320934,
0.04775863140821457,
0.15429410338401794,
-0.018225887790322304,
0.050095926970243454,
-0.18584662675857544,
0.24299828708171844,
0.024889836087822914,
0.10739313066005707,
-0.06441192328929901,
0.01012064516544342,
0.01909683831036091,
0.0077144913375377655,
0.10813004523515701,
0.0008396422490477562,
-0.0677558183670044,
-0.13769115507602692,
-0.09880927205085754,
0.04711146280169487,
0.14204753935337067,
-0.03465932607650757,
0.09938998520374298,
-0.028506344184279442,
0.012854971922934055,
0.034342117607593536,
-0.029747476801276207,
-0.15644192695617676,
-0.07370047271251678,
0.010292078368365765,
0.02840116247534752,
-0.0160392876714468,
-0.05176925286650658,
-0.10411761701107025,
-0.03911152109503746,
0.11877036839723587,
0.002355532720685005,
-0.045531265437603,
-0.15113465487957,
0.08441533893346786,
0.14596383273601532,
-0.05834994837641716,
0.015439353883266449,
0.013321048580110073,
0.11135545372962952,
0.03315645083785057,
-0.0863414853811264,
0.06701590120792389,
-0.05333093926310539,
-0.173558309674263,
-0.05815497040748596,
0.11808431148529053,
0.07915609329938889,
0.04514605924487114,
-0.0007037419709376991,
0.05760998651385307,
0.0013315660180523992,
-0.0971798226237297,
0.03585951030254364,
0.00404932489618659,
0.052369374781847,
0.028936617076396942,
-0.08495042473077774,
0.07730241864919662,
-0.03391889110207558,
0.01818329654633999,
0.12895256280899048,
0.23388099670410156,
-0.09938021749258041,
0.10198486596345901,
0.08079949766397476,
-0.07567271590232849,
-0.15976712107658386,
0.06162923946976662,
0.12514972686767578,
0.004637544974684715,
0.08335801213979721,
-0.20049430429935455,
0.13443876802921295,
0.10715766996145248,
-0.01344896201044321,
0.021075060591101646,
-0.27240434288978577,
-0.1323363482952118,
0.06503722816705704,
0.10968221724033356,
0.049332812428474426,
-0.12218505144119263,
-0.03502803295850754,
-0.010763213038444519,
-0.11976095288991928,
0.12953414022922516,
-0.0765363797545433,
0.11770856380462646,
-0.021506547927856445,
0.12327458709478378,
0.024254973977804184,
-0.03726937994360924,
0.11363200098276138,
0.07077904045581818,
0.086062490940094,
-0.03940654918551445,
-0.002171430503949523,
0.0639253631234169,
-0.06249762326478958,
0.03584035858511925,
-0.03876660019159317,
0.06328879296779633,
-0.14738571643829346,
0.007055866066366434,
-0.07830452919006348,
0.06027271971106529,
-0.04653439670801163,
-0.06528974324464798,
-0.027989530935883522,
0.04732085019350052,
0.07310070842504501,
-0.03591594472527504,
0.04697731137275696,
0.007893562316894531,
0.09083157777786255,
0.10059289634227753,
0.07223651558160782,
-0.023492727428674698,
-0.08305266499519348,
0.014440168626606464,
0.004848156590014696,
0.04728994518518448,
-0.08599357306957245,
0.015871170908212662,
0.14663992822170258,
0.06014043837785721,
0.10192269086837769,
0.04579981043934822,
-0.043358128517866135,
0.0056160567328333855,
0.016941817477345467,
-0.1434137374162674,
-0.09946787357330322,
0.028383223339915276,
-0.05777815356850624,
-0.1541193276643753,
0.03354678303003311,
0.1237916424870491,
-0.0372144877910614,
-0.01564883440732956,
-0.0062837242148816586,
0.008654579520225525,
-0.0118383364751935,
0.18452371656894684,
0.04238579049706459,
0.05412868410348892,
-0.09128617495298386,
0.11494291573762894,
0.03585773706436157,
-0.04207681491971016,
0.05440526828169823,
0.06788541376590729,
-0.09930112957954407,
0.013388757593929768,
0.07180032879114151,
0.1494225114583969,
-0.06774168461561203,
-0.012920006178319454,
-0.09195073693990707,
-0.07697242498397827,
0.0450211800634861,
0.14570482075214386,
0.05291964113712311,
-0.004328790586441755,
-0.06129082292318344,
0.03556672856211662,
-0.1184500977396965,
0.06841026991605759,
0.052245792001485825,
0.08247345685958862,
-0.10767510533332825,
0.12441670894622803,
-0.0070014833472669125,
0.023229217156767845,
-0.027914728969335556,
0.01811448484659195,
-0.10114046186208725,
-0.034132473170757294,
-0.10936085879802704,
-0.014101107604801655,
-0.0179811492562294,
-0.003405241994187236,
-0.019328942522406578,
-0.07537710666656494,
-0.04333088546991348,
0.032673247158527374,
-0.07676634937524796,
-0.04874055087566376,
0.01759370230138302,
0.04033644497394562,
-0.161634162068367,
0.0033719621133059263,
0.02627057582139969,
-0.08760760724544525,
0.08793280273675919,
0.06913357228040695,
0.015331964939832687,
0.027901049703359604,
-0.12361886352300644,
-0.03322723135352135,
0.00007784867193549871,
0.010509694926440716,
0.0778270959854126,
-0.09312193840742111,
-0.029520243406295776,
-0.030970927327871323,
0.048861656337976456,
0.015173155814409256,
0.10296645015478134,
-0.11952747404575348,
-0.012811923399567604,
-0.045651525259017944,
-0.03799217566847801,
-0.05677371472120285,
0.026664162054657936,
0.11446377635002136,
0.044773850589990616,
0.15755920112133026,
-0.07077232003211975,
0.05436231568455696,
-0.20409560203552246,
-0.03263097256422043,
0.011464751325547695,
-0.048082396388053894,
-0.07468501478433609,
-0.04489348456263542,
0.08429531008005142,
-0.050549283623695374,
0.12120117247104645,
-0.01539467554539442,
0.09334331005811691,
0.04449374973773956,
-0.00538012245669961,
-0.07073508948087692,
-0.011659846641123295,
0.1838238388299942,
0.0575973317027092,
-0.021103695034980774,
0.12073617428541183,
0.0036367152351886034,
0.04236302897334099,
0.06854391098022461,
0.23504148423671722,
0.1525648832321167,
-0.011185149662196636,
0.07512494176626205,
0.06625320017337799,
-0.07502296566963196,
-0.1411105841398239,
0.12251528352499008,
-0.02083871327340603,
0.1076052337884903,
-0.05267217010259628,
0.18891490995883942,
0.03809245675802231,
-0.17612870037555695,
0.05425943061709404,
-0.025205284357070923,
-0.10747132450342178,
-0.1255379319190979,
-0.01640038564801216,
-0.08177769929170609,
-0.11686872690916061,
0.0276299100369215,
-0.1236639991402626,
0.06846257299184799,
0.09587464481592178,
0.0075209964998066425,
0.03549277409911156,
0.1844862401485443,
-0.057530585676431656,
0.010699351318180561,
0.07291330397129059,
0.02101086638867855,
-0.004338522907346487,
-0.03956150263547897,
-0.06730540096759796,
0.036793604493141174,
0.04371540993452072,
0.07124564796686172,
-0.05038674920797348,
0.009724529460072517,
0.015893852338194847,
-0.010737502947449684,
-0.07826881110668182,
0.008397374302148819,
0.014427460730075836,
0.04934478551149368,
0.0361991822719574,
0.04686184599995613,
0.00844583660364151,
-0.05337362363934517,
0.2754245102405548,
-0.06788431107997894,
-0.06215447559952736,
-0.12349648028612137,
0.19493848085403442,
0.03347006440162659,
-0.018770689144730568,
0.05531565099954605,
-0.09274880588054657,
-0.013693829998373985,
0.16264070570468903,
0.13463044166564941,
-0.09186170250177383,
-0.021341094747185707,
-0.02413424290716648,
-0.008668740279972553,
-0.0128675801679492,
0.1047145426273346,
0.07141992449760437,
0.0017879381775856018,
-0.06638190150260925,
-0.014821968972682953,
-0.029953718185424805,
-0.04702772572636604,
-0.06232953071594238,
0.05888298526406288,
0.027089012786746025,
-0.00659289862960577,
-0.05937400460243225,
0.06262335181236267,
-0.005336763337254524,
-0.2352760285139084,
0.03848723694682121,
-0.1736777424812317,
-0.17366141080856323,
-0.013785767368972301,
0.07042320817708969,
0.0010535240871831775,
0.056278567761182785,
-0.006347947288304567,
0.00967682059854269,
0.11573819816112518,
-0.016982341185212135,
-0.013354153372347355,
-0.11744824796915054,
0.10892359167337418,
-0.10738824307918549,
0.2125307023525238,
-0.0017616382101550698,
0.06431277096271515,
0.0992649719119072,
0.038495346903800964,
-0.13487063348293304,
0.018811194226145744,
0.06244443356990814,
-0.12676866352558136,
0.001167607493698597,
0.14503057301044464,
-0.03439433127641678,
0.0639030933380127,
0.03177572414278984,
-0.1491631418466568,
-0.0028964949306100607,
0.02707473374903202,
-0.03769983723759651,
-0.06891845911741257,
-0.010748123750090599,
-0.0563078299164772,
0.16580529510974884,
0.207442045211792,
-0.028521908447146416,
0.011934369802474976,
-0.08427678048610687,
0.021990850567817688,
0.04802582040429115,
0.05865678936243057,
-0.0395643413066864,
-0.21645620465278625,
0.022045012563467026,
0.0731668621301651,
-0.0031599905341863632,
-0.19589368999004364,
-0.0964261144399643,
0.043764546513557434,
-0.035937342792749405,
-0.046071890741586685,
0.09201732277870178,
0.02408197522163391,
0.03737553954124451,
-0.019556580111384392,
-0.11547688394784927,
-0.02758307382464409,
0.14589226245880127,
-0.17547310888767242,
-0.04307953268289566
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
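The card name suggests k = 256 SQuAD training examples drawn with seed 10 (the `seed: 42` below presumably governs training, not data selection), though the exact sampling procedure is not documented here. A hedged sketch of one way to build such a subset:
```python
from datasets import load_dataset

# Assumed few-shot sampling: shuffle SQuAD's train split with the card's
# data seed and keep the first k examples. Not necessarily the author's procedure.
k, seed = 256, 10
train = load_dataset("squad", split="train")
few_shot_train = train.shuffle(seed=seed).select(range(k))
print(len(few_shot_train))  # 256
```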
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09669841080904007,
0.11533159762620926,
-0.002251231577247381,
0.09116145223379135,
0.11923490464687347,
0.021985342726111412,
0.10081834346055984,
0.12725302577018738,
-0.09720584005117416,
0.08598307520151138,
0.08718890696763992,
0.03950558975338936,
0.04744080826640129,
0.14611153304576874,
-0.018893888220191002,
-0.26069456338882446,
0.010685287415981293,
-0.002758777467533946,
-0.03318703547120094,
0.11177188158035278,
0.08526025712490082,
-0.110845647752285,
0.08565830439329147,
0.014812915585935116,
-0.15366150438785553,
0.019556967541575432,
-0.03765407204627991,
-0.0351509191095829,
0.11340422183275223,
-0.033784475177526474,
0.10821575671434402,
0.024963608011603355,
0.1350688636302948,
-0.20932237803936005,
0.004898321349173784,
0.07349273562431335,
0.04620455205440521,
0.10029500722885132,
0.05043642595410347,
0.015942750498652458,
0.08990100771188736,
-0.15444232523441315,
0.09351519495248795,
0.028763938695192337,
-0.0905960202217102,
-0.1285071074962616,
-0.09526321291923523,
0.025073951110243797,
0.05335277318954468,
0.06883440911769867,
0.0010304622119292617,
0.15233559906482697,
-0.060603849589824677,
0.07880549877882004,
0.26622337102890015,
-0.32689449191093445,
-0.0640099048614502,
0.0335191935300827,
0.06095384806394577,
0.052787695080041885,
-0.12323160469532013,
-0.0070290472358465195,
0.027283066883683205,
0.029329929500818253,
0.11969459801912308,
-0.01723712496459484,
-0.11293613165616989,
-0.01325624343007803,
-0.12868906557559967,
-0.00007450410339515656,
0.07123330235481262,
0.036250654608011246,
-0.05201186612248421,
-0.09567389637231827,
-0.07498512417078018,
-0.09336564689874649,
-0.026310214772820473,
-0.06499063968658447,
0.05649503692984581,
-0.055774420499801636,
-0.0817563608288765,
-0.036094240844249725,
-0.056105148047208786,
-0.07636009156703949,
-0.017521405592560768,
0.15697826445102692,
0.04034600406885147,
0.0195122379809618,
-0.03209516778588295,
0.10821744054555893,
0.0011347410036250949,
-0.14160767197608948,
-0.015185751020908356,
-0.0014440696686506271,
-0.09803669899702072,
-0.04759015515446663,
-0.05144638195633888,
-0.017577366903424263,
0.009430005215108395,
0.17732574045658112,
-0.07840941846370697,
0.07584518194198608,
0.011571800336241722,
-0.030110733583569527,
-0.006484395358711481,
0.1486850082874298,
-0.044650848954916,
-0.0481235608458519,
-0.010815140791237354,
0.07444056123495102,
0.0026937518268823624,
-0.0139043303206563,
-0.06550663709640503,
-0.027577871456742287,
0.10243237018585205,
0.045070432126522064,
-0.06075406074523926,
0.040675126016139984,
-0.022093946114182472,
-0.02837168239057064,
0.018236365169286728,
-0.11556608974933624,
0.04437461122870445,
-0.002653065836057067,
-0.0858069360256195,
-0.0018466071924194694,
-0.0005725768860429525,
-0.00466127460822463,
-0.008247816935181618,
0.11021658778190613,
-0.0996052622795105,
-0.0023938724771142006,
-0.06441977620124817,
-0.08384156227111816,
0.008989667519927025,
-0.15894770622253418,
-0.015044253319501877,
-0.05702901631593704,
-0.17244116961956024,
-0.032087743282318115,
0.03632042557001114,
-0.07309482991695404,
-0.008868960663676262,
-0.04950414597988129,
-0.0649966299533844,
0.0233762189745903,
-0.014003552496433258,
0.17393726110458374,
-0.053303495049476624,
0.07169300317764282,
-0.000861886132042855,
0.046458564698696136,
0.014880036935210228,
0.0363144613802433,
-0.10516271740198135,
0.024675309658050537,
-0.1371757537126541,
0.06945127993822098,
-0.08484966307878494,
-0.0009840027196332812,
-0.13295263051986694,
-0.09861098974943161,
0.008359520696103573,
-0.021699853241443634,
0.09095191955566406,
0.1388053596019745,
-0.19425822794437408,
-0.017947359010577202,
0.12831473350524902,
-0.07416080683469772,
-0.06338850408792496,
0.061513595283031464,
-0.061333391815423965,
0.030628331005573273,
0.053458064794540405,
0.21062152087688446,
0.0415598526597023,
-0.16575312614440918,
-0.033473901450634,
-0.005746172275394201,
0.04104793816804886,
0.02514670230448246,
0.03980964049696922,
0.005403999704867601,
0.06468737125396729,
0.014775482006371021,
-0.07536295801401138,
-0.032240983098745346,
-0.09099031239748001,
-0.06518195569515228,
-0.05451972782611847,
-0.07285435497760773,
0.041511841118335724,
0.003957902546972036,
0.04255083575844765,
-0.06503818184137344,
-0.1018630638718605,
0.12059646099805832,
0.09710872173309326,
-0.0483643002808094,
0.03599204123020172,
-0.07917007803916931,
0.01910819113254547,
-0.020696232095360756,
-0.03900158777832985,
-0.20640967786312103,
-0.12858159840106964,
0.05257817357778549,
-0.057448018342256546,
0.034175824373960495,
0.007065560203045607,
0.08156612515449524,
0.061220794916152954,
-0.04320898279547691,
-0.011995362117886543,
-0.09347556531429291,
0.0029869158752262592,
-0.11820200830698013,
-0.18748511373996735,
-0.0783141553401947,
-0.04028460010886192,
0.0945049524307251,
-0.17454051971435547,
-0.0064448523335158825,
0.014049708843231201,
0.14370514452457428,
0.026809267699718475,
-0.06824631989002228,
-0.0024407634045928717,
0.03758511692285538,
0.002844126196578145,
-0.09582221508026123,
0.04442765936255455,
0.0075119659304618835,
-0.09339527040719986,
-0.06406999379396439,
-0.13635759055614471,
-0.012048294767737389,
0.060063302516937256,
0.05371683090925217,
-0.09652519226074219,
-0.04562682658433914,
-0.07060371339321136,
-0.040070075541734695,
-0.07502157241106033,
0.012935187667608261,
0.20102162659168243,
0.034721262753009796,
0.11262022703886032,
-0.06743519008159637,
-0.07814823091030121,
-0.003329294500872493,
0.023474346846342087,
0.012953517027199268,
0.07664026319980621,
0.04076702147722244,
-0.052980903536081314,
0.07376685738563538,
0.09991972148418427,
-0.022998543456196785,
0.12390901148319244,
-0.04708727449178696,
-0.08448439091444016,
-0.03317173570394516,
-0.025099167600274086,
-0.029427355155348778,
0.12349331378936768,
-0.04033350944519043,
0.004066324792802334,
0.036087244749069214,
0.04409429430961609,
0.017271365970373154,
-0.16244423389434814,
0.008509484119713306,
0.022104214876890182,
-0.05279253423213959,
-0.03814184293150902,
-0.0015477872220799327,
0.026194164529442787,
0.09175115823745728,
0.030890189111232758,
-0.013936296105384827,
0.0021701885852962732,
-0.011450686492025852,
-0.06118715927004814,
0.18551994860172272,
-0.09779444336891174,
-0.08414151519536972,
-0.07467015832662582,
0.006107484456151724,
-0.058693770319223404,
-0.03641046956181526,
0.015153273940086365,
-0.08797957748174667,
-0.0388469472527504,
-0.08727256208658218,
-0.017866414040327072,
-0.017757564783096313,
0.020235855132341385,
0.032071445137262344,
-0.02235274761915207,
0.07929891347885132,
-0.13923333585262299,
0.0019805487245321274,
-0.052151091396808624,
-0.0912226065993309,
-0.0009752886835485697,
0.07407025992870331,
0.0987405776977539,
0.08008086681365967,
-0.01775953732430935,
0.029712125658988953,
-0.035094160586595535,
0.2415686994791031,
-0.046160850673913956,
0.011559193022549152,
0.10358238965272903,
-0.01283157616853714,
0.0561036616563797,
0.09545158594846725,
0.03720618784427643,
-0.09350951761007309,
0.020836489275097847,
0.08264507353305817,
-0.028670022264122963,
-0.22916099429130554,
-0.025493241846561432,
-0.004222550429403782,
-0.08025122433900833,
0.10673942416906357,
0.03135042265057564,
-0.035670824348926544,
0.046627793461084366,
0.020626308396458626,
0.003177931997925043,
-0.055370405316352844,
0.08130203932523727,
0.0741724818944931,
0.05686376616358757,
0.10041219741106033,
-0.009288454428315163,
-0.02924988605082035,
0.06078679859638214,
0.007972716353833675,
0.24768458306789398,
-0.024779673665761948,
0.10084395855665207,
0.032116521149873734,
0.1513456553220749,
-0.027285084128379822,
0.0662362352013588,
0.004234724678099155,
-0.0098722530528903,
-0.014626022428274155,
-0.06701873987913132,
-0.02433282509446144,
0.023132851347327232,
-0.04593406245112419,
0.02961409091949463,
-0.08120688796043396,
0.025162845849990845,
0.027657076716423035,
0.2790899872779846,
0.035113994032144547,
-0.27521026134490967,
-0.06614083796739578,
-0.013102948665618896,
-0.04255267605185509,
-0.06357361376285553,
0.006323805544525385,
0.11932189017534256,
-0.13229745626449585,
0.06572185456752777,
-0.07667162269353867,
0.08993156999349594,
-0.038278959691524506,
0.011493830941617489,
0.0464840903878212,
0.15410186350345612,
-0.01805845834314823,
0.05122317001223564,
-0.18603704869747162,
0.24183964729309082,
0.02487366646528244,
0.10872343927621841,
-0.06551632285118103,
0.010267789475619793,
0.019116055220365524,
0.007300373632460833,
0.10937614738941193,
0.0005092076025903225,
-0.06790038198232651,
-0.1382298767566681,
-0.099036805331707,
0.0470968559384346,
0.1421215683221817,
-0.03426075354218483,
0.09992675483226776,
-0.027650298550724983,
0.011911531910300255,
0.034613292664289474,
-0.031220218166708946,
-0.1577664613723755,
-0.07351381331682205,
0.009658711031079292,
0.02730068564414978,
-0.016122879460453987,
-0.05134379118680954,
-0.10413993895053864,
-0.041145775467157364,
0.11734296381473541,
0.003567255800589919,
-0.045695960521698,
-0.1509808599948883,
0.08603576570749283,
0.14629586040973663,
-0.05787047743797302,
0.016039565205574036,
0.014717723242938519,
0.11218909174203873,
0.03264473006129265,
-0.08588975667953491,
0.06618791818618774,
-0.05338962376117706,
-0.17227672040462494,
-0.057263027876615524,
0.11930421739816666,
0.07962874323129654,
0.045438725501298904,
0.0007834644056856632,
0.0569068007171154,
0.0013434671564027667,
-0.09685372561216354,
0.03530476987361908,
0.003839226672425866,
0.05181558430194855,
0.029174549505114555,
-0.08595453947782516,
0.07588077336549759,
-0.034465037286281586,
0.02016596868634224,
0.12916883826255798,
0.23059000074863434,
-0.099204421043396,
0.10121997445821762,
0.08062923699617386,
-0.0761396512389183,
-0.15946952998638153,
0.062486033886671066,
0.12523119151592255,
0.005459244828671217,
0.08366450667381287,
-0.20015206933021545,
0.13461454212665558,
0.1065705195069313,
-0.012857695110142231,
0.019755277782678604,
-0.2723557651042938,
-0.13183072209358215,
0.06625775247812271,
0.10984892398118973,
0.0470799095928669,
-0.12135954946279526,
-0.03482525050640106,
-0.011340848170220852,
-0.11966534703969955,
0.12894675135612488,
-0.07760906219482422,
0.11682897061109543,
-0.020838988944888115,
0.12225039303302765,
0.024241622537374496,
-0.03736225143074989,
0.11200416088104248,
0.07251911610364914,
0.08633673191070557,
-0.039116792380809784,
-0.0031105023808777332,
0.06639797985553741,
-0.06219511106610298,
0.03761895373463631,
-0.03778401017189026,
0.06299127638339996,
-0.14722032845020294,
0.00639024144038558,
-0.07774806767702103,
0.060050833970308304,
-0.04620744287967682,
-0.06552772969007492,
-0.02747960388660431,
0.04733762890100479,
0.07270240038633347,
-0.03599545359611511,
0.04474422708153725,
0.009179173037409782,
0.09005429595708847,
0.09671573340892792,
0.07349499315023422,
-0.02242685854434967,
-0.08216959983110428,
0.014361361041665077,
0.005061141215264797,
0.04673226177692413,
-0.08666463941335678,
0.015258860774338245,
0.14664167165756226,
0.060566604137420654,
0.10218348354101181,
0.045233193784952164,
-0.04346563667058945,
0.005090699531137943,
0.016277531161904335,
-0.1409481316804886,
-0.10082117468118668,
0.02821355126798153,
-0.05878845602273941,
-0.15473341941833496,
0.0347774364054203,
0.12180338054895401,
-0.038046590983867645,
-0.0165341105312109,
-0.007402951363474131,
0.008197321556508541,
-0.01144929975271225,
0.1858631819486618,
0.04318748787045479,
0.05426802858710289,
-0.09174340218305588,
0.11448550969362259,
0.03613167628645897,
-0.0423305369913578,
0.0541674867272377,
0.06735872477293015,
-0.10004593431949615,
0.013041576370596886,
0.07301697880029678,
0.14986488223075867,
-0.06465660035610199,
-0.013273338787257671,
-0.09173498302698135,
-0.07698380947113037,
0.04533626139163971,
0.14514727890491486,
0.05314221978187561,
-0.005628915503621101,
-0.06092352047562599,
0.035963594913482666,
-0.11899387836456299,
0.06840335577726364,
0.05234803259372711,
0.08235989511013031,
-0.10797636210918427,
0.12469004094600677,
-0.006900152191519737,
0.023768434301018715,
-0.027895236387848854,
0.018939638510346413,
-0.10029127448797226,
-0.034514009952545166,
-0.10794202238321304,
-0.015510962344706059,
-0.019326673820614815,
-0.0032573656644672155,
-0.019973162561655045,
-0.07483911514282227,
-0.04303205385804176,
0.03230993449687958,
-0.0765102282166481,
-0.048832960426807404,
0.017178760841488838,
0.03953937813639641,
-0.1606234759092331,
0.003371753729879856,
0.025538714602589607,
-0.08709053695201874,
0.0872424840927124,
0.06843337416648865,
0.016109587624669075,
0.02854737639427185,
-0.12292619794607162,
-0.032679855823516846,
0.0010007634991779923,
0.010911449790000916,
0.07741885632276535,
-0.0928124263882637,
-0.029011065140366554,
-0.03074297308921814,
0.04995814710855484,
0.014328701421618462,
0.10017815232276917,
-0.11853066831827164,
-0.01376751996576786,
-0.04684478044509888,
-0.037385404109954834,
-0.05724087357521057,
0.0273281242698431,
0.1139398068189621,
0.043309807777404785,
0.15811853110790253,
-0.06902655959129333,
0.05425375699996948,
-0.20486171543598175,
-0.033180274069309235,
0.010727074928581715,
-0.04805714264512062,
-0.07446654886007309,
-0.04566967487335205,
0.0844576358795166,
-0.05008873715996742,
0.12220728397369385,
-0.015358914621174335,
0.09431729465723038,
0.04357049986720085,
-0.0022676275111734867,
-0.07027384638786316,
-0.010970305651426315,
0.18431591987609863,
0.05784246698021889,
-0.021647362038493156,
0.11991315335035324,
0.0042063575237989426,
0.04274272918701172,
0.06710010766983032,
0.2324073761701584,
0.15269477665424347,
-0.012185296975076199,
0.0751408264040947,
0.06707395613193512,
-0.07521437853574753,
-0.13991600275039673,
0.12376751005649567,
-0.021532200276851654,
0.10577941685914993,
-0.0524335615336895,
0.19119805097579956,
0.037713441997766495,
-0.17612314224243164,
0.05482516810297966,
-0.02443276159465313,
-0.10836391896009445,
-0.12430764734745026,
-0.018183764070272446,
-0.08206796646118164,
-0.1157182827591896,
0.027912234887480736,
-0.12370570003986359,
0.06675469130277634,
0.09612109512090683,
0.0074836784042418,
0.03477257862687111,
0.18574143946170807,
-0.05810898542404175,
0.01104014553129673,
0.07317093014717102,
0.02057008072733879,
-0.0037798485718667507,
-0.03998516872525215,
-0.06643078476190567,
0.03803344443440437,
0.04290447011590004,
0.07193554192781448,
-0.052706778049468994,
0.009117337875068188,
0.015787430107593536,
-0.009874856099486351,
-0.07766187191009521,
0.008527756668627262,
0.014029478654265404,
0.04926643148064613,
0.03504656255245209,
0.04711848497390747,
0.007885968312621117,
-0.05378691107034683,
0.2740241289138794,
-0.06785692274570465,
-0.06309036165475845,
-0.12383801490068436,
0.19225606322288513,
0.03447681665420532,
-0.018811989575624466,
0.05621263384819031,
-0.09307760000228882,
-0.011435788124799728,
0.16354751586914062,
0.13482172787189484,
-0.08968620747327805,
-0.021915890276432037,
-0.02376505918800831,
-0.009058723226189613,
-0.013660945929586887,
0.10425885021686554,
0.07182127237319946,
-0.0006494717672467232,
-0.06618615984916687,
-0.014142804779112339,
-0.028398891910910606,
-0.0482826791703701,
-0.06219029426574707,
0.05868420749902725,
0.028072088956832886,
-0.006939103826880455,
-0.05749651417136192,
0.0642128735780716,
-0.003131608944386244,
-0.23523718118667603,
0.03715863451361656,
-0.17258229851722717,
-0.17379605770111084,
-0.014433289878070354,
0.0701570138335228,
0.002806309377774596,
0.05616644024848938,
-0.006189517676830292,
0.010016726329922676,
0.11513440310955048,
-0.01636403799057007,
-0.01395172905176878,
-0.1175309345126152,
0.10915186256170273,
-0.10864359140396118,
0.21140478551387787,
-0.0019921879284083843,
0.06532099843025208,
0.09912211447954178,
0.036860983818769455,
-0.134706512093544,
0.019263140857219696,
0.06253762543201447,
-0.12464479357004166,
0.0027455994859337807,
0.1454722285270691,
-0.03416606783866882,
0.06162627041339874,
0.030934354290366173,
-0.14944273233413696,
-0.0025610688608139753,
0.027187857776880264,
-0.037135832011699677,
-0.06981893628835678,
-0.00823221169412136,
-0.055556315928697586,
0.16651703417301178,
0.20722974836826324,
-0.028953861445188522,
0.012472574599087238,
-0.08478192239999771,
0.021370897069573402,
0.04840268939733505,
0.05879675969481468,
-0.03935183957219124,
-0.2161433845758438,
0.02084628865122795,
0.0705503597855568,
-0.0026333718560636044,
-0.1951294243335724,
-0.09543347358703613,
0.04260709509253502,
-0.03734026476740837,
-0.04614235833287239,
0.09121913462877274,
0.025757521390914917,
0.03739919140934944,
-0.01926448382437229,
-0.11577306687831879,
-0.02789262868463993,
0.14588309824466705,
-0.17643220722675323,
-0.04222514480352402
] |
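Each row ends with a 768-dimensional embedding of the `passage:` text. The dump does not state which encoder produced these vectors; as a minimal sketch under that caveat, a mean-pooled BERT-base encoding (hidden size 768, an assumption that at least matches the dimensionality) would yield a vector of the same shape:

```python
# Illustrative only: the dataset does not say which model produced the
# 768-dim embeddings. bert-base-uncased with mean pooling is one assumption
# that matches the dimensionality.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

passage = "passage: TAGS ... # spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-10 ..."
inputs = tokenizer(passage, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)

# Mean-pool over non-padding tokens to get a single 768-dim vector.
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # torch.Size([1, 768])
```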