| sha (null) | last_modified (null) | library_name (stringclasses, 154 values) | text (string, 1–900k chars) | metadata (string, 2–348k chars) | pipeline_tag (stringclasses, 45 values) | id (string, 5–122 chars) | tags (list, 1–1.84k items) | created_at (string, 25 chars) | arxiv (list, 0–201 items) | languages (list, 0–1.83k items) | tags_str (string, 17–9.34k chars) | text_str (string, 0–389k chars) | text_lists (list, 0–722 items) | processed_texts (list, 1–723 items) | tokens_length (list, 1–723 items) | input_texts (list, 1–61 items) | embeddings (list, 768 items) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-base-finetuned-rte
This model is a fine-tuned version of [google/fnet-base](https://huggingface.co/google/fnet-base) on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6978
- Accuracy: 0.6282
The model was fine-tuned to compare [google/fnet-base](https://huggingface.co/google/fnet-base), introduced in [this paper](https://arxiv.org/abs/2105.03824), against [bert-base-cased](https://huggingface.co/bert-base-cased).
## Model description
More information needed
## Intended uses & limitations
More information needed
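As an illustrative sketch only (the example premise/hypothesis pair is an assumption, not part of this card), the checkpoint can be loaded for RTE-style sentence-pair classification with the standard `transformers` auto classes:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# A minimal sketch: RTE-style premise/hypothesis classification.
model_id = "gchhablani/fnet-base-finetuned-rte"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Label names come from the checkpoint's config and may be generic (e.g. LABEL_0/LABEL_1).
print(model.config.id2label[logits.argmax(dim=-1).item()])
```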
## Training and evaluation data
More information needed
## Training procedure
This model was trained using the [run_glue](https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py) script. The following command was used:
```bash
#!/usr/bin/bash
python ../run_glue.py \
  --model_name_or_path google/fnet-base \
  --task_name rte \
  --do_train \
  --do_eval \
  --max_seq_length 512 \
  --per_device_train_batch_size 16 \
  --learning_rate 2e-5 \
  --num_train_epochs 3 \
  --output_dir fnet-base-finetuned-rte \
  --push_to_hub \
  --hub_strategy all_checkpoints \
  --logging_strategy epoch \
  --save_strategy epoch \
  --evaluation_strategy epoch
```
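For reference, the CLI flags above correspond roughly to the following `TrainingArguments` sketch (an approximation of the configuration, not the script that produced this checkpoint; `--max_seq_length` is a `run_glue.py` data argument rather than a `TrainingArguments` field):
```python
from transformers import TrainingArguments

# Rough Python-API equivalent of the run_glue.py flags above (a sketch only).
training_args = TrainingArguments(
    output_dir="fnet-base-finetuned-rte",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    num_train_epochs=3,
    seed=42,
    logging_strategy="epoch",
    save_strategy="epoch",
    evaluation_strategy="epoch",
    push_to_hub=True,
    hub_strategy="all_checkpoints",
)
```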
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6829 | 1.0 | 156 | 0.6657 | 0.5704 |
| 0.6174 | 2.0 | 312 | 0.6784 | 0.6101 |
| 0.5141 | 3.0 | 468 | 0.6978 | 0.6282 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer", "fnet-bert-base-comparison"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "fnet-base-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE RTE", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.628158844765343, "name": "Accuracy"}]}]}]}
|
text-classification
|
gchhablani/fnet-base-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"fnet-bert-base-comparison",
"en",
"dataset:glue",
"arxiv:2105.03824",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.03824"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-base-finetuned-rte
=======================
This model is a fine-tuned version of google/fnet-base on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6978
* Accuracy: 0.6282
The model was fine-tuned to compare google/fnet-base as introduced in this paper against bert-base-cased.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
This model is trained using the run\_glue script. The following command was used:
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
88,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.1281575709581375,
0.15705668926239014,
-0.0038979940582066774,
0.11255989223718643,
0.11293601244688034,
-0.008617952466011047,
0.13138744235038757,
0.1563730537891388,
-0.1094919964671135,
0.05590956658124924,
0.1562153697013855,
0.15063650906085968,
0.03187095746397972,
0.18714870512485504,
-0.05448957160115242,
-0.23206175863742828,
0.030974131077528,
0.07522018253803253,
-0.05673399940133095,
0.13729923963546753,
0.09973176568746567,
-0.11717205494642258,
0.09386496245861053,
0.041752949357032776,
-0.20222944021224976,
-0.015307755209505558,
-0.0008101215353235602,
-0.08269140124320984,
0.11289821565151215,
0.02265351265668869,
0.08919014781713486,
0.03242024779319763,
0.03147246316075325,
-0.1466279774904251,
0.007479942869395018,
0.055549394339323044,
0.0026599117554724216,
0.11788346618413925,
0.03937623277306557,
-0.014160647056996822,
0.07863719761371613,
-0.09087078273296356,
0.046092916280031204,
0.026117689907550812,
-0.10794810950756073,
-0.2786095142364502,
-0.08820846676826477,
0.07192650437355042,
0.042675234377384186,
0.0806850865483284,
0.001147752278484404,
0.17639178037643433,
-0.007389090955257416,
0.11422055959701538,
0.2610849142074585,
-0.3091604709625244,
-0.061695314943790436,
0.024227019399404526,
0.009995708242058754,
0.06274503469467163,
-0.08376740664243698,
-0.031031707301735878,
0.04775812104344368,
0.04504694044589996,
0.17843511700630188,
-0.011732942424714565,
-0.007466572802513838,
-0.025801893323659897,
-0.14371579885482788,
-0.07526436448097229,
0.19688530266284943,
0.04901329055428505,
-0.048377830535173416,
-0.06737475097179413,
-0.08691056817770004,
-0.16888527572155,
-0.03213018923997879,
-0.018070582300424576,
0.04411018267273903,
-0.0378425307571888,
-0.060805536806583405,
-0.029689611867070198,
-0.07623965293169022,
-0.042504265904426575,
-0.042007382959127426,
0.15460588037967682,
0.05135969817638397,
0.03275564685463905,
-0.03648478910326958,
0.08528219908475876,
-0.016176627948880196,
-0.15604232251644135,
-0.0033359206281602383,
0.008210553787648678,
0.032338954508304596,
-0.01879490725696087,
-0.03313892334699631,
-0.09054049104452133,
0.012796161696314812,
0.12658193707466125,
-0.09896866977214813,
0.0699896514415741,
0.014115668833255768,
0.05199545621871948,
-0.0832819789648056,
0.18305599689483643,
-0.03047999180853367,
0.014890378341078758,
0.025784319266676903,
0.0863259807229042,
0.045972537249326706,
-0.02756713703274727,
-0.10970760136842728,
0.026695188134908676,
0.13855181634426117,
0.006172229070216417,
-0.03736800700426102,
0.07175908982753754,
-0.04684898629784584,
-0.04453456029295921,
0.056356143206357956,
-0.11108183860778809,
0.012465346604585648,
0.0007881103083491325,
-0.08858437836170197,
-0.03560861945152283,
0.026957271620631218,
-0.012466381303966045,
-0.04220158979296684,
0.05419426038861275,
-0.0943819209933281,
0.009732727892696857,
-0.05831371992826462,
-0.1146949976682663,
0.013262145221233368,
-0.11782565712928772,
0.0005620947922579944,
-0.11043663322925568,
-0.13709686696529388,
-0.008001538924872875,
0.05364624038338661,
-0.021408166736364365,
-0.0810130313038826,
-0.055306028574705124,
-0.09359939396381378,
0.02663268707692623,
-0.015616829507052898,
0.05365283414721489,
-0.06507189571857452,
0.08449108898639679,
0.049848102033138275,
0.07236256450414658,
-0.03752796724438667,
0.04536004737019539,
-0.07848503440618515,
0.0464148111641407,
-0.20472228527069092,
0.07016371190547943,
-0.064207062125206,
0.06198544800281525,
-0.1068328395485878,
-0.11787286400794983,
0.030897412449121475,
-0.03552781045436859,
0.08240031450986862,
0.09914099425077438,
-0.14732807874679565,
-0.08291295915842056,
0.19005797803401947,
-0.08365947753190994,
-0.12725262343883514,
0.12337704747915268,
-0.04553937166929245,
0.0029195500537753105,
0.05737419053912163,
0.2257438600063324,
0.0889558345079422,
-0.0515265017747879,
-0.02331758849322796,
-0.0051168533973395824,
0.04387769103050232,
-0.07846325635910034,
0.08052174746990204,
0.006332707591354847,
0.04367716237902641,
0.02254851721227169,
-0.029947886243462563,
0.04004385322332382,
-0.08367564529180527,
-0.08474674820899963,
-0.05332755297422409,
-0.08056257665157318,
0.056404486298561096,
0.04767722263932228,
0.08231600373983383,
-0.11374399065971375,
-0.094481460750103,
0.02864118292927742,
0.09102724492549896,
-0.08461901545524597,
0.04763771593570709,
-0.09820594638586044,
0.12363352626562119,
-0.06146226450800896,
-0.004213334526866674,
-0.18391123414039612,
-0.009381342679262161,
0.0438222773373127,
-0.023048149421811104,
-0.0010736447293311357,
-0.017531832680106163,
0.0634535625576973,
0.05774131044745445,
-0.039399147033691406,
-0.0408325120806694,
-0.033703941851854324,
-0.009450753219425678,
-0.11244195699691772,
-0.18618963658809662,
-0.04849305376410484,
-0.034880541265010834,
0.11852548271417618,
-0.17111770808696747,
0.06201837584376335,
0.06472091376781464,
0.11162162572145462,
0.02296465076506138,
-0.02818126603960991,
-0.0076725417748093605,
0.041496098041534424,
-0.039293959736824036,
-0.0763016790151596,
0.06855007261037827,
0.03613912686705589,
-0.0994340106844902,
-0.021636441349983215,
-0.11005594581365585,
0.18589626252651215,
0.12711931765079498,
-0.015304340980947018,
-0.046145275235176086,
-0.0037261333782225847,
-0.06693658232688904,
-0.0252322256565094,
0.00010479353659320623,
0.02630312740802765,
0.18688923120498657,
0.007668614853173494,
0.17470361292362213,
-0.10448402166366577,
-0.054649028927087784,
0.04829766973853111,
-0.027804838493466377,
-0.012250205501914024,
0.11106427013874054,
0.000900426646694541,
-0.11244355887174606,
0.14791768789291382,
0.12212676554918289,
-0.06123341992497444,
0.1253422647714615,
-0.06269760429859161,
-0.042313165962696075,
-0.03154568001627922,
0.0070209321565926075,
0.016009759157896042,
0.09681913256645203,
-0.12338456511497498,
-0.019060473889112473,
0.03688754513859749,
0.026065392419695854,
0.022046322003006935,
-0.18368276953697205,
0.0016608438454568386,
0.046017080545425415,
-0.06232096627354622,
-0.0001330666127614677,
-0.01134573481976986,
-0.0014391003642231226,
0.09911268204450607,
0.018035314977169037,
-0.08723576366901398,
0.04871227592229843,
0.008767826482653618,
-0.07373929023742676,
0.20202793180942535,
-0.10205624997615814,
-0.19618737697601318,
-0.11792904138565063,
-0.0520332008600235,
-0.09464605152606964,
-0.00033909876947291195,
0.06213480234146118,
-0.07916445285081863,
-0.022073807194828987,
-0.09520765393972397,
-0.03123565763235092,
-0.01737889274954796,
0.03542753681540489,
0.07313533127307892,
-0.02203269489109516,
0.10872673243284225,
-0.1269455999135971,
-0.02306712418794632,
-0.0358131118118763,
0.008673591539263725,
0.055851276963949203,
0.009536282159388065,
0.09724908322095871,
0.1170153021812439,
-0.033818624913692474,
0.05197916179895401,
-0.028282886371016502,
0.24284608662128448,
-0.053264424204826355,
-0.02976847253739834,
0.12733469903469086,
-0.0040460084564983845,
0.08623170107603073,
0.08496475964784622,
0.04557865113019943,
-0.08525460958480835,
-0.01157840620726347,
0.00595470005646348,
-0.03817666694521904,
-0.21714363992214203,
-0.036886055022478104,
-0.04140350595116615,
0.02079310268163681,
0.10649195313453674,
0.04592718183994293,
0.043558333069086075,
0.057499635964632034,
0.023834481835365295,
0.04960625246167183,
-0.02408022992312908,
0.09037080407142639,
0.1317465603351593,
0.04990197345614433,
0.13514135777950287,
-0.04064502939581871,
-0.03152943775057793,
0.039630502462387085,
-0.002475862158462405,
0.19950714707374573,
-0.02927609719336033,
0.1829756200313568,
0.04599745571613312,
0.1883370578289032,
0.011177031323313713,
0.06271716207265854,
-0.024891022592782974,
-0.006736312061548233,
-0.009674341417849064,
-0.0412435419857502,
-0.05800659582018852,
0.0024435357190668583,
-0.04474163055419922,
0.07293359190225601,
-0.117747962474823,
0.040190309286117554,
0.06010587513446808,
0.29027169942855835,
0.022407572716474533,
-0.37493225932121277,
-0.11181744188070297,
-0.016597270965576172,
-0.031712714582681656,
-0.04552937671542168,
0.010863134637475014,
0.11205030232667923,
-0.09440278261899948,
0.06324154883623123,
-0.08447496592998505,
0.09077057987451553,
-0.07360748946666718,
0.03601958230137825,
0.05221237242221832,
0.09796657413244247,
0.006393694784492254,
0.06149907410144806,
-0.26434561610221863,
0.25069257616996765,
0.018371766433119774,
0.05161698907613754,
-0.06146470457315445,
0.015569853596389294,
0.02125987596809864,
0.06923633813858032,
0.07857105880975723,
0.0007500528590753675,
-0.02767954021692276,
-0.16410666704177856,
-0.11515739560127258,
0.016607077792286873,
0.07020550966262817,
-0.02121029421687126,
0.08431950211524963,
-0.0010764591861516237,
0.004067974630743265,
0.04312531650066376,
-0.008349082432687283,
-0.027957666665315628,
-0.09104249626398087,
0.016294151544570923,
0.06238500773906708,
-0.04210955649614334,
-0.08233670145273209,
-0.1198863536119461,
-0.08412427455186844,
0.1865774542093277,
-0.008332050405442715,
-0.0823029950261116,
-0.12076611816883087,
0.06184208020567894,
0.06703545898199081,
-0.09185026586055756,
0.042081426829099655,
-0.017016833648085594,
0.12382151186466217,
0.003344601020216942,
-0.07244402170181274,
0.09798984974622726,
-0.04916444793343544,
-0.16366566717624664,
-0.03181470185518265,
0.13521257042884827,
0.034096188843250275,
0.05754973739385605,
-0.012549868784844875,
0.019313739612698555,
-0.026896705850958824,
-0.0765208750963211,
0.03267578408122063,
0.002446891972795129,
0.09423114359378815,
-0.015256565995514393,
0.0032036362681537867,
0.028266185894608498,
-0.07761038839817047,
0.005427155178040266,
0.19771218299865723,
0.2606421709060669,
-0.1043548434972763,
0.03651893138885498,
0.027820715680718422,
-0.04207947105169296,
-0.1520952433347702,
0.017168138176202774,
0.08070502430200577,
0.007420012727379799,
-0.010656541213393211,
-0.1764611452817917,
0.05012203007936478,
0.08809427171945572,
-0.01788947917521,
0.08023256808519363,
-0.29865652322769165,
-0.11554376780986786,
0.11132355779409409,
0.12564530968666077,
0.09193845093250275,
-0.13787555694580078,
-0.04598912596702576,
-0.0075193652883172035,
-0.12548284232616425,
0.1158299669623375,
-0.0956348180770874,
0.10974547266960144,
-0.04320225492119789,
0.05354843661189079,
0.007337958551943302,
-0.052445266395807266,
0.12752960622310638,
0.01908416859805584,
0.0783768892288208,
-0.0478614941239357,
0.003130370983853936,
0.10798399150371552,
-0.0758182629942894,
0.06092115119099617,
-0.10174112766981125,
0.04936999827623367,
-0.12913434207439423,
-0.01087163481861353,
-0.08215732127428055,
0.02880539558827877,
-0.029542503878474236,
-0.03560418263077736,
-0.058059368282556534,
0.009000681340694427,
0.07482077181339264,
-0.0021140656899660826,
0.17803633213043213,
0.047338198870420456,
0.12804824113845825,
0.2045678347349167,
0.07827051728963852,
-0.11816922575235367,
-0.10215629637241364,
-0.0021673408336937428,
-0.009365938603878021,
0.0566975399851799,
-0.15991105139255524,
0.03947640210390091,
0.1441853940486908,
0.004068439826369286,
0.11697175353765488,
0.06757407635450363,
-0.058885324746370316,
-0.00002974093695229385,
0.04177243635058403,
-0.17993171513080597,
-0.10290152579545975,
-0.014580353163182735,
-0.012066712602972984,
-0.13221070170402527,
0.07946693897247314,
0.11126816272735596,
-0.07109289616346359,
-0.018744582310318947,
0.0011641265591606498,
-0.0008233633125200868,
-0.025426305830478668,
0.18059362471103668,
0.0713382288813591,
0.07033462822437286,
-0.10616692155599594,
0.09460733085870743,
0.0462157279253006,
-0.0725565180182457,
0.04061013460159302,
0.06810374557971954,
-0.11764568090438843,
-0.022825488820672035,
0.037216611206531525,
0.14856918156147003,
-0.027179433032870293,
-0.05499974638223648,
-0.17361967265605927,
-0.1086450144648552,
0.08841122686862946,
0.13100364804267883,
0.10317601263523102,
0.017121080309152603,
-0.04885329306125641,
-0.009807179681956768,
-0.11012768745422363,
0.09655274450778961,
0.05846559628844261,
0.0660039484500885,
-0.16399246454238892,
0.11977937817573547,
-0.0003578494652174413,
0.07169348001480103,
-0.012192845344543457,
0.001448655384592712,
-0.0843949168920517,
0.004430484492331743,
-0.09232339262962341,
0.012558856047689915,
-0.04001414775848389,
0.00048618309665471315,
-0.022317882627248764,
-0.04360825940966606,
-0.0546182319521904,
0.03691510483622551,
-0.10589302331209183,
-0.03589541092514992,
0.030760591849684715,
0.02635170891880989,
-0.11072052270174026,
-0.031555552035570145,
0.0018842692952603102,
-0.08543401211500168,
0.09491559118032455,
0.053661517798900604,
-0.007116627413779497,
0.009966611862182617,
-0.003835842479020357,
0.002103820675984025,
0.0713348463177681,
0.007812848314642906,
0.06204405426979065,
-0.11863148212432861,
-0.004516935441643,
0.004596107639372349,
0.001546198152936995,
0.019018597900867462,
0.12120135128498077,
-0.12007837742567062,
-0.014022485353052616,
-0.022460926324129105,
-0.022654950618743896,
-0.0753025934100151,
0.05987093597650528,
0.10378355532884598,
0.041243866086006165,
0.19044536352157593,
-0.06453124433755875,
0.015417633578181267,
-0.2059982270002365,
-0.0010957218473777175,
0.009144444018602371,
-0.1515446901321411,
-0.05387648567557335,
-0.025656811892986298,
0.05819327011704445,
-0.07224544882774353,
0.11503966152667999,
0.004568712320178747,
-0.01583913527429104,
0.03687077388167381,
-0.044159963726997375,
-0.016699714586138725,
0.008686945773661137,
0.16097450256347656,
0.013965492136776447,
-0.037191856652498245,
0.12147503346204758,
0.023390216752886772,
0.1060599610209465,
0.12157408893108368,
0.1685730367898941,
0.12814003229141235,
0.026453150436282158,
0.10737672448158264,
0.032515060156583786,
-0.030444564297795296,
-0.18784967064857483,
0.06720752269029617,
-0.04578623175621033,
0.13226191699504852,
0.006823073141276836,
0.17648069560527802,
0.11036133766174316,
-0.15301413834095,
0.049124132841825485,
-0.021638398990035057,
-0.09448863565921783,
-0.10050928592681885,
-0.09679119288921356,
-0.07743744552135468,
-0.15149177610874176,
-0.01236912701278925,
-0.11908751726150513,
-0.001917796558700502,
0.06672867387533188,
0.0058901347219944,
-0.021161947399377823,
0.13932976126670837,
0.023673847317695618,
-0.0022931741550564766,
0.0803397074341774,
-0.0035189997870475054,
-0.05596965178847313,
-0.07125981897115707,
-0.07409729808568954,
0.013776594772934914,
0.013752772472798824,
0.05871286615729332,
-0.02784240059554577,
0.009659827686846256,
0.051329080015420914,
-0.02629757858812809,
-0.10809190571308136,
0.009442382492125034,
0.021296720951795578,
0.054612502455711365,
0.020937973633408546,
0.013347321189939976,
-0.011912370100617409,
-0.017500825226306915,
0.17539910972118378,
-0.06457335501909256,
-0.022280963137745857,
-0.10723160207271576,
0.20094625651836395,
0.036107197403907776,
-0.03869583457708359,
0.040720485150814056,
-0.06905907392501831,
-0.020572103559970856,
0.18964605033397675,
0.20552918314933777,
-0.026153136044740677,
0.0006357349921017885,
-0.004643670748919249,
-0.01797187142074108,
-0.007700672838836908,
0.08245248347520828,
0.13317352533340454,
-0.0074555822648108006,
-0.06725732237100601,
-0.026400936767458916,
-0.06664355099201202,
-0.004993689712136984,
-0.048690151423215866,
0.06622577458620071,
0.027980800718069077,
0.012009805999696255,
-0.046264056116342545,
0.022495536133646965,
-0.042431484907865524,
-0.07840681821107864,
0.030966809019446373,
-0.20489253103733063,
-0.15987305343151093,
-0.008346200920641422,
0.03857562318444252,
0.0004496309265960008,
0.062375955283641815,
-0.0009975176071748137,
0.012196631170809269,
0.0908537283539772,
-0.02237413451075554,
-0.08646247535943985,
-0.07932375371456146,
0.09022057801485062,
-0.17671364545822144,
0.2005920559167862,
-0.04314102232456207,
0.049184780567884445,
0.1326240599155426,
0.04266604781150818,
-0.10272050648927689,
0.0379687063395977,
0.03511279076337814,
-0.008898447267711163,
-0.004219701513648033,
0.09855833649635315,
-0.012647976167500019,
0.07372915744781494,
0.03857356309890747,
-0.09686969965696335,
-0.039119016379117966,
-0.07064490765333176,
-0.0207537654787302,
-0.040169861167669296,
-0.052368659526109695,
-0.05137869715690613,
0.11046379059553146,
0.19000554084777832,
-0.049329470843076706,
-0.017489463090896606,
-0.06654652208089828,
0.005614506546407938,
0.07303186506032944,
-0.011176830157637596,
-0.04266827553510666,
-0.253805011510849,
0.010820526629686356,
0.046582285314798355,
-0.017032980918884277,
-0.2275182157754898,
-0.1029951274394989,
-0.004221665672957897,
-0.06776802986860275,
-0.07437815517187119,
0.07514068484306335,
0.04294868931174278,
0.05498082563281059,
-0.06696301698684692,
0.028071772307157516,
-0.09679107367992401,
0.15699414908885956,
-0.15248030424118042,
-0.07556789368391037
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-base-finetuned-sst2
This model is a fine-tuned version of [google/fnet-base](https://huggingface.co/google/fnet-base) on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4674
- Accuracy: 0.8945
The model was fine-tuned to compare [google/fnet-base](https://huggingface.co/google/fnet-base), introduced in [this paper](https://arxiv.org/abs/2105.03824), against [bert-base-cased](https://huggingface.co/bert-base-cased).
## Model description
More information needed
## Intended uses & limitations
More information needed
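As an illustrative sketch only (the example sentence is an assumption, not part of this card), the checkpoint can be used for single-sentence sentiment classification via the `text-classification` pipeline:
```python
from transformers import pipeline

# A minimal sketch: SST-2 style single-sentence classification.
classifier = pipeline("text-classification", model="gchhablani/fnet-base-finetuned-sst2")

# Label names depend on the checkpoint's config.
print(classifier("This movie was a pleasant surprise."))
```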
## Training and evaluation data
More information needed
## Training procedure
This model was trained using the [run_glue](https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py) script. The following command was used:
```bash
#!/usr/bin/bash
python ../run_glue.py \
  --model_name_or_path google/fnet-base \
  --task_name sst2 \
  --do_train \
  --do_eval \
  --max_seq_length 512 \
  --per_device_train_batch_size 16 \
  --learning_rate 2e-5 \
  --num_train_epochs 3 \
  --output_dir fnet-base-finetuned-sst2 \
  --push_to_hub \
  --hub_strategy all_checkpoints \
  --logging_strategy epoch \
  --save_strategy epoch \
  --evaluation_strategy epoch
```
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:-----:|:--------:|:---------------:|
| 0.2956 | 1.0 | 4210 | 0.8819 | 0.3128 |
| 0.1746 | 2.0 | 8420 | 0.8979 | 0.3850 |
| 0.1204 | 3.0 | 12630 | 0.8945 | 0.4674 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer", "fnet-bert-base-comparison"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "fnet-base-finetuned-sst2", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE SST2", "type": "glue", "args": "sst2"}, "metrics": [{"type": "accuracy", "value": 0.8944954128440367, "name": "Accuracy"}]}]}]}
|
text-classification
|
gchhablani/fnet-base-finetuned-sst2
|
[
"transformers",
"pytorch",
"tensorboard",
"rust",
"fnet",
"text-classification",
"generated_from_trainer",
"fnet-bert-base-comparison",
"en",
"dataset:glue",
"arxiv:2105.03824",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.03824"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #rust #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-base-finetuned-sst2
========================
This model is a fine-tuned version of google/fnet-base on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4674
* Accuracy: 0.8945
The model was fine-tuned to compare google/fnet-base as introduced in this paper against bert-base-cased.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
This model is trained using the run\_glue script. The following command was used:
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #rust #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
90,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #rust #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.13063226640224457,
0.15633131563663483,
-0.0037925252690911293,
0.11203409731388092,
0.10461338609457016,
-0.009861338883638382,
0.12661795318126678,
0.16349272429943085,
-0.11580837517976761,
0.06316039711236954,
0.15490630269050598,
0.1429654210805893,
0.04123261198401451,
0.18948093056678772,
-0.05874737352132797,
-0.2199392020702362,
0.02797965705394745,
0.07121839374303818,
-0.054610490798950195,
0.13441020250320435,
0.10405363887548447,
-0.11449993401765823,
0.0932154580950737,
0.035935480147600174,
-0.19936297833919525,
-0.011124072596430779,
-0.0005550032947212458,
-0.08129789680242538,
0.10687530785799026,
0.017396055161952972,
0.08939015120267868,
0.02981581725180149,
0.03562062978744507,
-0.14452964067459106,
0.0056946901604533195,
0.062086403369903564,
0.00092480075545609,
0.1255127191543579,
0.04625246301293373,
-0.007567607332020998,
0.09029750525951385,
-0.1046624407172203,
0.04411865025758743,
0.020654739812016487,
-0.10070418566465378,
-0.2709973156452179,
-0.09050595760345459,
0.08346740156412125,
0.04028705507516861,
0.08496937155723572,
0.00040999046177603304,
0.18604964017868042,
-0.01222275011241436,
0.11255290359258652,
0.26449301838874817,
-0.3073347210884094,
-0.059075888246297836,
0.025794774293899536,
0.0064407833851873875,
0.06442445516586304,
-0.08867183327674866,
-0.04175342991948128,
0.05119956657290459,
0.047084495425224304,
0.17069603502750397,
-0.005674343556165695,
-0.018662497401237488,
-0.01976717822253704,
-0.1417272835969925,
-0.0775144025683403,
0.1959732323884964,
0.047194115817546844,
-0.0452163927257061,
-0.0673254132270813,
-0.08256525546312332,
-0.17563487589359283,
-0.028148649260401726,
-0.011444833129644394,
0.04169340431690216,
-0.039965614676475525,
-0.07059433311223984,
-0.020428480580449104,
-0.08025042712688446,
-0.03841989114880562,
-0.046485334634780884,
0.13423210382461548,
0.055206071585416794,
0.03951743617653847,
-0.0316801555454731,
0.08464034646749496,
-0.018299555405974388,
-0.15670868754386902,
0.00040981906931847334,
0.008804244920611382,
0.03739367052912712,
-0.01736285351216793,
-0.032547298818826675,
-0.08269580453634262,
0.019832247868180275,
0.12673035264015198,
-0.11679986119270325,
0.06346418708562851,
0.015051589347422123,
0.05338345840573311,
-0.0823562741279602,
0.17666569352149963,
-0.0290383230894804,
0.0307563878595829,
0.026505332440137863,
0.08698103576898575,
0.04459889978170395,
-0.02380429208278656,
-0.10421646386384964,
0.028630105778574944,
0.14289671182632446,
0.005146459210664034,
-0.03461004048585892,
0.05977455899119377,
-0.045425333082675934,
-0.04085938632488251,
0.06445733457803726,
-0.11116056144237518,
0.011621400713920593,
0.00044130062451586127,
-0.09109426289796829,
-0.03252977505326271,
0.021522335708141327,
-0.00864247977733612,
-0.05097212642431259,
0.05620585009455681,
-0.09146568924188614,
0.007124002557247877,
-0.06363824754953384,
-0.11520379036664963,
0.01581026241183281,
-0.11959714442491531,
-0.002699497388675809,
-0.1089189276099205,
-0.14425615966320038,
-0.01837061159312725,
0.05058383569121361,
-0.025914940983057022,
-0.0896802768111229,
-0.049672868102788925,
-0.09426629543304443,
0.02425595186650753,
-0.015957778319716454,
0.07887307554483414,
-0.06591194868087769,
0.08837869763374329,
0.0402858704328537,
0.07137258350849152,
-0.03602999821305275,
0.04139500856399536,
-0.0784967914223671,
0.03722416236996651,
-0.18634210526943207,
0.06024167686700821,
-0.0588490329682827,
0.0562281608581543,
-0.1098199114203453,
-0.11847174167633057,
0.02833804301917553,
-0.03466188535094261,
0.08629939705133438,
0.10089240223169327,
-0.15412063896656036,
-0.07549135386943817,
0.18851380050182343,
-0.08410783112049103,
-0.12087847292423248,
0.13110466301441193,
-0.03991724178195,
-0.0005008832085877657,
0.05830791965126991,
0.22815270721912384,
0.07839261740446091,
-0.05640382692217827,
-0.03120187669992447,
-0.004218417685478926,
0.056883230805397034,
-0.08858876675367355,
0.08376684039831161,
0.008970614522695541,
0.05215118080377579,
0.02382316254079342,
-0.020018836483359337,
0.030172089114785194,
-0.08699626475572586,
-0.08231104165315628,
-0.057634178549051285,
-0.0847942903637886,
0.0395156666636467,
0.04896559566259384,
0.085788294672966,
-0.11441800743341446,
-0.09346102923154831,
0.00899489689618349,
0.09607084095478058,
-0.07996120303869247,
0.04754983261227608,
-0.09285339713096619,
0.12577758729457855,
-0.060171764343976974,
-0.008010689169168472,
-0.1890554577112198,
0.002996897790580988,
0.04327888414263725,
-0.006943043787032366,
-0.0016399146988987923,
-0.012384600937366486,
0.06187395751476288,
0.05335979163646698,
-0.04247988015413284,
-0.0369383879005909,
-0.033413805067539215,
-0.015020554885268211,
-0.11645109951496124,
-0.18868239223957062,
-0.03757066652178764,
-0.034698646515607834,
0.11190339177846909,
-0.17388039827346802,
0.05636288970708847,
0.058224327862262726,
0.11816849559545517,
0.027149008587002754,
-0.02961898222565651,
-0.012199647724628448,
0.04524216800928116,
-0.040020156651735306,
-0.07441188395023346,
0.06467930227518082,
0.038477081805467606,
-0.0929902121424675,
-0.022192904725670815,
-0.10324739664793015,
0.18369130790233612,
0.12562769651412964,
-0.0006781669799238443,
-0.051467813551425934,
-0.013368522748351097,
-0.07582661509513855,
-0.025659918785095215,
-0.011854519136250019,
0.029444551095366478,
0.18420574069023132,
0.008648110553622246,
0.1694212257862091,
-0.10795796662569046,
-0.06051081791520119,
0.04598398879170418,
-0.02880842611193657,
-0.008404575288295746,
0.12188856303691864,
0.0023433833848685026,
-0.10419286042451859,
0.15016961097717285,
0.11519934237003326,
-0.06997606158256531,
0.11930801719427109,
-0.07269870489835739,
-0.05238637700676918,
-0.02686835639178753,
0.008327978663146496,
0.018042195588350296,
0.09764431416988373,
-0.09955422580242157,
-0.018946988508105278,
0.04220057651400566,
0.02424871362745762,
0.025213340297341347,
-0.18747174739837646,
0.0014426866546273232,
0.0378492996096611,
-0.06024162098765373,
-0.004153205547481775,
-0.009553560055792332,
-0.0010891371639445424,
0.10583967715501785,
0.012650016695261002,
-0.0829068124294281,
0.039985038340091705,
0.00892771128565073,
-0.07517937570810318,
0.19816654920578003,
-0.10037701576948166,
-0.19453631341457367,
-0.1128833070397377,
-0.039358969777822495,
-0.0855742022395134,
-0.006700264289975166,
0.06084170937538147,
-0.08822425454854965,
-0.024503212422132492,
-0.09890531748533249,
-0.045181065797805786,
-0.012304885312914848,
0.03484485670924187,
0.07599799335002899,
-0.012380246073007584,
0.10611307621002197,
-0.12071480602025986,
-0.024149037897586823,
-0.03593846410512924,
0.011456144042313099,
0.05634208396077156,
0.00892501138150692,
0.08890581130981445,
0.10901306569576263,
-0.028440648689866066,
0.051646888256073,
-0.031503159552812576,
0.23684518039226532,
-0.04647887498140335,
-0.03624913468956947,
0.13397109508514404,
0.0004287174961064011,
0.08576217293739319,
0.0813814178109169,
0.04096377268433571,
-0.07712243497371674,
-0.009418899193406105,
0.0055204154923558235,
-0.03407372161746025,
-0.2271113246679306,
-0.03416678309440613,
-0.03929072245955467,
0.020765453577041626,
0.11208493262529373,
0.047345120459795,
0.03553140535950661,
0.05425295978784561,
0.018916208297014236,
0.05288437753915787,
-0.030659440904855728,
0.08983194082975388,
0.11400288343429565,
0.05064994841814041,
0.13473404943943024,
-0.04515247046947479,
-0.02823680453002453,
0.048377662897109985,
-0.002433949848636985,
0.19940131902694702,
-0.04144316911697388,
0.17221161723136902,
0.043075770139694214,
0.1924562007188797,
0.007562489248812199,
0.06979893893003464,
-0.030760448426008224,
-0.005231582093983889,
-0.008554150350391865,
-0.04285431280732155,
-0.05821991711854935,
0.005189583171159029,
-0.04531104490160942,
0.06922867894172668,
-0.11856632679700851,
0.04238196462392807,
0.05632586404681206,
0.29192957282066345,
0.029188983142375946,
-0.37035730481147766,
-0.11157792806625366,
-0.015739955008029938,
-0.030478358268737793,
-0.04239088296890259,
0.011899220757186413,
0.11510417610406876,
-0.0897291973233223,
0.053587716072797775,
-0.07642748206853867,
0.09408147633075714,
-0.0711061879992485,
0.03527676686644554,
0.04610827937722206,
0.10399871319532394,
0.007797189522534609,
0.06642729043960571,
-0.2723133862018585,
0.2539733946323395,
0.0168149471282959,
0.05572367459535599,
-0.0685204342007637,
0.011051103472709656,
0.013551155105233192,
0.05040520802140236,
0.08040108531713486,
0.000967521860729903,
-0.04329884424805641,
-0.15111668407917023,
-0.12764613330364227,
0.019946599379181862,
0.06591150164604187,
-0.01305406354367733,
0.08375781774520874,
-0.002510834950953722,
0.00672059366479516,
0.040556441992521286,
-0.018693989142775536,
-0.03143439069390297,
-0.09134958684444427,
0.014838696457445621,
0.060506705194711685,
-0.052714549005031586,
-0.08286454528570175,
-0.12293194234371185,
-0.08703682571649551,
0.19172890484333038,
-0.006368591915816069,
-0.07997357845306396,
-0.1183437630534172,
0.07543320208787918,
0.0750763863325119,
-0.09050185233354568,
0.03079194389283657,
-0.013024372048676014,
0.12397533655166626,
0.006440191995352507,
-0.06937482953071594,
0.08778965473175049,
-0.051547929644584656,
-0.16751229763031006,
-0.03421979770064354,
0.1292639672756195,
0.02789887972176075,
0.06132403016090393,
-0.012717606499791145,
0.023007389158010483,
-0.04030200466513634,
-0.072586290538311,
0.03254525735974312,
0.0010567267891019583,
0.11101605743169785,
-0.008316283114254475,
-0.005813507363200188,
0.023243838921189308,
-0.07502855360507965,
-0.012216025032103062,
0.19287173449993134,
0.2598513662815094,
-0.10139179974794388,
0.04272756725549698,
0.027167685329914093,
-0.04014362767338753,
-0.1525976061820984,
0.007239512167870998,
0.08735556900501251,
0.015635890886187553,
-0.014053914695978165,
-0.16497549414634705,
0.04335738345980644,
0.08901513367891312,
-0.017982086166739464,
0.07383543998003006,
-0.2933061420917511,
-0.11565849184989929,
0.10940925776958466,
0.12217571586370468,
0.09801682829856873,
-0.14232610166072845,
-0.0419468879699707,
-0.015194159932434559,
-0.10476759821176529,
0.11601293832063675,
-0.10654283314943314,
0.10379815846681595,
-0.047635726630687714,
0.06407041102647781,
0.00958494283258915,
-0.056945525109767914,
0.12801238894462585,
0.005692858248949051,
0.0746660903096199,
-0.04752299562096596,
0.011951659806072712,
0.09785611927509308,
-0.0782160833477974,
0.0642547458410263,
-0.10825541615486145,
0.057282064110040665,
-0.12708042562007904,
-0.00792262889444828,
-0.0784040242433548,
0.024450013414025307,
-0.030641162768006325,
-0.031962983310222626,
-0.056421078741550446,
0.014327485114336014,
0.0703808069229126,
-0.008604411035776138,
0.17678502202033997,
0.04890294745564461,
0.1412046104669571,
0.20197352766990662,
0.07084232568740845,
-0.10379855334758759,
-0.0964064672589302,
0.0007039987831376493,
-0.012401307933032513,
0.051076799631118774,
-0.1682175248861313,
0.030786538496613503,
0.14742203056812286,
0.0027169862296432257,
0.11752926558256149,
0.06319882720708847,
-0.06321156024932861,
-0.007134811021387577,
0.049133121967315674,
-0.18155427277088165,
-0.08285916596651077,
-0.010584049858152866,
0.010095035657286644,
-0.14054149389266968,
0.06606565415859222,
0.10521182417869568,
-0.07349129021167755,
-0.025234419852495193,
0.00028883255436085165,
0.009552503004670143,
-0.02224339358508587,
0.1863895207643509,
0.07228333503007889,
0.06988339126110077,
-0.10634329915046692,
0.09283725917339325,
0.05232072249054909,
-0.07614095509052277,
0.042675863951444626,
0.0670136958360672,
-0.11214848607778549,
-0.02277013100683689,
0.061373841017484665,
0.14844553172588348,
-0.020398838445544243,
-0.062468890100717545,
-0.17536185681819916,
-0.1063779816031456,
0.09143218398094177,
0.12016201764345169,
0.10274738073348999,
0.015708038583397865,
-0.041042134165763855,
-0.01863032393157482,
-0.11133535206317902,
0.09828272461891174,
0.07069218158721924,
0.062345054000616074,
-0.15972255170345306,
0.11580965667963028,
-0.005582467652857304,
0.0673426017165184,
-0.007885712198913097,
0.015531334094703197,
-0.094784215092659,
0.0008056078222580254,
-0.09358330070972443,
0.020181704312562943,
-0.029838014394044876,
-0.003756989259272814,
-0.01978219859302044,
-0.04112142696976662,
-0.052657920867204666,
0.038761645555496216,
-0.10572616010904312,
-0.03540357947349548,
0.024233359843492508,
0.025262825191020966,
-0.10956046730279922,
-0.03016810119152069,
0.00004374065974843688,
-0.08225184679031372,
0.08771076053380966,
0.052139971405267715,
-0.00394732179120183,
0.011389168910682201,
-0.0036440121475607157,
0.009537899866700172,
0.06617105007171631,
0.011675015091896057,
0.061572082340717316,
-0.11589525640010834,
-0.003293219953775406,
0.011826855130493641,
0.0013611342292279005,
0.022601554170250893,
0.12110871821641922,
-0.11655819416046143,
-0.010821192525327206,
-0.023522883653640747,
-0.022827301174402237,
-0.07513058185577393,
0.06335283815860748,
0.10025003552436829,
0.048545386642217636,
0.18745556473731995,
-0.06630514562129974,
0.016041036695241928,
-0.20688487589359283,
0.0048309508711099625,
0.006904290523380041,
-0.14436574280261993,
-0.05657172203063965,
-0.02251015417277813,
0.06533834338188171,
-0.06768489629030228,
0.1141919195652008,
0.015339010395109653,
-0.014805036596953869,
0.03599223867058754,
-0.043198004364967346,
-0.022587396204471588,
0.005286360625177622,
0.15932518243789673,
0.01039932481944561,
-0.032152608036994934,
0.11329957842826843,
0.013813790865242481,
0.10409053415060043,
0.12376110255718231,
0.17200680077075958,
0.12915633618831635,
0.033689770847558975,
0.10319431871175766,
0.026541225612163544,
-0.038830678910017014,
-0.1638840138912201,
0.06827042251825333,
-0.04830608516931534,
0.1259995549917221,
0.0037670086603611708,
0.18351177871227264,
0.11120174825191498,
-0.156941220164299,
0.04635832458734512,
-0.028745491057634354,
-0.09243429452180862,
-0.10279085487127304,
-0.09493518620729446,
-0.08163266628980637,
-0.14313632249832153,
-0.011727920733392239,
-0.12314673513174057,
-0.0001119606604333967,
0.06378895789384842,
0.008631679229438305,
-0.01658823899924755,
0.11941222846508026,
0.0199019443243742,
-0.0028313337825238705,
0.07997620105743408,
-0.0009426053147763014,
-0.05454117804765701,
-0.06781815737485886,
-0.06792928278446198,
0.01648208126425743,
0.016566617414355278,
0.06634296476840973,
-0.027873553335666656,
0.013387996703386307,
0.04931917414069176,
-0.020463308319449425,
-0.10177630931138992,
0.007021164055913687,
0.01602855883538723,
0.05895385518670082,
0.042454756796360016,
0.012704790569841862,
-0.003998691216111183,
-0.01944013684988022,
0.18521079421043396,
-0.06868687272071838,
-0.03497106954455376,
-0.11111391335725784,
0.2134825736284256,
0.030410684645175934,
-0.04923909157514572,
0.045359402894973755,
-0.07342531532049179,
-0.020052069798111916,
0.19482727348804474,
0.2061251550912857,
-0.025739775970578194,
-0.004483010619878769,
-0.0011589471250772476,
-0.016989540308713913,
-0.016670143231749535,
0.08055374026298523,
0.14397574961185455,
0.0015819028485566378,
-0.07257590442895889,
-0.023858262225985527,
-0.06706216931343079,
0.00024062630836851895,
-0.057562168687582016,
0.05649811401963234,
0.02420194074511528,
0.007187279872596264,
-0.04374741017818451,
0.021545272320508957,
-0.04005713388323784,
-0.08320048451423645,
0.022942014038562775,
-0.20889990031719208,
-0.16554507613182068,
-0.0002199718583142385,
0.04263186827301979,
-0.002593814395368099,
0.058943089097738266,
-0.005378745496273041,
0.01758071780204773,
0.0853714644908905,
-0.029487308114767075,
-0.07979992032051086,
-0.07162126898765564,
0.09276914596557617,
-0.16957968473434448,
0.18944604694843292,
-0.04122228920459747,
0.0698293074965477,
0.1310187429189682,
0.03684106469154358,
-0.10431881248950958,
0.03226975351572037,
0.03390989080071449,
-0.01958799734711647,
-0.004574302118271589,
0.10153369605541229,
-0.0165591798722744,
0.054169923067092896,
0.04157818853855133,
-0.09150781482458115,
-0.04808119311928749,
-0.063088059425354,
-0.012918472290039062,
-0.036404240876436234,
-0.03994256258010864,
-0.05341881513595581,
0.11710771918296814,
0.18869613111019135,
-0.04988042265176773,
-0.01513571199029684,
-0.0680389553308487,
0.009178905747830868,
0.0669013038277626,
-0.017664512619376183,
-0.035580288618803024,
-0.251682847738266,
0.011223016306757927,
0.05258801579475403,
-0.015884650871157646,
-0.2378324270248413,
-0.09532994776964188,
-0.006329922471195459,
-0.07142023742198944,
-0.08623926341533661,
0.0746576189994812,
0.04345089569687843,
0.05326208472251892,
-0.06762698292732239,
0.0311188492923975,
-0.09114693105220795,
0.1469976007938385,
-0.14606402814388275,
-0.0775352269411087
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-base-finetuned-stsb
This model is a fine-tuned version of [google/fnet-base](https://huggingface.co/google/fnet-base) on the GLUE STSB dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7894
- Pearson: 0.8256
- Spearmanr: 0.8219
- Combined Score: 0.8238
The model was fine-tuned to compare [google/fnet-base](https://huggingface.co/google/fnet-base), introduced in [this paper](https://arxiv.org/abs/2105.03824), against [bert-base-cased](https://huggingface.co/bert-base-cased).
## Model description
More information needed
## Intended uses & limitations
More information needed
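As an illustrative sketch only (the sentence pair is an assumption, not part of this card), the checkpoint can be used to score sentence similarity; STS-B is a regression task, so the sequence-classification head has a single output roughly on the 0–5 scale:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# A minimal sketch: STS-B similarity scoring with the single-output regression head.
model_id = "gchhablani/fnet-base-finetuned-stsb"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer(
    "A man is playing a guitar.",
    "A person plays a musical instrument.",
    truncation=True,
    return_tensors="pt",
)
with torch.no_grad():
    similarity = model(**inputs).logits.squeeze().item()

print(similarity)  # roughly on the 0-5 STS-B scale
```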
## Training and evaluation data
More information needed
## Training procedure
This model was trained using the [run_glue](https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py) script. The following command was used:
```bash
#!/usr/bin/bash
python ../run_glue.py \
  --model_name_or_path google/fnet-base \
  --task_name stsb \
  --do_train \
  --do_eval \
  --max_seq_length 512 \
  --per_device_train_batch_size 16 \
  --learning_rate 2e-5 \
  --num_train_epochs 3 \
  --output_dir fnet-base-finetuned-stsb \
  --push_to_hub \
  --hub_strategy all_checkpoints \
  --logging_strategy epoch \
  --save_strategy epoch \
  --evaluation_strategy epoch
```
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Combined Score | Validation Loss | Pearson | Spearmanr |
|:-------------:|:-----:|:----:|:--------------:|:---------------:|:-------:|:---------:|
| 1.5473 | 1.0 | 360 | 0.8120 | 0.7751 | 0.8115 | 0.8125 |
| 0.6954 | 2.0 | 720 | 0.8145 | 0.8717 | 0.8160 | 0.8130 |
| 0.4828 | 3.0 | 1080 | 0.8238 | 0.7894 | 0.8256 | 0.8219 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer", "fnet-bert-base-comparison"], "datasets": ["glue"], "metrics": ["spearmanr"], "model-index": [{"name": "fnet-base-finetuned-stsb", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE STSB", "type": "glue", "args": "stsb"}, "metrics": [{"type": "spearmanr", "value": 0.8219397497728022, "name": "Spearmanr"}]}]}]}
|
text-classification
|
gchhablani/fnet-base-finetuned-stsb
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"fnet-bert-base-comparison",
"en",
"dataset:glue",
"arxiv:2105.03824",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.03824"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-base-finetuned-stsb
========================
This model is a fine-tuned version of google/fnet-base on the GLUE STSB dataset.
It achieves the following results on the evaluation set:
* Loss: 0.7894
* Pearson: 0.8256
* Spearmanr: 0.8219
* Combined Score: 0.8238
The model was fine-tuned to compare google/fnet-base as introduced in this paper against bert-base-cased.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
This model is trained using the run\_glue script. The following command was used:
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
88,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.1281575709581375,
0.15705668926239014,
-0.0038979940582066774,
0.11255989223718643,
0.11293601244688034,
-0.008617952466011047,
0.13138744235038757,
0.1563730537891388,
-0.1094919964671135,
0.05590956658124924,
0.1562153697013855,
0.15063650906085968,
0.03187095746397972,
0.18714870512485504,
-0.05448957160115242,
-0.23206175863742828,
0.030974131077528,
0.07522018253803253,
-0.05673399940133095,
0.13729923963546753,
0.09973176568746567,
-0.11717205494642258,
0.09386496245861053,
0.041752949357032776,
-0.20222944021224976,
-0.015307755209505558,
-0.0008101215353235602,
-0.08269140124320984,
0.11289821565151215,
0.02265351265668869,
0.08919014781713486,
0.03242024779319763,
0.03147246316075325,
-0.1466279774904251,
0.007479942869395018,
0.055549394339323044,
0.0026599117554724216,
0.11788346618413925,
0.03937623277306557,
-0.014160647056996822,
0.07863719761371613,
-0.09087078273296356,
0.046092916280031204,
0.026117689907550812,
-0.10794810950756073,
-0.2786095142364502,
-0.08820846676826477,
0.07192650437355042,
0.042675234377384186,
0.0806850865483284,
0.001147752278484404,
0.17639178037643433,
-0.007389090955257416,
0.11422055959701538,
0.2610849142074585,
-0.3091604709625244,
-0.061695314943790436,
0.024227019399404526,
0.009995708242058754,
0.06274503469467163,
-0.08376740664243698,
-0.031031707301735878,
0.04775812104344368,
0.04504694044589996,
0.17843511700630188,
-0.011732942424714565,
-0.007466572802513838,
-0.025801893323659897,
-0.14371579885482788,
-0.07526436448097229,
0.19688530266284943,
0.04901329055428505,
-0.048377830535173416,
-0.06737475097179413,
-0.08691056817770004,
-0.16888527572155,
-0.03213018923997879,
-0.018070582300424576,
0.04411018267273903,
-0.0378425307571888,
-0.060805536806583405,
-0.029689611867070198,
-0.07623965293169022,
-0.042504265904426575,
-0.042007382959127426,
0.15460588037967682,
0.05135969817638397,
0.03275564685463905,
-0.03648478910326958,
0.08528219908475876,
-0.016176627948880196,
-0.15604232251644135,
-0.0033359206281602383,
0.008210553787648678,
0.032338954508304596,
-0.01879490725696087,
-0.03313892334699631,
-0.09054049104452133,
0.012796161696314812,
0.12658193707466125,
-0.09896866977214813,
0.0699896514415741,
0.014115668833255768,
0.05199545621871948,
-0.0832819789648056,
0.18305599689483643,
-0.03047999180853367,
0.014890378341078758,
0.025784319266676903,
0.0863259807229042,
0.045972537249326706,
-0.02756713703274727,
-0.10970760136842728,
0.026695188134908676,
0.13855181634426117,
0.006172229070216417,
-0.03736800700426102,
0.07175908982753754,
-0.04684898629784584,
-0.04453456029295921,
0.056356143206357956,
-0.11108183860778809,
0.012465346604585648,
0.0007881103083491325,
-0.08858437836170197,
-0.03560861945152283,
0.026957271620631218,
-0.012466381303966045,
-0.04220158979296684,
0.05419426038861275,
-0.0943819209933281,
0.009732727892696857,
-0.05831371992826462,
-0.1146949976682663,
0.013262145221233368,
-0.11782565712928772,
0.0005620947922579944,
-0.11043663322925568,
-0.13709686696529388,
-0.008001538924872875,
0.05364624038338661,
-0.021408166736364365,
-0.0810130313038826,
-0.055306028574705124,
-0.09359939396381378,
0.02663268707692623,
-0.015616829507052898,
0.05365283414721489,
-0.06507189571857452,
0.08449108898639679,
0.049848102033138275,
0.07236256450414658,
-0.03752796724438667,
0.04536004737019539,
-0.07848503440618515,
0.0464148111641407,
-0.20472228527069092,
0.07016371190547943,
-0.064207062125206,
0.06198544800281525,
-0.1068328395485878,
-0.11787286400794983,
0.030897412449121475,
-0.03552781045436859,
0.08240031450986862,
0.09914099425077438,
-0.14732807874679565,
-0.08291295915842056,
0.19005797803401947,
-0.08365947753190994,
-0.12725262343883514,
0.12337704747915268,
-0.04553937166929245,
0.0029195500537753105,
0.05737419053912163,
0.2257438600063324,
0.0889558345079422,
-0.0515265017747879,
-0.02331758849322796,
-0.0051168533973395824,
0.04387769103050232,
-0.07846325635910034,
0.08052174746990204,
0.006332707591354847,
0.04367716237902641,
0.02254851721227169,
-0.029947886243462563,
0.04004385322332382,
-0.08367564529180527,
-0.08474674820899963,
-0.05332755297422409,
-0.08056257665157318,
0.056404486298561096,
0.04767722263932228,
0.08231600373983383,
-0.11374399065971375,
-0.094481460750103,
0.02864118292927742,
0.09102724492549896,
-0.08461901545524597,
0.04763771593570709,
-0.09820594638586044,
0.12363352626562119,
-0.06146226450800896,
-0.004213334526866674,
-0.18391123414039612,
-0.009381342679262161,
0.0438222773373127,
-0.023048149421811104,
-0.0010736447293311357,
-0.017531832680106163,
0.0634535625576973,
0.05774131044745445,
-0.039399147033691406,
-0.0408325120806694,
-0.033703941851854324,
-0.009450753219425678,
-0.11244195699691772,
-0.18618963658809662,
-0.04849305376410484,
-0.034880541265010834,
0.11852548271417618,
-0.17111770808696747,
0.06201837584376335,
0.06472091376781464,
0.11162162572145462,
0.02296465076506138,
-0.02818126603960991,
-0.0076725417748093605,
0.041496098041534424,
-0.039293959736824036,
-0.0763016790151596,
0.06855007261037827,
0.03613912686705589,
-0.0994340106844902,
-0.021636441349983215,
-0.11005594581365585,
0.18589626252651215,
0.12711931765079498,
-0.015304340980947018,
-0.046145275235176086,
-0.0037261333782225847,
-0.06693658232688904,
-0.0252322256565094,
0.00010479353659320623,
0.02630312740802765,
0.18688923120498657,
0.007668614853173494,
0.17470361292362213,
-0.10448402166366577,
-0.054649028927087784,
0.04829766973853111,
-0.027804838493466377,
-0.012250205501914024,
0.11106427013874054,
0.000900426646694541,
-0.11244355887174606,
0.14791768789291382,
0.12212676554918289,
-0.06123341992497444,
0.1253422647714615,
-0.06269760429859161,
-0.042313165962696075,
-0.03154568001627922,
0.0070209321565926075,
0.016009759157896042,
0.09681913256645203,
-0.12338456511497498,
-0.019060473889112473,
0.03688754513859749,
0.026065392419695854,
0.022046322003006935,
-0.18368276953697205,
0.0016608438454568386,
0.046017080545425415,
-0.06232096627354622,
-0.0001330666127614677,
-0.01134573481976986,
-0.0014391003642231226,
0.09911268204450607,
0.018035314977169037,
-0.08723576366901398,
0.04871227592229843,
0.008767826482653618,
-0.07373929023742676,
0.20202793180942535,
-0.10205624997615814,
-0.19618737697601318,
-0.11792904138565063,
-0.0520332008600235,
-0.09464605152606964,
-0.00033909876947291195,
0.06213480234146118,
-0.07916445285081863,
-0.022073807194828987,
-0.09520765393972397,
-0.03123565763235092,
-0.01737889274954796,
0.03542753681540489,
0.07313533127307892,
-0.02203269489109516,
0.10872673243284225,
-0.1269455999135971,
-0.02306712418794632,
-0.0358131118118763,
0.008673591539263725,
0.055851276963949203,
0.009536282159388065,
0.09724908322095871,
0.1170153021812439,
-0.033818624913692474,
0.05197916179895401,
-0.028282886371016502,
0.24284608662128448,
-0.053264424204826355,
-0.02976847253739834,
0.12733469903469086,
-0.0040460084564983845,
0.08623170107603073,
0.08496475964784622,
0.04557865113019943,
-0.08525460958480835,
-0.01157840620726347,
0.00595470005646348,
-0.03817666694521904,
-0.21714363992214203,
-0.036886055022478104,
-0.04140350595116615,
0.02079310268163681,
0.10649195313453674,
0.04592718183994293,
0.043558333069086075,
0.057499635964632034,
0.023834481835365295,
0.04960625246167183,
-0.02408022992312908,
0.09037080407142639,
0.1317465603351593,
0.04990197345614433,
0.13514135777950287,
-0.04064502939581871,
-0.03152943775057793,
0.039630502462387085,
-0.002475862158462405,
0.19950714707374573,
-0.02927609719336033,
0.1829756200313568,
0.04599745571613312,
0.1883370578289032,
0.011177031323313713,
0.06271716207265854,
-0.024891022592782974,
-0.006736312061548233,
-0.009674341417849064,
-0.0412435419857502,
-0.05800659582018852,
0.0024435357190668583,
-0.04474163055419922,
0.07293359190225601,
-0.117747962474823,
0.040190309286117554,
0.06010587513446808,
0.29027169942855835,
0.022407572716474533,
-0.37493225932121277,
-0.11181744188070297,
-0.016597270965576172,
-0.031712714582681656,
-0.04552937671542168,
0.010863134637475014,
0.11205030232667923,
-0.09440278261899948,
0.06324154883623123,
-0.08447496592998505,
0.09077057987451553,
-0.07360748946666718,
0.03601958230137825,
0.05221237242221832,
0.09796657413244247,
0.006393694784492254,
0.06149907410144806,
-0.26434561610221863,
0.25069257616996765,
0.018371766433119774,
0.05161698907613754,
-0.06146470457315445,
0.015569853596389294,
0.02125987596809864,
0.06923633813858032,
0.07857105880975723,
0.0007500528590753675,
-0.02767954021692276,
-0.16410666704177856,
-0.11515739560127258,
0.016607077792286873,
0.07020550966262817,
-0.02121029421687126,
0.08431950211524963,
-0.0010764591861516237,
0.004067974630743265,
0.04312531650066376,
-0.008349082432687283,
-0.027957666665315628,
-0.09104249626398087,
0.016294151544570923,
0.06238500773906708,
-0.04210955649614334,
-0.08233670145273209,
-0.1198863536119461,
-0.08412427455186844,
0.1865774542093277,
-0.008332050405442715,
-0.0823029950261116,
-0.12076611816883087,
0.06184208020567894,
0.06703545898199081,
-0.09185026586055756,
0.042081426829099655,
-0.017016833648085594,
0.12382151186466217,
0.003344601020216942,
-0.07244402170181274,
0.09798984974622726,
-0.04916444793343544,
-0.16366566717624664,
-0.03181470185518265,
0.13521257042884827,
0.034096188843250275,
0.05754973739385605,
-0.012549868784844875,
0.019313739612698555,
-0.026896705850958824,
-0.0765208750963211,
0.03267578408122063,
0.002446891972795129,
0.09423114359378815,
-0.015256565995514393,
0.0032036362681537867,
0.028266185894608498,
-0.07761038839817047,
0.005427155178040266,
0.19771218299865723,
0.2606421709060669,
-0.1043548434972763,
0.03651893138885498,
0.027820715680718422,
-0.04207947105169296,
-0.1520952433347702,
0.017168138176202774,
0.08070502430200577,
0.007420012727379799,
-0.010656541213393211,
-0.1764611452817917,
0.05012203007936478,
0.08809427171945572,
-0.01788947917521,
0.08023256808519363,
-0.29865652322769165,
-0.11554376780986786,
0.11132355779409409,
0.12564530968666077,
0.09193845093250275,
-0.13787555694580078,
-0.04598912596702576,
-0.0075193652883172035,
-0.12548284232616425,
0.1158299669623375,
-0.0956348180770874,
0.10974547266960144,
-0.04320225492119789,
0.05354843661189079,
0.007337958551943302,
-0.052445266395807266,
0.12752960622310638,
0.01908416859805584,
0.0783768892288208,
-0.0478614941239357,
0.003130370983853936,
0.10798399150371552,
-0.0758182629942894,
0.06092115119099617,
-0.10174112766981125,
0.04936999827623367,
-0.12913434207439423,
-0.01087163481861353,
-0.08215732127428055,
0.02880539558827877,
-0.029542503878474236,
-0.03560418263077736,
-0.058059368282556534,
0.009000681340694427,
0.07482077181339264,
-0.0021140656899660826,
0.17803633213043213,
0.047338198870420456,
0.12804824113845825,
0.2045678347349167,
0.07827051728963852,
-0.11816922575235367,
-0.10215629637241364,
-0.0021673408336937428,
-0.009365938603878021,
0.0566975399851799,
-0.15991105139255524,
0.03947640210390091,
0.1441853940486908,
0.004068439826369286,
0.11697175353765488,
0.06757407635450363,
-0.058885324746370316,
-0.00002974093695229385,
0.04177243635058403,
-0.17993171513080597,
-0.10290152579545975,
-0.014580353163182735,
-0.012066712602972984,
-0.13221070170402527,
0.07946693897247314,
0.11126816272735596,
-0.07109289616346359,
-0.018744582310318947,
0.0011641265591606498,
-0.0008233633125200868,
-0.025426305830478668,
0.18059362471103668,
0.0713382288813591,
0.07033462822437286,
-0.10616692155599594,
0.09460733085870743,
0.0462157279253006,
-0.0725565180182457,
0.04061013460159302,
0.06810374557971954,
-0.11764568090438843,
-0.022825488820672035,
0.037216611206531525,
0.14856918156147003,
-0.027179433032870293,
-0.05499974638223648,
-0.17361967265605927,
-0.1086450144648552,
0.08841122686862946,
0.13100364804267883,
0.10317601263523102,
0.017121080309152603,
-0.04885329306125641,
-0.009807179681956768,
-0.11012768745422363,
0.09655274450778961,
0.05846559628844261,
0.0660039484500885,
-0.16399246454238892,
0.11977937817573547,
-0.0003578494652174413,
0.07169348001480103,
-0.012192845344543457,
0.001448655384592712,
-0.0843949168920517,
0.004430484492331743,
-0.09232339262962341,
0.012558856047689915,
-0.04001414775848389,
0.00048618309665471315,
-0.022317882627248764,
-0.04360825940966606,
-0.0546182319521904,
0.03691510483622551,
-0.10589302331209183,
-0.03589541092514992,
0.030760591849684715,
0.02635170891880989,
-0.11072052270174026,
-0.031555552035570145,
0.0018842692952603102,
-0.08543401211500168,
0.09491559118032455,
0.053661517798900604,
-0.007116627413779497,
0.009966611862182617,
-0.003835842479020357,
0.002103820675984025,
0.0713348463177681,
0.007812848314642906,
0.06204405426979065,
-0.11863148212432861,
-0.004516935441643,
0.004596107639372349,
0.001546198152936995,
0.019018597900867462,
0.12120135128498077,
-0.12007837742567062,
-0.014022485353052616,
-0.022460926324129105,
-0.022654950618743896,
-0.0753025934100151,
0.05987093597650528,
0.10378355532884598,
0.041243866086006165,
0.19044536352157593,
-0.06453124433755875,
0.015417633578181267,
-0.2059982270002365,
-0.0010957218473777175,
0.009144444018602371,
-0.1515446901321411,
-0.05387648567557335,
-0.025656811892986298,
0.05819327011704445,
-0.07224544882774353,
0.11503966152667999,
0.004568712320178747,
-0.01583913527429104,
0.03687077388167381,
-0.044159963726997375,
-0.016699714586138725,
0.008686945773661137,
0.16097450256347656,
0.013965492136776447,
-0.037191856652498245,
0.12147503346204758,
0.023390216752886772,
0.1060599610209465,
0.12157408893108368,
0.1685730367898941,
0.12814003229141235,
0.026453150436282158,
0.10737672448158264,
0.032515060156583786,
-0.030444564297795296,
-0.18784967064857483,
0.06720752269029617,
-0.04578623175621033,
0.13226191699504852,
0.006823073141276836,
0.17648069560527802,
0.11036133766174316,
-0.15301413834095,
0.049124132841825485,
-0.021638398990035057,
-0.09448863565921783,
-0.10050928592681885,
-0.09679119288921356,
-0.07743744552135468,
-0.15149177610874176,
-0.01236912701278925,
-0.11908751726150513,
-0.001917796558700502,
0.06672867387533188,
0.0058901347219944,
-0.021161947399377823,
0.13932976126670837,
0.023673847317695618,
-0.0022931741550564766,
0.0803397074341774,
-0.0035189997870475054,
-0.05596965178847313,
-0.07125981897115707,
-0.07409729808568954,
0.013776594772934914,
0.013752772472798824,
0.05871286615729332,
-0.02784240059554577,
0.009659827686846256,
0.051329080015420914,
-0.02629757858812809,
-0.10809190571308136,
0.009442382492125034,
0.021296720951795578,
0.054612502455711365,
0.020937973633408546,
0.013347321189939976,
-0.011912370100617409,
-0.017500825226306915,
0.17539910972118378,
-0.06457335501909256,
-0.022280963137745857,
-0.10723160207271576,
0.20094625651836395,
0.036107197403907776,
-0.03869583457708359,
0.040720485150814056,
-0.06905907392501831,
-0.020572103559970856,
0.18964605033397675,
0.20552918314933777,
-0.026153136044740677,
0.0006357349921017885,
-0.004643670748919249,
-0.01797187142074108,
-0.007700672838836908,
0.08245248347520828,
0.13317352533340454,
-0.0074555822648108006,
-0.06725732237100601,
-0.026400936767458916,
-0.06664355099201202,
-0.004993689712136984,
-0.048690151423215866,
0.06622577458620071,
0.027980800718069077,
0.012009805999696255,
-0.046264056116342545,
0.022495536133646965,
-0.042431484907865524,
-0.07840681821107864,
0.030966809019446373,
-0.20489253103733063,
-0.15987305343151093,
-0.008346200920641422,
0.03857562318444252,
0.0004496309265960008,
0.062375955283641815,
-0.0009975176071748137,
0.012196631170809269,
0.0908537283539772,
-0.02237413451075554,
-0.08646247535943985,
-0.07932375371456146,
0.09022057801485062,
-0.17671364545822144,
0.2005920559167862,
-0.04314102232456207,
0.049184780567884445,
0.1326240599155426,
0.04266604781150818,
-0.10272050648927689,
0.0379687063395977,
0.03511279076337814,
-0.008898447267711163,
-0.004219701513648033,
0.09855833649635315,
-0.012647976167500019,
0.07372915744781494,
0.03857356309890747,
-0.09686969965696335,
-0.039119016379117966,
-0.07064490765333176,
-0.0207537654787302,
-0.040169861167669296,
-0.052368659526109695,
-0.05137869715690613,
0.11046379059553146,
0.19000554084777832,
-0.049329470843076706,
-0.017489463090896606,
-0.06654652208089828,
0.005614506546407938,
0.07303186506032944,
-0.011176830157637596,
-0.04266827553510666,
-0.253805011510849,
0.010820526629686356,
0.046582285314798355,
-0.017032980918884277,
-0.2275182157754898,
-0.1029951274394989,
-0.004221665672957897,
-0.06776802986860275,
-0.07437815517187119,
0.07514068484306335,
0.04294868931174278,
0.05498082563281059,
-0.06696301698684692,
0.028071772307157516,
-0.09679107367992401,
0.15699414908885956,
-0.15248030424118042,
-0.07556789368391037
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-base-finetuned-wnli
This model is a fine-tuned version of [google/fnet-base](https://huggingface.co/google/fnet-base) on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6887
- Accuracy: 0.5493
The model was fine-tuned to compare [google/fnet-base](https://huggingface.co/google/fnet-base) as introduced in [this paper](https://arxiv.org/abs/2105.03824) against [bert-base-cased](https://huggingface.co/bert-base-cased).
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
This model is trained using the [run_glue](https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py) script. The following command was used:
```bash
#!/usr/bin/bash
python ../run_glue.py \
  --model_name_or_path google/fnet-base \
  --task_name wnli \
  --do_train \
  --do_eval \
  --max_seq_length 512 \
  --per_device_train_batch_size 16 \
  --learning_rate 2e-5 \
  --num_train_epochs 5 \
  --output_dir fnet-base-finetuned-wnli \
  --push_to_hub \
  --hub_strategy all_checkpoints \
  --logging_strategy epoch \
  --save_strategy epoch \
  --evaluation_strategy epoch
```
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7052 | 1.0 | 40 | 0.6902 | 0.5634 |
| 0.6957 | 2.0 | 80 | 0.7013 | 0.4366 |
| 0.6898 | 3.0 | 120 | 0.6898 | 0.5352 |
| 0.6958 | 4.0 | 160 | 0.6874 | 0.5634 |
| 0.6982 | 5.0 | 200 | 0.6887 | 0.5493 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
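### Usage sketch
Beyond the original card, a minimal inference sketch for this checkpoint is shown below. It assumes `transformers` (4.11 or later, which added FNet), `torch`, and `sentencepiece` are installed; the example sentence pair and the label reading are illustrative only and are not taken from the card.
```python
# Minimal inference sketch for the fine-tuned WNLI checkpoint (illustrative only).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gchhablani/fnet-base-finetuned-wnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# WNLI is a sentence-pair task, so premise and hypothesis are encoded together.
premise = "The trophy doesn't fit in the suitcase because it is too big."
hypothesis = "The trophy is too big."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # 0 or 1 (in GLUE WNLI, 1 corresponds to entailment)
```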
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer", "fnet-bert-base-comparison"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "fnet-base-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE WNLI", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5492957746478874, "name": "Accuracy"}]}]}]}
|
text-classification
|
gchhablani/fnet-base-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"fnet-bert-base-comparison",
"en",
"dataset:glue",
"arxiv:2105.03824",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.03824"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-base-finetuned-wnli
========================
This model is a fine-tuned version of google/fnet-base on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6887
* Accuracy: 0.5493
The model was fine-tuned to compare google/fnet-base as introduced in this paper against bert-base-cased.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
This model is trained using the run\_glue script. The following command was used:
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
88,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #fnet-bert-base-comparison #en #dataset-glue #arxiv-2105.03824 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.1294347047805786,
0.15776823461055756,
-0.003917243331670761,
0.11227939277887344,
0.11287269741296768,
-0.00867973268032074,
0.13006214797496796,
0.15599386394023895,
-0.10833613574504852,
0.05695851519703865,
0.15625059604644775,
0.14928914606571198,
0.031990837305784225,
0.18835246562957764,
-0.054748013615608215,
-0.2318466156721115,
0.030150001868605614,
0.0764521062374115,
-0.05639369785785675,
0.13710667192935944,
0.0989472046494484,
-0.11860295385122299,
0.09347143769264221,
0.04175164923071861,
-0.2037433683872223,
-0.01586032286286354,
0.00032077240757644176,
-0.08281534165143967,
0.11295787245035172,
0.022520827129483223,
0.08912119269371033,
0.03228167071938515,
0.03050021082162857,
-0.1462295800447464,
0.007699727546423674,
0.05618569254875183,
0.0025992360897362232,
0.11889952421188354,
0.03863006457686424,
-0.013525396585464478,
0.07942727953195572,
-0.09125499427318573,
0.04518340900540352,
0.02592192031443119,
-0.1080249696969986,
-0.2819972038269043,
-0.08688072115182877,
0.07268423587083817,
0.04202193394303322,
0.0803443118929863,
0.0008178367861546576,
0.17575936019420624,
-0.007282824721187353,
0.1144152283668518,
0.2596985697746277,
-0.3110000193119049,
-0.06116798520088196,
0.02316102758049965,
0.009494342841207981,
0.06345420330762863,
-0.08541600406169891,
-0.031247681006789207,
0.048284538090229034,
0.04325330629944801,
0.1794067770242691,
-0.011458450928330421,
-0.007149836979806423,
-0.025910677388310432,
-0.14341889321804047,
-0.0757712796330452,
0.19806008040905,
0.04967270791530609,
-0.047597505152225494,
-0.06660094112157822,
-0.08804943412542343,
-0.1688326746225357,
-0.03264717012643814,
-0.01801031455397606,
0.04462956637144089,
-0.03791727125644684,
-0.0589226633310318,
-0.028534654527902603,
-0.0764043927192688,
-0.042577486485242844,
-0.042390674352645874,
0.15392567217350006,
0.05139201879501343,
0.032932400703430176,
-0.03843959793448448,
0.08476367592811584,
-0.018588144332170486,
-0.15685492753982544,
-0.003073766129091382,
0.00831099133938551,
0.032669827342033386,
-0.018471626564860344,
-0.03288761153817177,
-0.09086716920137405,
0.013486159965395927,
0.12598945200443268,
-0.09839624166488647,
0.06968382745981216,
0.012926696799695492,
0.051633335649967194,
-0.08377796411514282,
0.18277320265769958,
-0.029339900240302086,
0.013655705377459526,
0.02643161453306675,
0.08627751469612122,
0.045253686606884,
-0.027602221816778183,
-0.10944601148366928,
0.026082325726747513,
0.138589009642601,
0.005100409034639597,
-0.03699839860200882,
0.07159482687711716,
-0.046671394258737564,
-0.04377603530883789,
0.055555734783411026,
-0.11096695065498352,
0.013195594772696495,
0.0012843089643865824,
-0.08894145488739014,
-0.036723051220178604,
0.026743998751044273,
-0.01389769185334444,
-0.043164223432540894,
0.05528248846530914,
-0.09538555145263672,
0.009111303836107254,
-0.059108372777700424,
-0.11533742398023605,
0.01289414893835783,
-0.11949043720960617,
0.0013519255444407463,
-0.109888456761837,
-0.1352292001247406,
-0.008098151534795761,
0.05253935232758522,
-0.02088603377342224,
-0.08179537206888199,
-0.05587518587708473,
-0.09437811374664307,
0.027021823450922966,
-0.01534327119588852,
0.0546845868229866,
-0.0649983137845993,
0.08562931418418884,
0.05025998502969742,
0.0732530802488327,
-0.03711319342255592,
0.045747533440589905,
-0.07925648242235184,
0.04529445618391037,
-0.20424382388591766,
0.06939747184515,
-0.06434474140405655,
0.06229047104716301,
-0.10715609043836594,
-0.117185078561306,
0.029869243502616882,
-0.03632883355021477,
0.08210492134094238,
0.09986089169979095,
-0.1464376151561737,
-0.08419061452150345,
0.19336776435375214,
-0.08402501791715622,
-0.1270311027765274,
0.12291108816862106,
-0.04557105898857117,
0.002800210379064083,
0.05785496532917023,
0.22697393596172333,
0.08953249454498291,
-0.05094270780682564,
-0.023491637781262398,
-0.005242811515927315,
0.043609559535980225,
-0.07905299216508865,
0.07979247719049454,
0.006977790035307407,
0.04260342940688133,
0.022792303934693336,
-0.029363535344600677,
0.04053501412272453,
-0.08387166261672974,
-0.08405832201242447,
-0.054650142788887024,
-0.08046764135360718,
0.05724374204874039,
0.04899771884083748,
0.08289357274770737,
-0.11314118653535843,
-0.09415262937545776,
0.027980055660009384,
0.09097029268741608,
-0.08321060240268707,
0.04656745120882988,
-0.09828027337789536,
0.1251775026321411,
-0.0622311495244503,
-0.0028876904398202896,
-0.18514208495616913,
-0.0072322809137403965,
0.04435204342007637,
-0.02490418776869774,
-0.000853859179187566,
-0.01722896471619606,
0.06357277929782867,
0.05737774819135666,
-0.038729820400476456,
-0.04096408560872078,
-0.034338995814323425,
-0.009837169200181961,
-0.11381691694259644,
-0.1878732591867447,
-0.04747845232486725,
-0.03478720784187317,
0.11789482831954956,
-0.17237557470798492,
0.06186696141958237,
0.06596499681472778,
0.11241570860147476,
0.024192502722144127,
-0.02798614464700222,
-0.0078679658472538,
0.04189065843820572,
-0.03845973685383797,
-0.07504423707723618,
0.06836918741464615,
0.03562292829155922,
-0.09856722503900528,
-0.022597767412662506,
-0.10956339538097382,
0.1883760541677475,
0.12811586260795593,
-0.01528948824852705,
-0.04474876448512077,
-0.0030113121028989553,
-0.06595233082771301,
-0.025376109406352043,
0.0001992616307688877,
0.028176143765449524,
0.18498769402503967,
0.006765210535377264,
0.17534932494163513,
-0.10435620695352554,
-0.05398137867450714,
0.04991708695888519,
-0.026386620476841927,
-0.010547580197453499,
0.1123865395784378,
0.0013204298447817564,
-0.11275755614042282,
0.14809755980968475,
0.12280955165624619,
-0.06073061749339104,
0.12605755031108856,
-0.06380902230739594,
-0.043400008231401443,
-0.03124178946018219,
0.006571766920387745,
0.015356606803834438,
0.09826050698757172,
-0.12633495032787323,
-0.019942620769143105,
0.03755846247076988,
0.025127369910478592,
0.0213486235588789,
-0.1836351752281189,
0.0026766532100737095,
0.0459781289100647,
-0.06142926961183548,
-0.002965846098959446,
-0.010835528373718262,
-0.0011737890308722854,
0.09970453381538391,
0.017555667087435722,
-0.08726724237203598,
0.04739336669445038,
0.009003933519124985,
-0.07242876291275024,
0.20206153392791748,
-0.10128787904977798,
-0.19561681151390076,
-0.1166614294052124,
-0.04983629286289215,
-0.0937039852142334,
-0.0009387843892909586,
0.06328993290662766,
-0.07944092154502869,
-0.021747061982750893,
-0.09374503046274185,
-0.031456977128982544,
-0.016678253188729286,
0.03649067506194115,
0.07167337089776993,
-0.02152007259428501,
0.10843265056610107,
-0.1268416792154312,
-0.022187968716025352,
-0.036466311663389206,
0.009407520294189453,
0.055414751172065735,
0.010888321325182915,
0.09761083126068115,
0.11708344519138336,
-0.03449703007936478,
0.051886748522520065,
-0.02834194153547287,
0.2431972175836563,
-0.0549328438937664,
-0.029489416629076004,
0.12720809876918793,
-0.0036414244677871466,
0.08575236052274704,
0.0850614607334137,
0.04563970863819122,
-0.08425978571176529,
-0.012521781027317047,
0.00530668580904603,
-0.037102844566106796,
-0.21723859012126923,
-0.0365767739713192,
-0.04194740578532219,
0.021104680374264717,
0.10723866522312164,
0.04582393541932106,
0.04446491226553917,
0.05713071674108505,
0.02403874322772026,
0.05055225268006325,
-0.02454719878733158,
0.09011561423540115,
0.1325213462114334,
0.04956340417265892,
0.13514669239521027,
-0.041864071041345596,
-0.031858645379543304,
0.03866573050618172,
-0.0034419160801917315,
0.2011357694864273,
-0.028623657301068306,
0.18386606872081757,
0.04475536197423935,
0.19028805196285248,
0.012321713380515575,
0.06256132572889328,
-0.0251434538513422,
-0.006521121598780155,
-0.009183596819639206,
-0.04077095165848732,
-0.057392667979002,
0.0030708396807312965,
-0.04522627592086792,
0.07294609397649765,
-0.11834508180618286,
0.03892838582396507,
0.06002010405063629,
0.29006633162498474,
0.022332480177283287,
-0.37384721636772156,
-0.1128094345331192,
-0.01751832291483879,
-0.03198004513978958,
-0.04660427197813988,
0.010848252102732658,
0.11287738382816315,
-0.09319595247507095,
0.06235712766647339,
-0.08436550945043564,
0.09173392504453659,
-0.07310128211975098,
0.036443956196308136,
0.0506972000002861,
0.09780693054199219,
0.006120195612311363,
0.06156545877456665,
-0.2656126320362091,
0.25246939063072205,
0.01898382045328617,
0.05165523290634155,
-0.06228072568774223,
0.014799673110246658,
0.020547624677419662,
0.07127467542886734,
0.07885634899139404,
0.0006794908549636602,
-0.028727702796459198,
-0.16280202567577362,
-0.11423774808645248,
0.01696728728711605,
0.07045066356658936,
-0.02243482507765293,
0.08354400843381882,
-0.0007569814915768802,
0.0032902355305850506,
0.04280740022659302,
-0.010342855006456375,
-0.027652321383357048,
-0.09026366472244263,
0.016356363892555237,
0.06345270574092865,
-0.041483666747808456,
-0.08211879432201385,
-0.11932772397994995,
-0.08241575956344604,
0.1833699345588684,
-0.01318553276360035,
-0.08128651231527328,
-0.12019886076450348,
0.06338709592819214,
0.06619332730770111,
-0.09245581924915314,
0.042988114058971405,
-0.017607081681489944,
0.1234922781586647,
0.0032546466682106256,
-0.07252337038516998,
0.09986564517021179,
-0.04792711138725281,
-0.16395272314548492,
-0.03170733153820038,
0.13534678518772125,
0.03350100293755531,
0.057126160711050034,
-0.013107032515108585,
0.019576072692871094,
-0.02810630574822426,
-0.07615172117948532,
0.033417459577322006,
0.0019211877370253205,
0.09283176064491272,
-0.016100896522402763,
0.0036490904167294502,
0.02513909339904785,
-0.07741467654705048,
0.005653811618685722,
0.1960393786430359,
0.2601768672466278,
-0.1054224744439125,
0.03657129034399986,
0.028584623709321022,
-0.04292646795511246,
-0.15217222273349762,
0.018080569803714752,
0.07971378415822983,
0.007898523472249508,
-0.012302767485380173,
-0.17773962020874023,
0.050347067415714264,
0.08671995997428894,
-0.018084589391946793,
0.08345312625169754,
-0.29868802428245544,
-0.11544010043144226,
0.11149483919143677,
0.12423747777938843,
0.09513396769762039,
-0.1379803866147995,
-0.04662996158003807,
-0.006842230912297964,
-0.12511427700519562,
0.11477506905794144,
-0.09774457663297653,
0.10926564782857895,
-0.04369644820690155,
0.05271214619278908,
0.0077659860253334045,
-0.0524640791118145,
0.12633778154850006,
0.01897483877837658,
0.07942014187574387,
-0.04729471355676651,
0.0028797932900488377,
0.10791897773742676,
-0.07520560175180435,
0.06073889136314392,
-0.10178165137767792,
0.04971356689929962,
-0.12877590954303741,
-0.010309635661542416,
-0.082453154027462,
0.0295798871666193,
-0.029397226870059967,
-0.03437960892915726,
-0.05814070999622345,
0.008260298520326614,
0.07413773238658905,
-0.0016154507175087929,
0.17978344857692719,
0.04773912951350212,
0.12875743210315704,
0.20528703927993774,
0.07694612443447113,
-0.11820431053638458,
-0.10233582556247711,
-0.0013466635718941689,
-0.00923001766204834,
0.05888792872428894,
-0.16294585168361664,
0.03940162807703018,
0.1446576565504074,
0.005404313560575247,
0.11615433543920517,
0.06740670651197433,
-0.05869709327816963,
-0.00048802205128595233,
0.0418846420943737,
-0.18120436370372772,
-0.1063351184129715,
-0.014679894782602787,
-0.012210211716592312,
-0.13097167015075684,
0.08067499846220016,
0.1108740046620369,
-0.07254352420568466,
-0.018809350207448006,
0.000591395131777972,
-0.0013111312873661518,
-0.02480907179415226,
0.1805391013622284,
0.07104653865098953,
0.07086271047592163,
-0.10655967146158218,
0.09469111263751984,
0.04809468612074852,
-0.07629316300153732,
0.040003977715969086,
0.06678993254899979,
-0.11828560382127762,
-0.02318212017416954,
0.03783298283815384,
0.14980897307395935,
-0.025542154908180237,
-0.05531641095876694,
-0.1733895242214203,
-0.1079295352101326,
0.08900009095668793,
0.1341186910867691,
0.10272374749183655,
0.016925804316997528,
-0.04834591969847679,
-0.009077857248485088,
-0.11072597652673721,
0.09780983626842499,
0.05804075300693512,
0.06632594019174576,
-0.16388888657093048,
0.12005561590194702,
0.00006893343379488215,
0.07082127034664154,
-0.01233591791242361,
0.0014726483495905995,
-0.08466386049985886,
0.004197877366095781,
-0.0903686061501503,
0.012840780429542065,
-0.03902212157845497,
0.00046823700540699065,
-0.02279692143201828,
-0.04429886117577553,
-0.05395001173019409,
0.03722048178315163,
-0.10660385340452194,
-0.035357195883989334,
0.03133080527186394,
0.02680857852101326,
-0.11128820478916168,
-0.031047474592924118,
0.002136401366442442,
-0.08466892689466476,
0.0947682186961174,
0.05381849408149719,
-0.007051385007798672,
0.010299705900251865,
-0.0031958429608494043,
0.003333411179482937,
0.07069303095340729,
0.00807509757578373,
0.062125466763973236,
-0.11772271990776062,
-0.0033852620981633663,
0.003918701782822609,
0.0015556140569970012,
0.018365269526839256,
0.12068989127874374,
-0.1207766979932785,
-0.014938155189156532,
-0.021776234731078148,
-0.022403638809919357,
-0.07529860734939575,
0.05935674160718918,
0.10323891043663025,
0.04002806171774864,
0.1898861676454544,
-0.06376539915800095,
0.014657828956842422,
-0.20644508302211761,
-0.0007039525080472231,
0.008912970311939716,
-0.15322327613830566,
-0.05396716296672821,
-0.026020385324954987,
0.05872125178575516,
-0.07323415577411652,
0.11428996920585632,
0.005394491832703352,
-0.014936160296201706,
0.03706309571862221,
-0.04498806968331337,
-0.019579971209168434,
0.009365580976009369,
0.16106203198432922,
0.014660797081887722,
-0.037143524736166,
0.12152601033449173,
0.0238568764179945,
0.10519958287477493,
0.12467579543590546,
0.1681404858827591,
0.12871408462524414,
0.027525760233402252,
0.10744131356477737,
0.03219867870211601,
-0.03000696748495102,
-0.1889362335205078,
0.0689808651804924,
-0.046327803283929825,
0.1342916637659073,
0.005838233977556229,
0.1764954775571823,
0.11104877293109894,
-0.15142668783664703,
0.05000590160489082,
-0.02190127596259117,
-0.09401895105838776,
-0.10003934055566788,
-0.09683806449174881,
-0.07801129668951035,
-0.1528715342283249,
-0.012698844075202942,
-0.11861390620470047,
-0.00156453310046345,
0.0643860250711441,
0.004968239460140467,
-0.022064432501792908,
0.13960690796375275,
0.025364646688103676,
-0.0021634565200656652,
0.08261991292238235,
-0.003702986752614379,
-0.05754280835390091,
-0.0716606080532074,
-0.07397709786891937,
0.014353367500007153,
0.01204715110361576,
0.05748964473605156,
-0.02806028723716736,
0.008202285505831242,
0.05155220627784729,
-0.025701100006699562,
-0.10751278698444366,
0.009841586463153362,
0.021482164040207863,
0.05424986407160759,
0.020782899111509323,
0.013040878809988499,
-0.012086511589586735,
-0.017886120826005936,
0.1773153394460678,
-0.06466637551784515,
-0.021009275689721107,
-0.10658837109804153,
0.19975638389587402,
0.03737352043390274,
-0.03866131603717804,
0.04084712266921997,
-0.06903629750013351,
-0.019598117098212242,
0.1893388032913208,
0.20313337445259094,
-0.02429080381989479,
0.0008256763103418052,
-0.0044187759049236774,
-0.01814758963882923,
-0.007503405679017305,
0.0820104330778122,
0.1334698349237442,
-0.0071779401041567326,
-0.06694548577070236,
-0.025810882449150085,
-0.06658927351236343,
-0.004836720880120993,
-0.0482044480741024,
0.066391222178936,
0.027743671089410782,
0.010540149174630642,
-0.046184465289115906,
0.022228645160794258,
-0.04199829697608948,
-0.07882942259311676,
0.03226859122514725,
-0.20561571419239044,
-0.16009460389614105,
-0.009006503038108349,
0.04061219468712807,
0.001191785093396902,
0.06098309904336929,
-0.0008008016739040613,
0.011895515024662018,
0.09127578884363174,
-0.022390378639101982,
-0.08587329834699631,
-0.08225788921117783,
0.08901824802160263,
-0.1791907101869583,
0.20023761689662933,
-0.0428975448012352,
0.048247937113046646,
0.13208943605422974,
0.04311031848192215,
-0.10308555513620377,
0.03702815622091293,
0.034668512642383575,
-0.009859585203230381,
-0.0046586100943386555,
0.10009314119815826,
-0.011757995933294296,
0.07440435141324997,
0.03818235918879509,
-0.09599507600069046,
-0.039281949400901794,
-0.0724983662366867,
-0.02112523838877678,
-0.04013347998261452,
-0.051686834543943405,
-0.05065472051501274,
0.10975289344787598,
0.189988374710083,
-0.04916232079267502,
-0.017519403249025345,
-0.06736118346452713,
0.00519917719066143,
0.07361429184675217,
-0.009850573725998402,
-0.04280546307563782,
-0.25230857729911804,
0.011408863589167595,
0.047809310257434845,
-0.016395779326558113,
-0.22591710090637207,
-0.10197950899600983,
-0.005344944540411234,
-0.06733774393796921,
-0.07399795949459076,
0.07431349903345108,
0.04495703801512718,
0.05470620095729828,
-0.0674663856625557,
0.023926597088575363,
-0.09551819413900375,
0.15745952725410461,
-0.15249231457710266,
-0.07582691311836243
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-cola-copy
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6243
- Matthews Correlation: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.6195 | 1.0 | 2138 | 0.6527 | 0.0 |
| 0.6168 | 2.0 | 4276 | 0.6259 | 0.0 |
| 0.616 | 3.0 | 6414 | 0.6243 | 0.0 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
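### Note on the metric
The Matthews correlation of 0.0 reported above means the predictions are uncorrelated with the gold labels, which typically points to a classifier that has collapsed onto a single class. A small illustration with hypothetical labels follows, using scikit-learn in place of the `datasets` metric that the training script actually computes.
```python
# Illustration of a 0.0 Matthews correlation with made-up CoLA-style labels.
from sklearn.metrics import matthews_corrcoef

gold        = [1, 1, 0, 1, 0, 1]  # hypothetical acceptability labels
predictions = [1, 1, 1, 1, 1, 1]  # a degenerate model that always predicts "acceptable"

print(matthews_corrcoef(gold, predictions))  # 0.0, matching the result in the card
```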
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "fnet-large-finetuned-cola-copy", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE COLA", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.0, "name": "Matthews Correlation"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-cola-copy
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-cola-copy
==============================
This model is a fine-tuned version of google/fnet-large on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6243
* Matthews Correlation: 0.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09392090886831284,
0.086028553545475,
-0.002810414880514145,
0.12563712894916534,
0.15835294127464294,
0.02616388164460659,
0.12096844613552094,
0.12440948188304901,
-0.09253979474306107,
0.01095916610211134,
0.11961711943149567,
0.16139951348304749,
0.017056036740541458,
0.12809212505817413,
-0.055080391466617584,
-0.25876080989837646,
-0.008076829835772514,
0.051302410662174225,
-0.07248687744140625,
0.13545523583889008,
0.10696391016244888,
-0.10981715470552444,
0.08594673126935959,
0.012483679689466953,
-0.188598170876503,
0.0053088548593223095,
0.006561026908457279,
-0.06197373941540718,
0.14454741775989532,
0.02627757005393505,
0.11637239903211594,
-0.004603952635079622,
0.07419975847005844,
-0.1947721242904663,
0.010915560647845268,
0.06143670529127121,
-0.0008093014475889504,
0.0922541469335556,
0.04200132563710213,
0.016881262883543968,
0.12951737642288208,
-0.07753071933984756,
0.047175586223602295,
0.024195468053221703,
-0.10057501494884491,
-0.2488769292831421,
-0.06924816966056824,
0.04293599724769592,
0.0703204795718193,
0.114735446870327,
-0.0025232243351638317,
0.12956556677818298,
-0.07075148820877075,
0.09971623867750168,
0.2038886547088623,
-0.28335192799568176,
-0.06467258930206299,
0.053865209221839905,
0.00007339009607676417,
0.05563786253333092,
-0.09489043802022934,
-0.03489343822002411,
0.0464896559715271,
0.05855744704604149,
0.13576450943946838,
-0.024170638993382454,
-0.11294703930616379,
-0.00035783895873464644,
-0.14739225804805756,
-0.03345825895667076,
0.18343965709209442,
0.04216032102704048,
-0.031146908178925514,
-0.0553949810564518,
-0.07025839388370514,
-0.13019998371601105,
-0.03334832936525345,
-0.016549253836274147,
0.042229484766721725,
-0.017025606706738472,
-0.02913014031946659,
-0.02009456790983677,
-0.10942849516868591,
-0.06127852201461792,
-0.07925374060869217,
0.12757611274719238,
0.03740452602505684,
0.026067685335874557,
-0.036684900522232056,
0.1074313297867775,
-0.008039964362978935,
-0.12542767822742462,
0.02348942495882511,
0.014696755446493626,
0.02391907386481762,
-0.029253896325826645,
-0.05762016028165817,
-0.07737185806035995,
0.013022209517657757,
0.11649444699287415,
-0.06627753376960754,
0.045372430235147476,
0.03719029203057289,
0.052638933062553406,
-0.08948004990816116,
0.17119163274765015,
-0.03414352610707283,
-0.020914733409881592,
0.017712917178869247,
0.04670121893286705,
0.026985853910446167,
-0.018883170560002327,
-0.12442363798618317,
0.01235498022288084,
0.09396601468324661,
0.002359611913561821,
-0.05978254973888397,
0.08112926036119461,
-0.03828165680170059,
-0.03376328945159912,
-0.004834178369492292,
-0.09749481827020645,
0.019099729135632515,
0.006322276312857866,
-0.08288158476352692,
-0.019768934696912766,
0.03858035430312157,
0.014716193079948425,
-0.02650129422545433,
0.09978698194026947,
-0.08106034249067307,
0.02185039594769478,
-0.08661425113677979,
-0.10680398344993591,
0.017903147265315056,
-0.0847211480140686,
0.01903758943080902,
-0.10459426045417786,
-0.19062873721122742,
-0.0070509458892047405,
0.06475710868835449,
-0.01869357004761696,
-0.06680820137262344,
-0.0446598045527935,
-0.07923983782529831,
0.009182565845549107,
-0.014331415295600891,
0.1251339614391327,
-0.06613124161958694,
0.08929501473903656,
0.029013488441705704,
0.053821150213479996,
-0.056043438613414764,
0.053580932319164276,
-0.09583668410778046,
0.009967178106307983,
-0.16321894526481628,
0.04253964126110077,
-0.05672885477542877,
0.05521031841635704,
-0.07788251340389252,
-0.10970792174339294,
0.015046020038425922,
-0.008545617572963238,
0.05209499970078468,
0.0823349803686142,
-0.17201119661331177,
-0.07378672808408737,
0.1421811878681183,
-0.08395789563655853,
-0.11804282665252686,
0.12395995855331421,
-0.05760178342461586,
0.040096431970596313,
0.05442522466182709,
0.1759394258260727,
0.09057454019784927,
-0.06562479585409164,
0.006790442857891321,
0.033411525189876556,
0.04206949099898338,
-0.06806401163339615,
0.07068393379449844,
0.0191428791731596,
0.017440875992178917,
0.03589886054396629,
-0.031538426876068115,
0.06567616760730743,
-0.07883037626743317,
-0.09648585319519043,
-0.04541643336415291,
-0.0845101922750473,
0.03634423762559891,
0.07191202789545059,
0.07480383664369583,
-0.08777596801519394,
-0.07665324211120605,
0.039373867213726044,
0.08542460203170776,
-0.06039068102836609,
0.028123341500759125,
-0.05340079590678215,
0.07857871055603027,
-0.022135380655527115,
-0.012152368202805519,
-0.17895615100860596,
-0.049802783876657486,
0.01269685197621584,
-0.017068060114979744,
0.011587386019527912,
0.03449726849794388,
0.05857099965214729,
0.062182098627090454,
-0.052427344024181366,
-0.006858838256448507,
-0.036305077373981476,
-0.002348072361201048,
-0.11779595166444778,
-0.194713294506073,
-0.033400148153305054,
-0.021897103637456894,
0.16001500189304352,
-0.20776410400867462,
0.05139097571372986,
-0.009029606357216835,
0.08531299233436584,
0.003109408775344491,
-0.004246857482939959,
-0.0448564738035202,
0.05959697812795639,
-0.04208480939269066,
-0.048716846853494644,
0.08012443780899048,
0.02152360789477825,
-0.08858314156532288,
-0.049554068595170975,
-0.09015779942274094,
0.17343412339687347,
0.1291024088859558,
-0.1208619773387909,
-0.06354694813489914,
-0.005686518736183643,
-0.07318001985549927,
-0.032940808683633804,
-0.04457523673772812,
0.025874445214867592,
0.19627976417541504,
-0.00956712942570448,
0.1556379199028015,
-0.073637954890728,
-0.04745488986372948,
0.024473601952195168,
-0.031531598418951035,
0.015794798731803894,
0.1174120306968689,
0.1146426871418953,
-0.0827367752790451,
0.15341469645500183,
0.14470958709716797,
-0.09353531897068024,
0.14427484571933746,
-0.046857599169015884,
-0.05337967351078987,
-0.01239851489663124,
-0.025734765455126762,
-0.01986040733754635,
0.10185948759317398,
-0.1591099053621292,
0.010827534832060337,
0.03590237349271774,
0.02313799038529396,
0.03406902775168419,
-0.21511520445346832,
-0.034339725971221924,
0.03387860581278801,
-0.0499500147998333,
-0.002659587422385812,
-0.00680513633415103,
0.01076573971658945,
0.10498564690351486,
-0.0029810734558850527,
-0.10533218830823898,
0.05257336422801018,
-0.0062987240962684155,
-0.0899871438741684,
0.21725404262542725,
-0.08747546374797821,
-0.18725232779979706,
-0.11880498379468918,
-0.06220771372318268,
-0.07559692859649658,
-0.004758915398269892,
0.05759590119123459,
-0.08066269010305405,
-0.03158093988895416,
-0.0806441456079483,
0.01446909736841917,
-0.002284287940710783,
0.026104530319571495,
0.01951073668897152,
0.0022082950454205275,
0.08154694736003876,
-0.11645926535129547,
-0.015786167234182358,
-0.06321795284748077,
-0.04211984947323799,
0.04032707214355469,
0.04146743565797806,
0.0988055020570755,
0.15235096216201782,
-0.019287779927253723,
0.02282365784049034,
-0.02135869860649109,
0.24310164153575897,
-0.06365299969911575,
-0.018696533516049385,
0.1559664011001587,
-0.009423095732927322,
0.0503767766058445,
0.11493933945894241,
0.07142814248800278,
-0.07500948756933212,
0.006025242153555155,
0.03687376528978348,
-0.037410885095596313,
-0.2224632054567337,
-0.06534454971551895,
-0.05588307976722717,
0.020130153745412827,
0.07723241299390793,
0.03739004582166672,
0.04191461205482483,
0.059889692813158035,
0.04328415170311928,
0.06834448873996735,
-0.02811642736196518,
0.0510883592069149,
0.15787187218666077,
0.03660351783037186,
0.12926755845546722,
-0.03699519485235214,
-0.0581619068980217,
0.044791772961616516,
-0.015560021623969078,
0.2022491693496704,
-0.011031819507479668,
0.10647973418235779,
0.05714922025799751,
0.1732620894908905,
-0.01044081524014473,
0.07149824500083923,
-0.01537350844591856,
-0.03176766633987427,
-0.01585426926612854,
-0.03921106085181236,
-0.030264299362897873,
0.018982039764523506,
-0.0774182453751564,
0.060774918645620346,
-0.1135101318359375,
0.01231792476028204,
0.05960399657487869,
0.24015723168849945,
0.03204027935862541,
-0.3194168210029602,
-0.09705868363380432,
-0.004413720685988665,
-0.03755950182676315,
-0.021074343472719193,
0.033035874366760254,
0.0998033806681633,
-0.10339818149805069,
0.04004545137286186,
-0.07676628977060318,
0.09240566939115524,
-0.04698125272989273,
0.054886672645807266,
0.09951405227184296,
0.09603190422058105,
0.020772648975253105,
0.09640488028526306,
-0.2850910425186157,
0.25600531697273254,
-0.0014360295608639717,
0.05562611296772957,
-0.07749780267477036,
0.012023663148283958,
0.038247883319854736,
0.07425936311483383,
0.07044059038162231,
-0.010851399041712284,
-0.02008097432553768,
-0.1768847107887268,
-0.06550540030002594,
0.03150675445795059,
0.06631500273942947,
-0.02683223783969879,
0.07452084869146347,
-0.02882814034819603,
0.01022765226662159,
0.07610820233821869,
0.013070657849311829,
-0.05542498826980591,
-0.10068942606449127,
0.0006295169587247074,
0.052223872393369675,
-0.06668298691511154,
-0.06467423588037491,
-0.11798346787691116,
-0.1204783171415329,
0.1616269052028656,
-0.020681016147136688,
-0.052140481770038605,
-0.11009234935045242,
0.08729247003793716,
0.058916110545396805,
-0.08323338627815247,
0.04165558144450188,
-0.002712355460971594,
0.0844990462064743,
0.02244596555829048,
-0.08383654803037643,
0.09725132584571838,
-0.0815984308719635,
-0.15663672983646393,
-0.05686290189623833,
0.1098857969045639,
0.03203533589839935,
0.0587601438164711,
-0.010134214535355568,
0.0003890396619681269,
-0.04276092350482941,
-0.0972023457288742,
0.012183141894638538,
-0.009019515477120876,
0.07961194217205048,
0.023807894438505173,
-0.05628948658704758,
0.01429247297346592,
-0.06137530878186226,
-0.025577371940016747,
0.2017984837293625,
0.2101432830095291,
-0.10181240737438202,
0.019432980567216873,
0.02815667726099491,
-0.06017622724175453,
-0.19818763434886932,
0.025082634761929512,
0.06268273293972015,
-0.007134377025067806,
0.029622653499245644,
-0.17602014541625977,
0.1417960673570633,
0.09607643634080887,
-0.019160397350788116,
0.11625179648399353,
-0.3190588355064392,
-0.11957462131977081,
0.13578015565872192,
0.13038641214370728,
0.1082649901509285,
-0.1275387704372406,
-0.017696116119623184,
-0.017272336408495903,
-0.12869010865688324,
0.13378626108169556,
-0.11559057980775833,
0.11307959258556366,
-0.033666882663965225,
0.08151817321777344,
0.0013376958668231964,
-0.05681246146559715,
0.11611095815896988,
0.02491697669029236,
0.09116630256175995,
-0.062243860214948654,
-0.00852359551936388,
0.034516509622335434,
-0.03262631967663765,
0.04529916122555733,
-0.09698519855737686,
0.03298477455973625,
-0.10906574875116348,
-0.022121058776974678,
-0.07863457500934601,
0.05038856714963913,
-0.03678201884031296,
-0.07253073155879974,
-0.04535432532429695,
0.024469682946801186,
0.058498598635196686,
-0.007506924215704203,
0.13079454004764557,
0.037502311170101166,
0.1487056463956833,
0.14970152080059052,
0.06433167308568954,
-0.08174005895853043,
-0.0913252905011177,
-0.023997897282242775,
-0.008367595262825489,
0.05552287399768829,
-0.1339290738105774,
0.025475962087512016,
0.14876805245876312,
0.009036448784172535,
0.1429215669631958,
0.0885363295674324,
-0.024754323065280914,
-0.01073932833969593,
0.05416949465870857,
-0.1737722009420395,
-0.09068270772695541,
-0.022626888006925583,
-0.06832322478294373,
-0.11296725273132324,
0.05743652954697609,
0.10428833216428757,
-0.08115601539611816,
0.002589004347100854,
-0.0032136866357177496,
0.013669951818883419,
-0.057341303676366806,
0.1853988766670227,
0.069394551217556,
0.04892005771398544,
-0.1013467088341713,
0.08452849835157394,
0.04651455581188202,
-0.058266013860702515,
0.004656137898564339,
0.0800386294722557,
-0.0980244129896164,
-0.053028371185064316,
0.05821079760789871,
0.17423874139785767,
-0.0555633120238781,
-0.05913792923092842,
-0.1475023776292801,
-0.13321906328201294,
0.07928811013698578,
0.1311376690864563,
0.11735476553440094,
0.007435607723891735,
-0.07321388274431229,
0.0015353669878095388,
-0.11065926402807236,
0.10537150502204895,
0.0464034266769886,
0.054856207221746445,
-0.15510469675064087,
0.13737551867961884,
0.015811048448085785,
0.06363701820373535,
-0.025411484763026237,
0.022845277562737465,
-0.08434955775737762,
0.008706839755177498,
-0.10348886996507645,
-0.0216924250125885,
-0.016252512112259865,
0.009890284389257431,
-0.0053891371935606,
-0.048445265740156174,
-0.057025883346796036,
0.024999424815177917,
-0.10565505921840668,
-0.025227516889572144,
0.03321393206715584,
0.05191047862172127,
-0.11662878841161728,
-0.04512174427509308,
0.01624988205730915,
-0.06121593341231346,
0.08542851358652115,
0.05370628833770752,
0.01167554035782814,
0.0522066168487072,
-0.12207209318876266,
0.015587622299790382,
0.0742800235748291,
0.03451160714030266,
0.060766711831092834,
-0.09618820250034332,
-0.004330980591475964,
0.000620808859821409,
0.029649745672941208,
0.022910641506314278,
0.0777667984366417,
-0.13724935054779053,
-0.002710487926378846,
-0.02783370390534401,
-0.06454020738601685,
-0.06714898347854614,
0.020097440108656883,
0.08303855359554291,
0.03637102618813515,
0.20598363876342773,
-0.07561089843511581,
0.046382516622543335,
-0.2131723165512085,
0.007946843281388283,
-0.00018174469005316496,
-0.11797532439231873,
-0.09552253782749176,
-0.05074607580900192,
0.05392405018210411,
-0.06373390555381775,
0.15394343435764313,
0.04863494262099266,
0.014815868809819221,
0.02855541929602623,
-0.017866145819425583,
0.0212584026157856,
0.007505857385694981,
0.1868983954191208,
0.03341247886419296,
-0.030030297115445137,
0.07202771306037903,
0.038279302418231964,
0.11288387328386307,
0.11314141005277634,
0.19690823554992676,
0.13149374723434448,
-0.0017040353268384933,
0.09504646062850952,
0.04710931330919266,
-0.060035157948732376,
-0.16356506943702698,
0.05060194432735443,
-0.03684914484620094,
0.11947114020586014,
-0.02734866924583912,
0.1927868127822876,
0.06847987323999405,
-0.16221268475055695,
0.046570226550102234,
-0.04790783300995827,
-0.08729487657546997,
-0.11747626960277557,
-0.06214311346411705,
-0.07060721516609192,
-0.14227285981178284,
-0.004479394759982824,
-0.11955714225769043,
-0.0012408922193571925,
0.11118984967470169,
0.01135226245969534,
-0.034017469733953476,
0.14965499937534332,
0.004561042878776789,
0.015436336398124695,
0.06754950433969498,
0.006787540391087532,
-0.03938409313559532,
-0.10845154523849487,
-0.05584019422531128,
-0.02278803661465645,
-0.009399445727467537,
0.0328291691839695,
-0.06014551594853401,
-0.03446387127041817,
0.03999267518520355,
-0.022419964894652367,
-0.091934934258461,
0.004938766360282898,
0.008898281492292881,
0.05622522905468941,
0.05688990280032158,
0.007466019131243229,
0.018164541572332382,
-0.008829513564705849,
0.20546086132526398,
-0.07387928664684296,
-0.06927040219306946,
-0.09953052550554276,
0.2376701980829239,
0.03024473413825035,
-0.020175063982605934,
0.030588623136281967,
-0.06754070520401001,
-0.019916238263249397,
0.24630585312843323,
0.22834931313991547,
-0.08888491988182068,
-0.0023557317908853292,
0.022025488317012787,
-0.012935125268995762,
-0.015982933342456818,
0.10426705330610275,
0.14598897099494934,
0.06348761916160583,
-0.0856001153588295,
-0.030391816049814224,
-0.061101991683244705,
-0.0055675022304058075,
-0.0447319820523262,
0.0732647031545639,
0.0502418652176857,
0.009062495082616806,
-0.03542236611247063,
0.04821065813302994,
-0.07359969615936279,
-0.09477446228265762,
0.037203386425971985,
-0.20104920864105225,
-0.1598745882511139,
-0.01462660450488329,
0.09054913371801376,
0.0001615012442925945,
0.0589400939643383,
-0.030530771240592003,
0.004999163560569286,
0.08607053011655807,
-0.024409007281064987,
-0.09203089773654938,
-0.0751798152923584,
0.09285302460193634,
-0.1202358677983284,
0.22683995962142944,
-0.049215227365493774,
0.07103195041418076,
0.12016253918409348,
0.053361181169748306,
-0.06377849727869034,
0.06605267524719238,
0.04349633678793907,
-0.03749271482229233,
0.012211945839226246,
0.07883132994174957,
-0.02810114249587059,
0.0679730772972107,
0.042277514934539795,
-0.1358688473701477,
0.0054893274791538715,
-0.043648626655340195,
-0.06824120879173279,
-0.039956267923116684,
-0.03296131268143654,
-0.06381493806838989,
0.1260761022567749,
0.214870885014534,
-0.02928794175386429,
-0.012312759645283222,
-0.08210945874452591,
0.007474972400814295,
0.048175226897001266,
0.016146136447787285,
-0.04649002104997635,
-0.21020174026489258,
0.016797954216599464,
0.041385382413864136,
-0.022210748866200447,
-0.2261059582233429,
-0.09453583508729935,
0.006697536446154118,
-0.07942066341638565,
-0.08833836019039154,
0.06651146709918976,
0.08081953972578049,
0.043811965733766556,
-0.05939081683754921,
-0.029742924496531487,
-0.0846235603094101,
0.13799507915973663,
-0.1453099101781845,
-0.08864983171224594
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-cola-copy2
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6173
- Matthews Correlation: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0
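As a rough reproduction sketch, the values above can be expressed with the standard Hugging Face `Trainer` API roughly as follows. This is a hypothetical illustration, not the exact script used for this run; in particular the CoLA preprocessing (tokenizing the `sentence` column) and the per-epoch evaluation are assumptions.
```python
# Hypothetical sketch mirroring the hyperparameters above; not the original training script.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

raw = load_dataset("glue", "cola")
tokenizer = AutoTokenizer.from_pretrained("google/fnet-large")
model = AutoModelForSequenceClassification.from_pretrained("google/fnet-large", num_labels=2)

def tokenize(batch):
    # CoLA provides a single "sentence" column; truncate to the model's maximum length.
    return tokenizer(batch["sentence"], truncation=True)

encoded = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="fnet-large-finetuned-cola-copy2",
    learning_rate=2e-5,                # learning_rate
    per_device_train_batch_size=4,     # train_batch_size
    per_device_eval_batch_size=8,      # eval_batch_size
    seed=42,
    num_train_epochs=3.0,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,                  # lr_scheduler_warmup_ratio
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",       # assumption: evaluate once per epoch, as in the results table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,               # enables dynamic padding via the default data collator
)
trainer.train()
```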
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.6192 | 1.0 | 2138 | 0.6443 | 0.0 |
| 0.6177 | 2.0 | 4276 | 0.6296 | 0.0 |
| 0.6128 | 3.0 | 6414 | 0.6173 | 0.0 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "fnet-large-finetuned-cola-copy2", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE COLA", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.0, "name": "Matthews Correlation"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-cola-copy2
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-cola-copy2
===============================
This model is a fine-tuned version of google/fnet-large on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6173
* Matthews Correlation: 0.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
116,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.08743564039468765,
0.11767483502626419,
-0.004380743019282818,
0.10790295898914337,
0.11637401580810547,
-0.0017209806246683002,
0.14107565581798553,
0.15623614192008972,
-0.11030884087085724,
0.045144908130168915,
0.1322317272424698,
0.1547640860080719,
0.02968325838446617,
0.16563232243061066,
-0.0642363578081131,
-0.2750568687915802,
0.029009878635406494,
0.039126425981521606,
-0.04970425367355347,
0.12443815171718597,
0.09249506145715714,
-0.11462824791669846,
0.09580158442258835,
0.024957966059446335,
-0.16143371164798737,
-0.006230039056390524,
0.004855525214225054,
-0.0858822911977768,
0.1152195855975151,
0.016376622021198273,
0.0930122584104538,
0.036427825689315796,
0.06748495250940323,
-0.18962667882442474,
0.008104847744107246,
0.06899607926607132,
0.0016832760302349925,
0.09926356375217438,
0.04400775209069252,
-0.01347266137599945,
0.12143182754516602,
-0.08212513476610184,
0.07441894710063934,
0.030107466503977776,
-0.1144680306315422,
-0.26561182737350464,
-0.09636149555444717,
0.05653825029730797,
0.05900046229362488,
0.07754601538181305,
0.0026641280855983496,
0.1647229641675949,
-0.040871407836675644,
0.11897504329681396,
0.25628146529197693,
-0.3051452040672302,
-0.058123741298913956,
0.029499266296625137,
0.030015775933861732,
0.05616863816976547,
-0.0919642299413681,
-0.017476944252848625,
0.03169126436114311,
0.05059899762272835,
0.1649095118045807,
-0.015477427281439304,
-0.05941342934966087,
-0.013972870074212551,
-0.13660083711147308,
-0.05296807736158371,
0.14392103254795074,
0.01850743032991886,
-0.04449370875954628,
-0.07281343638896942,
-0.08043479174375534,
-0.18101654946804047,
-0.045286595821380615,
-0.02345702238380909,
0.054845940321683884,
-0.03331447020173073,
-0.05265158787369728,
-0.026161309331655502,
-0.07374801486730576,
-0.05033237487077713,
-0.05904562398791313,
0.16344593465328217,
0.05186481028795242,
0.02747851237654686,
-0.05085619539022446,
0.08638178557157516,
-0.031899940222501755,
-0.15726478397846222,
-0.009825552813708782,
0.0012038351269438863,
0.017871549353003502,
-0.034476056694984436,
-0.029067736119031906,
-0.09762363135814667,
0.014434999786317348,
0.165004163980484,
-0.12068454176187515,
0.07703053951263428,
-0.011698529124259949,
0.03547242283821106,
-0.07867158949375153,
0.18422773480415344,
-0.01435540895909071,
0.0006911102682352066,
0.013440216891467571,
0.06366860866546631,
0.04730484262108803,
-0.03536973521113396,
-0.11018085479736328,
0.036015670746564865,
0.09504545480012894,
0.026053430512547493,
-0.04832237586379051,
0.08004391938447952,
-0.03166641667485237,
-0.02564558945596218,
0.04027101770043373,
-0.11267008632421494,
0.03487589210271835,
0.002269777236506343,
-0.07723380625247955,
-0.037383366376161575,
0.03043130412697792,
-0.0019962373189628124,
-0.036616094410419464,
0.09981806576251984,
-0.07799959182739258,
0.018621152266860008,
-0.06932555884122849,
-0.13113370537757874,
0.021362939849495888,
-0.12848417460918427,
-0.005631242413073778,
-0.09391554445028305,
-0.1546381264925003,
-0.003077788045629859,
0.0627007856965065,
-0.04273265600204468,
-0.04208226874470711,
-0.04191119223833084,
-0.10450567305088043,
0.03276997432112694,
-0.010382180102169514,
0.06282049417495728,
-0.07663562148809433,
0.07579797506332397,
0.02918342687189579,
0.07881637662649155,
-0.029357898980379105,
0.04273281618952751,
-0.08799806982278824,
0.034078799188137054,
-0.22850193083286285,
0.051566559821367264,
-0.07510628551244736,
0.056090325117111206,
-0.09116026014089584,
-0.11088556051254272,
0.03380034118890762,
-0.019948432222008705,
0.07322657108306885,
0.09770583361387253,
-0.17262236773967743,
-0.0733959823846817,
0.175649031996727,
-0.10367991030216217,
-0.10880356281995773,
0.11336462944746017,
-0.047773491591215134,
0.009513420052826405,
0.05242976173758507,
0.20886559784412384,
0.0788106843829155,
-0.10590480268001556,
-0.016266174614429474,
-0.016955308616161346,
0.024396853521466255,
-0.030956219881772995,
0.05398307368159294,
0.015918901190161705,
0.07925659418106079,
0.021008586511015892,
0.0045506772585213184,
0.020594296976923943,
-0.08010459691286087,
-0.08587813377380371,
-0.053115326911211014,
-0.0664597600698471,
0.03868493810296059,
0.04499084874987602,
0.06367260217666626,
-0.11381977796554565,
-0.10220933705568314,
0.04708987474441528,
0.0733921006321907,
-0.06596197187900543,
0.04993506148457527,
-0.09810507297515869,
0.09512561559677124,
-0.022724593058228493,
-0.004561804234981537,
-0.18216335773468018,
-0.03642883524298668,
0.036716412752866745,
-0.030655095353722572,
0.013279831036925316,
-0.0196819007396698,
0.07302342355251312,
0.0695304349064827,
-0.04743723198771477,
-0.01539102103561163,
-0.028644513338804245,
0.005288679152727127,
-0.11034621298313141,
-0.2167976349592209,
-0.03551042824983597,
-0.04038777947425842,
0.0713241919875145,
-0.16449634730815887,
0.05854778364300728,
0.06711536645889282,
0.115606389939785,
0.040618471801280975,
-0.017517367377877235,
-0.016571570187807083,
0.05168769881129265,
-0.04745340347290039,
-0.07030058652162552,
0.05856065824627876,
0.017696740105748177,
-0.07770188897848129,
-0.007060549221932888,
-0.13083787262439728,
0.16187439858913422,
0.13014593720436096,
-0.0276279766112566,
-0.07028574496507645,
0.000060931513871764764,
-0.06569372117519379,
-0.025033382698893547,
-0.030358921736478806,
0.04143421724438667,
0.18133579194545746,
0.011668011546134949,
0.16289480030536652,
-0.08842679113149643,
-0.05776217207312584,
0.04946478456258774,
-0.028119197115302086,
0.0009061484015546739,
0.10098952054977417,
0.05621805414557457,
-0.08428095281124115,
0.14286209642887115,
0.12795805931091309,
-0.062181439250707626,
0.14321480691432953,
-0.05426683649420738,
-0.04305564612150192,
-0.028468569740653038,
-0.020750680938363075,
0.008798385970294476,
0.1056412011384964,
-0.11276597529649734,
-0.009754466824233532,
0.03145276755094528,
0.03182848542928696,
0.0064464653842151165,
-0.18867407739162445,
0.001119907945394516,
0.03875439614057541,
-0.07149024307727814,
-0.02601417899131775,
-0.007211062125861645,
0.019054275006055832,
0.10553236305713654,
0.010701454244554043,
-0.0831209272146225,
0.03342108428478241,
0.003800457576289773,
-0.06248343363404274,
0.1928059607744217,
-0.10157947987318039,
-0.17510256171226501,
-0.10383410006761551,
-0.07098282873630524,
-0.08274031430482864,
-0.0027725733816623688,
0.07101235538721085,
-0.09006423503160477,
-0.04768822714686394,
-0.10060358792543411,
-0.011589167639613152,
0.013604440726339817,
0.0364745669066906,
0.031003128737211227,
-0.003605401376262307,
0.07629158347845078,
-0.12393850833177567,
-0.033006466925144196,
-0.04555657505989075,
-0.013219044543802738,
0.051824234426021576,
0.0428432859480381,
0.09225337952375412,
0.12525220215320587,
-0.03685353323817253,
0.04499607905745506,
-0.031222356483340263,
0.23048293590545654,
-0.07158584892749786,
0.002903892192989588,
0.128113254904747,
-0.009865901432931423,
0.07637453079223633,
0.12956644594669342,
0.05710770934820175,
-0.09038855880498886,
-0.007001001387834549,
0.03252368047833443,
-0.049027033150196075,
-0.22198687493801117,
-0.036967821419239044,
-0.036309003829956055,
0.029623670503497124,
0.09615738689899445,
0.047311846166849136,
0.04563319683074951,
0.04312773793935776,
0.027418291196227074,
0.043114665895700455,
-0.004948867950588465,
0.09644298255443573,
0.14630675315856934,
0.04963405057787895,
0.1411486119031906,
-0.04608524590730667,
-0.036003924906253815,
0.05184009671211243,
-0.00012650452845264226,
0.20010317862033844,
-0.01347768772393465,
0.152700275182724,
0.051011644303798676,
0.15036916732788086,
0.014462711289525032,
0.055494777858257294,
-0.015461533330380917,
-0.015071466565132141,
-0.010587992146611214,
-0.04714464768767357,
-0.03068072535097599,
0.017496181651949883,
-0.0679296925663948,
0.038216203451156616,
-0.10787872970104218,
0.04574847221374512,
0.07012366503477097,
0.2988797128200531,
0.01976679638028145,
-0.342637300491333,
-0.09540979564189911,
-0.00333083001896739,
-0.050031568855047226,
-0.029893148690462112,
0.028331151232123375,
0.10448899120092392,
-0.08754544705152512,
0.08462320268154144,
-0.08089683204889297,
0.09326338768005371,
-0.05881118029356003,
0.04752538725733757,
0.08233179897069931,
0.10037606209516525,
-0.0029365180525928736,
0.05178871750831604,
-0.2691952884197235,
0.25633808970451355,
0.022381160408258438,
0.06522882729768753,
-0.07073120772838593,
0.025103304535150528,
0.03175824135541916,
0.0817071795463562,
0.07234015315771103,
-0.015968529507517815,
-0.09858917444944382,
-0.16068382561206818,
-0.09530270099639893,
0.010131043381989002,
0.09713198989629745,
0.0025584474205970764,
0.09835126250982285,
-0.021843675523996353,
-0.004678479395806789,
0.04837504401803017,
-0.0476614311337471,
-0.0459146685898304,
-0.08822435885667801,
0.013513493351638317,
0.047443751245737076,
-0.04678517207503319,
-0.06447339802980423,
-0.10904688388109207,
-0.07790695875883102,
0.17428742349147797,
-0.016054343432188034,
-0.07228447496891022,
-0.12666581571102142,
0.036060772836208344,
0.06991785764694214,
-0.08210190385580063,
0.04261589050292969,
-0.012561564333736897,
0.11424214392900467,
0.007903912104666233,
-0.07978133112192154,
0.1087435930967331,
-0.0660335123538971,
-0.19017748534679413,
-0.045689623802900314,
0.11689799278974533,
0.024517493322491646,
0.05797130987048149,
-0.009551959112286568,
0.03663337603211403,
-0.01418417040258646,
-0.08916807174682617,
0.026957863941788673,
-0.001580505515448749,
0.08717874437570572,
-0.022031525149941444,
-0.03246252238750458,
0.02733585052192211,
-0.06076717749238014,
0.0019508993718773127,
0.17182192206382751,
0.25466442108154297,
-0.10966692864894867,
0.08250599354505539,
0.034576039761304855,
-0.059216104447841644,
-0.18285834789276123,
0.015208987519145012,
0.0551300011575222,
-0.015696072950959206,
0.014366017654538155,
-0.18076233565807343,
0.07291495054960251,
0.08906340599060059,
-0.021365761756896973,
0.09372717887163162,
-0.2955153286457062,
-0.1298106461763382,
0.10892952978610992,
0.11701589077711105,
0.06536109745502472,
-0.14358477294445038,
-0.0369223952293396,
-0.012758460827171803,
-0.12481185048818588,
0.1328306347131729,
-0.12025649100542068,
0.11335647106170654,
-0.03552975878119469,
0.06847835332155228,
0.012005412019789219,
-0.055629290640354156,
0.10917140543460846,
0.009400698356330395,
0.09398885071277618,
-0.07044553011655807,
0.007394738961011171,
0.08134625852108002,
-0.072870172560215,
0.06390547007322311,
-0.10593390464782715,
0.03421108424663544,
-0.09154702723026276,
-0.015794863924384117,
-0.07686599344015121,
0.027381381019949913,
-0.03484455123543739,
-0.053873900324106216,
-0.06007566303014755,
0.013611371628940105,
0.07931453734636307,
-0.024811390787363052,
0.19006101787090302,
0.024406960234045982,
0.15162812173366547,
0.19889359176158905,
0.08722473680973053,
-0.11281854659318924,
-0.0759444385766983,
0.011107340455055237,
-0.014975705184042454,
0.056107304990291595,
-0.1584850251674652,
0.04862050339579582,
0.13855674862861633,
0.005029216408729553,
0.12276603281497955,
0.07502426207065582,
-0.03973596543073654,
-0.0009424545569345355,
0.050991497933864594,
-0.17286932468414307,
-0.09448238462209702,
0.00866671185940504,
-0.025744685903191566,
-0.10720884799957275,
0.07239202409982681,
0.12616503238677979,
-0.07179141789674759,
-0.003281058743596077,
0.00905572809278965,
0.027756813913583755,
-0.017674142494797707,
0.1783558875322342,
0.05063018202781677,
0.06191962584853172,
-0.10890751332044601,
0.097181037068367,
0.04231150075793266,
-0.08244256675243378,
0.04746931046247482,
0.1103028729557991,
-0.10566215962171555,
-0.030644213780760765,
0.04898369684815407,
0.15719570219516754,
-0.04309460148215294,
-0.05227024480700493,
-0.17153926193714142,
-0.13277235627174377,
0.0923362597823143,
0.19108040630817413,
0.08389941602945328,
0.010714779607951641,
-0.053806472569704056,
0.00975420605391264,
-0.11768358200788498,
0.10631024837493896,
0.03567760810256004,
0.0669296532869339,
-0.1497289091348648,
0.12982189655303955,
0.005341693758964539,
0.05396641045808792,
-0.02371257171034813,
0.015574776567518711,
-0.10400150716304779,
0.0033690158743411303,
-0.12405594438314438,
0.002822094364091754,
-0.0376131571829319,
0.0062647731974720955,
-0.012780656106770039,
-0.041107624769210815,
-0.06424278020858765,
0.025815950706601143,
-0.1099792867898941,
-0.026302410289645195,
0.01953716017305851,
0.03423655405640602,
-0.1348583996295929,
-0.025646591559052467,
0.0066924444399774075,
-0.08056549727916718,
0.08922701328992844,
0.04076230525970459,
-0.01307022012770176,
0.028367338702082634,
-0.04695490375161171,
0.004752045031636953,
0.06955429166555405,
0.005184907931834459,
0.07118532806634903,
-0.11160392314195633,
-0.014185999520123005,
-0.004696986172348261,
0.01286404300481081,
0.030797922983765602,
0.10816947370767593,
-0.12542583048343658,
0.022001070901751518,
-0.009404920972883701,
-0.05724182352423668,
-0.06277833878993988,
0.04870130494236946,
0.10598216205835342,
0.01417477335780859,
0.19325293600559235,
-0.08172677457332611,
0.030653884634375572,
-0.19802013039588928,
-0.0008957354002632201,
0.015270938165485859,
-0.13821306824684143,
-0.0662035420536995,
-0.028526103124022484,
0.06196122616529465,
-0.07568599283695221,
0.14368996024131775,
0.03772798925638199,
0.0055669574066996574,
0.052137356251478195,
-0.039191119372844696,
-0.04591260477900505,
0.01754194125533104,
0.1303442120552063,
0.025656146928668022,
-0.04710141569375992,
0.08540830761194229,
0.016667934134602547,
0.09710979461669922,
0.08531499654054642,
0.2152111679315567,
0.11764682829380035,
0.03843805938959122,
0.09950791299343109,
0.04869856312870979,
-0.052275002002716064,
-0.19163690507411957,
0.0727858915925026,
-0.051565565168857574,
0.14714455604553223,
-0.016257481649518013,
0.16115324199199677,
0.10640706866979599,
-0.1558646410703659,
0.06298772990703583,
-0.028543692082166672,
-0.08802931010723114,
-0.13325811922550201,
-0.07537085562944412,
-0.0834004282951355,
-0.1481153666973114,
-0.000605526496656239,
-0.11901073157787323,
0.04053756967186928,
0.062455207109451294,
0.02047530934214592,
-0.002124475548043847,
0.135965496301651,
-0.010637509636580944,
0.019526617601513863,
0.06159181520342827,
0.009904839098453522,
-0.050273358821868896,
-0.07059496641159058,
-0.06730102747678757,
-0.008663766086101532,
0.003655151231214404,
0.03550635278224945,
-0.014562315307557583,
-0.008890142664313316,
0.04000004380941391,
-0.028742238879203796,
-0.10236569494009018,
0.018867669627070427,
0.020874742418527603,
0.055810313671827316,
0.03817141056060791,
0.012387007474899292,
-0.0034413784742355347,
-0.008348316885530949,
0.19683563709259033,
-0.07859893888235092,
-0.060213807970285416,
-0.11723842471837997,
0.24300961196422577,
0.03733837604522705,
-0.024746732786297798,
0.027223732322454453,
-0.06676674634218216,
-0.04147341847419739,
0.19833578169345856,
0.21840402483940125,
-0.0368611216545105,
0.0028061482589691877,
-0.013464203104376793,
-0.011505292728543282,
-0.0046491073444485664,
0.09005796164274216,
0.12546779215335846,
0.041602104902267456,
-0.06710746884346008,
-0.028477340936660767,
-0.05119795724749565,
-0.02009907364845276,
-0.05785543471574783,
0.08331113308668137,
0.023867251351475716,
0.005446676164865494,
-0.04139407351613045,
0.033926717936992645,
-0.06702590733766556,
-0.09344159811735153,
0.042351581156253815,
-0.21387556195259094,
-0.15279929339885712,
0.0013658956158906221,
0.07474125921726227,
-0.008601312525570393,
0.05451534315943718,
0.005481261294335127,
-0.01643436960875988,
0.09123165905475616,
-0.024054139852523804,
-0.08194858580827713,
-0.08190055191516876,
0.09696811437606812,
-0.15730032324790955,
0.22305946052074432,
-0.03784516826272011,
0.03791552782058716,
0.1262645572423935,
0.0373552106320858,
-0.0908704400062561,
0.05320734903216362,
0.04622283950448036,
-0.0644288882613182,
-0.01324300467967987,
0.11642850190401077,
-0.030010322108864784,
0.07926376909017563,
0.04380663484334946,
-0.1181393712759018,
-0.015990905463695526,
-0.0638398677110672,
-0.06024209409952164,
-0.030560800805687904,
-0.042687561362981796,
-0.04742871969938278,
0.11484210938215256,
0.21196763217449188,
-0.027035662904381752,
0.0027457471005618572,
-0.07261403650045395,
0.014278346672654152,
0.05656984820961952,
-0.022736357524991035,
-0.04638136550784111,
-0.24358758330345154,
0.0249591376632452,
0.08043544739484787,
-0.009738735854625702,
-0.2280510514974594,
-0.10346635431051254,
0.013476707972586155,
-0.04691677913069725,
-0.0958060696721077,
0.07442469149827957,
0.07281406968832016,
0.05427010729908943,
-0.06060110777616501,
-0.013884657062590122,
-0.06724289804697037,
0.15483424067497253,
-0.15512728691101074,
-0.06515880674123764
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-cola-copy3
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6554
- Matthews Correlation: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.6408 | 1.0 | 2138 | 0.7329 | 0.0 |
| 0.6589 | 2.0 | 4276 | 0.6311 | 0.0 |
| 0.6467 | 3.0 | 6414 | 0.6554 | 0.0 |
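A Matthews correlation of 0.0, as in the table above, means the predictions are uncorrelated with the labels; a constant single-class predictor is one common way this happens. The snippet below is a small, hypothetical illustration of how the metric can be computed with the GLUE metric from `datasets` (the card does not include the evaluation code).
```python
# Hypothetical illustration of the CoLA metric; not taken from the original evaluation code.
from datasets import load_metric

metric = load_metric("glue", "cola")  # CoLA is scored with Matthews correlation

# A constant predictor (always the same class) scores 0.0, matching the table above.
predictions = [1, 1, 1, 1]
references = [0, 1, 1, 0]
print(metric.compute(predictions=predictions, references=references))
# {'matthews_correlation': 0.0}
```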
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "fnet-large-finetuned-cola-copy3", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE COLA", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.0, "name": "Matthews Correlation"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-cola-copy3
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-cola-copy3
===============================
This model is a fine-tuned version of google/fnet-large on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6554
* Matthews Correlation: 0.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_ratio: 0.1
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
115,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09301087260246277,
0.1165362149477005,
-0.004076502285897732,
0.10862928628921509,
0.12303128093481064,
0.0038213501684367657,
0.133059561252594,
0.162727490067482,
-0.08903800696134567,
0.04547015205025673,
0.1245267391204834,
0.14553114771842957,
0.031472641974687576,
0.1514068841934204,
-0.0585181787610054,
-0.28182074427604675,
0.021456416696310043,
0.03362289071083069,
-0.07234123349189758,
0.1266171634197235,
0.08980587124824524,
-0.10760901868343353,
0.09748174250125885,
0.016089145094156265,
-0.13794678449630737,
-0.0002208035730291158,
0.008576872758567333,
-0.08170095086097717,
0.1244734674692154,
0.02164395898580551,
0.09479637444019318,
0.029363203793764114,
0.07145970314741135,
-0.20165635645389557,
0.008965482003986835,
0.0746086910367012,
0.0017301983898505569,
0.10139960795640945,
0.05481478199362755,
-0.005074613727629185,
0.14850464463233948,
-0.0895385816693306,
0.06666826456785202,
0.033011093735694885,
-0.10629595816135406,
-0.24018633365631104,
-0.08690046519041061,
0.049973659217357635,
0.05947898328304291,
0.09090011566877365,
0.004576602950692177,
0.136745423078537,
-0.04410702735185623,
0.12025666981935501,
0.245400533080101,
-0.29207703471183777,
-0.060538072139024734,
0.04036868363618851,
0.027088914066553116,
0.05551164969801903,
-0.10351886600255966,
-0.011454152874648571,
0.026688477024435997,
0.04898132383823395,
0.15717189013957977,
-0.020919444039463997,
-0.07235714048147202,
-0.00915495865046978,
-0.1347457617521286,
-0.051618024706840515,
0.14426466822624207,
0.02326381951570511,
-0.03625188395380974,
-0.07585056871175766,
-0.08497758209705353,
-0.17258550226688385,
-0.045322053134441376,
-0.018463578075170517,
0.04932761937379837,
-0.02334989421069622,
-0.04446636885404587,
-0.03458685427904129,
-0.08329588174819946,
-0.06098838523030281,
-0.05661727488040924,
0.13871821761131287,
0.0465538315474987,
0.026007093489170074,
-0.04247439280152321,
0.09459111839532852,
-0.011820121668279171,
-0.15537500381469727,
-0.003867188934236765,
0.0064421165734529495,
0.005736639257520437,
-0.02222018875181675,
-0.029979709535837173,
-0.07612928748130798,
0.01995982974767685,
0.1382105052471161,
-0.1113898977637291,
0.0739629715681076,
-0.00638773525133729,
0.034046102315187454,
-0.09388682246208191,
0.17717446386814117,
-0.027748797088861465,
-0.024996424093842506,
0.015303602442145348,
0.0695548728108406,
0.04389562830328941,
-0.034024275839328766,
-0.10060198605060577,
0.01912396214902401,
0.09721915423870087,
0.029909146949648857,
-0.04247087240219116,
0.07904233038425446,
-0.0286897923797369,
-0.028557918965816498,
0.033576060086488724,
-0.11084962636232376,
0.02827778458595276,
0.009111941792070866,
-0.08360594511032104,
-0.02684713341295719,
0.04165097326040268,
-0.006631703115999699,
-0.036014508455991745,
0.09757255017757416,
-0.08074722439050674,
0.015272172167897224,
-0.07042846828699112,
-0.12352913618087769,
0.022866278886795044,
-0.11513428390026093,
-0.011302131228148937,
-0.08967982977628708,
-0.17309832572937012,
-0.010134540498256683,
0.058600135147571564,
-0.03917747363448143,
-0.04330001026391983,
-0.044305406510829926,
-0.10312417149543762,
0.03176642954349518,
-0.010238244198262691,
0.08443200588226318,
-0.07368341088294983,
0.08627677708864212,
0.021047011017799377,
0.07302039116621017,
-0.03000807575881481,
0.049749478697776794,
-0.09219612181186676,
0.02706768549978733,
-0.19792009890079498,
0.053515758365392685,
-0.0743684321641922,
0.044923584908246994,
-0.09568328410387039,
-0.11124670505523682,
0.01586083695292473,
-0.014961994253098965,
0.07007089257240295,
0.10570048540830612,
-0.18091021478176117,
-0.07235747575759888,
0.1610972285270691,
-0.10606841742992401,
-0.10317611694335938,
0.10625886917114258,
-0.043385207653045654,
0.011974948458373547,
0.053321901708841324,
0.19396843016147614,
0.09838869422674179,
-0.1038198173046112,
-0.014742755331099033,
0.003399489214643836,
0.029891274869441986,
-0.036803875118494034,
0.059127628803253174,
0.00892002321779728,
0.06101040914654732,
0.025395113974809647,
-0.013226301409304142,
0.029538676142692566,
-0.07808956503868103,
-0.09155115485191345,
-0.05138816684484482,
-0.06921830773353577,
0.04099563509225845,
0.06105827912688255,
0.061463598161935806,
-0.09841370582580566,
-0.09755174815654755,
0.03208714351058006,
0.08735117316246033,
-0.0725361555814743,
0.04552670568227768,
-0.07639339566230774,
0.0832270011305809,
-0.020866915583610535,
-0.003621796378865838,
-0.18997685611248016,
-0.055525004863739014,
0.032778751105070114,
-0.029448525980114937,
0.010995768941938877,
0.00873658712953329,
0.07626175880432129,
0.06579027324914932,
-0.054681990295648575,
-0.020849859341979027,
-0.030833693221211433,
-0.0023058729711920023,
-0.10917271673679352,
-0.21463541686534882,
-0.035003792494535446,
-0.034994710236787796,
0.0826965793967247,
-0.1789235919713974,
0.05022610351443291,
0.051641810685396194,
0.10538984835147858,
0.03956175595521927,
-0.01761733926832676,
-0.013976276852190495,
0.05202408507466316,
-0.04704863205552101,
-0.06892488896846771,
0.06996322423219681,
0.011051411740481853,
-0.08268366754055023,
-0.00956596527248621,
-0.1208646297454834,
0.15572814643383026,
0.12091722339391708,
-0.038047220557928085,
-0.06566265225410461,
0.0020167205948382616,
-0.06787971407175064,
-0.03312917798757553,
-0.03694428503513336,
0.03548211604356766,
0.17580284178256989,
0.01537412591278553,
0.15890230238437653,
-0.08205173164606094,
-0.05036827549338341,
0.04060784727334976,
-0.018859751522541046,
0.0006287301075644791,
0.11048604547977448,
0.06830944120883942,
-0.0692458525300026,
0.14271607995033264,
0.126186802983284,
-0.06746665388345718,
0.1517682671546936,
-0.054350439459085464,
-0.05576968565583229,
-0.020254267379641533,
-0.012002921663224697,
0.0029917543288320303,
0.10116630792617798,
-0.12183132767677307,
0.0005661078612320125,
0.03244073688983917,
0.032410409301519394,
0.012811509892344475,
-0.1945039927959442,
-0.004458114039152861,
0.03238861635327339,
-0.08343750238418579,
-0.013246320188045502,
0.006128101143985987,
0.016819391399621964,
0.10821912437677383,
0.0029815055895596743,
-0.08947796374559402,
0.029878364875912666,
-0.0073336041532456875,
-0.06891046464443207,
0.19614504277706146,
-0.09445872157812119,
-0.18897411227226257,
-0.11676841229200363,
-0.049365974962711334,
-0.09138469398021698,
-0.005132400430738926,
0.061428558081388474,
-0.08348210901021957,
-0.04205692186951637,
-0.0981605276465416,
-0.0043557933531701565,
0.010137592442333698,
0.03371090069413185,
0.015770722180604935,
0.0023978089448064566,
0.08022714406251907,
-0.12578532099723816,
-0.027370667085051537,
-0.04878130927681923,
-0.03329567238688469,
0.03796087205410004,
0.04748694971203804,
0.09599470347166061,
0.12932871282100677,
-0.024926820769906044,
0.04015101492404938,
-0.03262714296579361,
0.22161328792572021,
-0.0692979022860527,
0.005652725230902433,
0.14537808299064636,
-0.0005227861693128943,
0.07229939848184586,
0.13583369553089142,
0.05597097799181938,
-0.08421411365270615,
-0.0022875412832945585,
0.04499639943242073,
-0.04208775982260704,
-0.22007843852043152,
-0.04906018078327179,
-0.044630017131567,
0.03474627435207367,
0.09260102361440659,
0.04972273111343384,
0.0468670018017292,
0.04310069978237152,
0.02596626989543438,
0.03792349621653557,
-0.011005611158907413,
0.08089956641197205,
0.154238760471344,
0.04566176235675812,
0.137748122215271,
-0.047883789986371994,
-0.037073541432619095,
0.054428618401288986,
-0.005118471570312977,
0.2052777260541916,
-0.023152543231844902,
0.14650943875312805,
0.0469059981405735,
0.15851250290870667,
0.008188938722014427,
0.06349058449268341,
-0.015467237681150436,
-0.017139149829745293,
-0.009776048362255096,
-0.05182099714875221,
-0.037848178297281265,
0.015844685956835747,
-0.06407443434000015,
0.04691554233431816,
-0.11572804301977158,
0.04168398678302765,
0.058167990297079086,
0.28708916902542114,
0.024693859741091728,
-0.3269064128398895,
-0.09054944664239883,
-0.005274253897368908,
-0.054038021713495255,
-0.02774706855416298,
0.031230419874191284,
0.1234068050980568,
-0.09903625398874283,
0.074419304728508,
-0.08741164952516556,
0.09102374315261841,
-0.06336691230535507,
0.04515201598405838,
0.08910715579986572,
0.09896758198738098,
0.002761317417025566,
0.06286559253931046,
-0.2694365382194519,
0.26476505398750305,
0.014298629015684128,
0.06312461942434311,
-0.073744997382164,
0.024921150878071785,
0.046683501452207565,
0.07431713491678238,
0.0851576179265976,
-0.02107306197285652,
-0.08743605762720108,
-0.1652141511440277,
-0.09507960081100464,
0.015179866924881935,
0.09324079006910324,
-0.017781730741262436,
0.0946386307477951,
-0.03535100445151329,
-0.01044371072202921,
0.05491378530859947,
-0.03976564481854439,
-0.05395326390862465,
-0.09043088555335999,
0.014567730017006397,
0.057222478091716766,
-0.04592173546552658,
-0.06761996448040009,
-0.11216691881418228,
-0.0736926794052124,
0.1596253514289856,
-0.027705367654561996,
-0.06528166681528091,
-0.12897677719593048,
0.04911534860730171,
0.08875451982021332,
-0.09280448406934738,
0.039323367178440094,
-0.01456647738814354,
0.1004888266324997,
0.022169869393110275,
-0.08691825717687607,
0.09386392682790756,
-0.07087898254394531,
-0.18739396333694458,
-0.04553757235407829,
0.1296457201242447,
0.014327678829431534,
0.06173194944858551,
-0.0162330474704504,
0.02962236851453781,
-0.013036128133535385,
-0.09226306527853012,
0.013791386038064957,
0.009783731773495674,
0.07386459410190582,
-0.0035759001038968563,
-0.05157022550702095,
0.024502268061041832,
-0.05539429187774658,
-0.0031381347216665745,
0.17841428518295288,
0.231465145945549,
-0.10657961666584015,
0.07968958467245102,
0.04000621661543846,
-0.06139352545142174,
-0.2001890242099762,
0.010623790323734283,
0.05761883780360222,
-0.012167171575129032,
0.02876053936779499,
-0.18534381687641144,
0.0797281339764595,
0.08148595690727234,
-0.020436488091945648,
0.10452727973461151,
-0.31251153349876404,
-0.12702977657318115,
0.10681530088186264,
0.12101466208696365,
0.06558044999837875,
-0.14884760975837708,
-0.03961734101176262,
-0.00498741352930665,
-0.10603221505880356,
0.13227789103984833,
-0.1013336256146431,
0.12128330767154694,
-0.03918766602873802,
0.06648944318294525,
0.013956395909190178,
-0.053794197738170624,
0.10845567286014557,
0.0036864099092781544,
0.0918404683470726,
-0.06460171937942505,
0.009138393215835094,
0.06555434316396713,
-0.0669022649526596,
0.06414902210235596,
-0.10764897614717484,
0.035610344260931015,
-0.09198043495416641,
-0.015211021527647972,
-0.08733615279197693,
0.03625597432255745,
-0.03399040922522545,
-0.05753367021679878,
-0.054607097059488297,
0.022583764046430588,
0.07598743587732315,
-0.02066180296242237,
0.17783884704113007,
0.0187422763556242,
0.14843983948230743,
0.18222279846668243,
0.08284533768892288,
-0.07899879664182663,
-0.0904536247253418,
0.0004248759360052645,
-0.007672686129808426,
0.05607650429010391,
-0.15517133474349976,
0.042137324810028076,
0.14312756061553955,
0.009597654454410076,
0.12801554799079895,
0.07288850098848343,
-0.03418498858809471,
-0.007852617651224136,
0.05769909918308258,
-0.17301420867443085,
-0.09895871579647064,
-0.006843569688498974,
-0.05036896839737892,
-0.11847575008869171,
0.06025421619415283,
0.1261739283800125,
-0.07251628488302231,
0.007090608589351177,
0.006644528359174728,
0.02820798009634018,
-0.020634211599826813,
0.18873071670532227,
0.05434674024581909,
0.05485713109374046,
-0.10921553522348404,
0.10428424179553986,
0.03760535269975662,
-0.08981305360794067,
0.03691302239894867,
0.11737876385450363,
-0.1003483310341835,
-0.03729872778058052,
0.04361711069941521,
0.1618368923664093,
-0.038482312113046646,
-0.05578150972723961,
-0.1631612777709961,
-0.12322352081537247,
0.08654439449310303,
0.1776610165834427,
0.08818362653255463,
0.01900242082774639,
-0.0637180283665657,
0.007902042008936405,
-0.11326652765274048,
0.11374318599700928,
0.04239095002412796,
0.05558184161782265,
-0.15815868973731995,
0.12914922833442688,
0.009016728028655052,
0.054063402116298676,
-0.02402072586119175,
0.014459016732871532,
-0.09348476678133011,
-0.0031451350077986717,
-0.1079489216208458,
-0.0015086199855431914,
-0.03829653188586235,
0.002601129002869129,
-0.00904860533773899,
-0.047747038304805756,
-0.06414870917797089,
0.024682415649294853,
-0.10590635985136032,
-0.03197995945811272,
0.0101675596088171,
0.045166272670030594,
-0.13941380381584167,
-0.030626695603132248,
0.010812356136739254,
-0.07621261477470398,
0.08807840198278427,
0.03688540682196617,
-0.004836583975702524,
0.03821803256869316,
-0.07511642575263977,
0.011295150965452194,
0.06585656851530075,
0.007272431626915932,
0.0613417848944664,
-0.10806713998317719,
-0.005510325077921152,
-0.009588740766048431,
0.02046225033700466,
0.026465436443686485,
0.09402235597372055,
-0.13352443277835846,
0.01617470197379589,
-0.021148741245269775,
-0.0548095777630806,
-0.06339650601148605,
0.047102585434913635,
0.10433726757764816,
0.025778163224458694,
0.19815129041671753,
-0.08556283265352249,
0.0433897040784359,
-0.20769265294075012,
0.004709172528237104,
0.01001870259642601,
-0.12851005792617798,
-0.07556268572807312,
-0.036626093089580536,
0.06563806533813477,
-0.06871055066585541,
0.12149614095687866,
0.038290951400995255,
0.02894551306962967,
0.04379528760910034,
-0.03612064942717552,
-0.024062804877758026,
0.011028913781046867,
0.14903979003429413,
0.03287387639284134,
-0.040237732231616974,
0.09055115282535553,
0.01993541046977043,
0.09530000388622284,
0.09979141503572464,
0.22085313498973846,
0.12145489454269409,
0.020626403391361237,
0.10628154873847961,
0.047016989439725876,
-0.06010902673006058,
-0.18080657720565796,
0.0708027109503746,
-0.047452714294195175,
0.13811959326267242,
-0.024820968508720398,
0.1784369796514511,
0.09122668206691742,
-0.16611073911190033,
0.06690306961536407,
-0.022757867351174355,
-0.08500196039676666,
-0.13791562616825104,
-0.08273690193891525,
-0.07956396043300629,
-0.14454013109207153,
-0.0033422342967242002,
-0.1200585663318634,
0.04128372669219971,
0.07110844552516937,
0.017614014446735382,
-0.009005101397633553,
0.14653435349464417,
-0.019336193799972534,
0.011947330087423325,
0.0685882717370987,
0.018388396129012108,
-0.04008064419031143,
-0.07323119044303894,
-0.06439969688653946,
-0.014392969198524952,
-0.0038314620032906532,
0.029615601524710655,
-0.02261669933795929,
-0.020462041720747948,
0.044476717710494995,
-0.03049851581454277,
-0.09952498227357864,
0.01507981400936842,
0.016851069405674934,
0.05958835035562515,
0.07025661319494247,
0.02092687599360943,
-0.000888484064489603,
-0.009136375039815903,
0.21769966185092926,
-0.08549628406763077,
-0.058136530220508575,
-0.11641460657119751,
0.24190326035022736,
0.03659062832593918,
-0.02686716988682747,
0.026469113305211067,
-0.06655113399028778,
-0.040085259824991226,
0.22303873300552368,
0.21314527094364166,
-0.04958152398467064,
0.0027837019879370928,
-0.0022047082893550396,
-0.010450792498886585,
-0.004827470518648624,
0.09653538465499878,
0.11814805120229721,
0.06925492733716965,
-0.07028088718652725,
-0.03366682678461075,
-0.04435024410486221,
-0.020041421055793762,
-0.059444304555654526,
0.0820770412683487,
0.022323649376630783,
0.0038124455604702234,
-0.03562047332525253,
0.03955330327153206,
-0.066569983959198,
-0.10688034445047379,
0.0423816442489624,
-0.2182750254869461,
-0.15058113634586334,
-0.009379025548696518,
0.07961048930883408,
0.0019141167867928743,
0.051636066287755966,
-0.0015093758702278137,
-0.007510171737521887,
0.09535683691501617,
-0.023592805489897728,
-0.08992034941911697,
-0.08457884192466736,
0.09030694514513016,
-0.1335328072309494,
0.22268985211849213,
-0.04042007401585579,
0.03303264081478119,
0.12072031199932098,
0.042135175317525864,
-0.09328026324510574,
0.05718506500124931,
0.051313236355781555,
-0.07408850640058517,
-0.004674179013818502,
0.11927413940429688,
-0.02943282574415207,
0.08002986758947372,
0.04161539301276207,
-0.11472512781620026,
-0.0097417701035738,
-0.055857568979263306,
-0.062230054289102554,
-0.03230633959174156,
-0.036999087780714035,
-0.04953013360500336,
0.12551414966583252,
0.213893860578537,
-0.025185037404298782,
0.001172514632344246,
-0.0778343677520752,
0.009375516325235367,
0.04866311326622963,
-0.005769393872469664,
-0.04183465242385864,
-0.22653183341026306,
0.021061578765511513,
0.06180494278669357,
-0.012610889039933681,
-0.2276887446641922,
-0.10827122628688812,
0.02031170204281807,
-0.05058624967932701,
-0.10174305737018585,
0.07710724323987961,
0.06247563660144806,
0.04692690074443817,
-0.04672664776444435,
-0.022427434101700783,
-0.06062297150492668,
0.15291625261306763,
-0.16649438440799713,
-0.06932090222835541
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-cola-copy4
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6500
- Matthews Correlation: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the polynomial schedule follows the list):
- learning_rate: 4e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- num_epochs: 3.0
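The sketch below shows, hypothetically, how the polynomial schedule named above can be built with the `get_polynomial_decay_schedule_with_warmup` helper from `transformers`; the actual run configured it through the `Trainer` (`lr_scheduler_type="polynomial"`). The zero warmup and the total step count are read off this card (no warmup ratio is listed, and the results table below ends at step 6414), so treat them as assumptions.
```python
# Hypothetical sketch of the "polynomial" learning-rate schedule listed above;
# the actual run set it via the Trainer's `lr_scheduler_type` argument.
import torch
from transformers import get_polynomial_decay_schedule_with_warmup

params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.Adam(params, lr=4e-5, betas=(0.9, 0.999), eps=1e-8)

num_training_steps = 6414  # 3 epochs x 2138 steps per epoch, as in the results table below
scheduler = get_polynomial_decay_schedule_with_warmup(
    optimizer,
    num_warmup_steps=0,            # assumption: no warmup ratio is listed for this run
    num_training_steps=num_training_steps,
    power=1.0,                     # the Trainer's default power for "polynomial"
)

for _ in range(3):                 # with power=1.0 the learning rate decays linearly
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())
```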
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.6345 | 1.0 | 2138 | 0.6611 | 0.0 |
| 0.6359 | 2.0 | 4276 | 0.6840 | 0.0 |
| 0.6331 | 3.0 | 6414 | 0.6500 | 0.0 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "fnet-large-finetuned-cola-copy4", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE COLA", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.0, "name": "Matthews Correlation"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-cola-copy4
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-cola-copy4
===============================
This model is a fine-tuned version of google/fnet-large on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6500
* Matthews Correlation: 0.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 4e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: polynomial
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: polynomial\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: polynomial\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
100,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 4e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: polynomial\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09233275800943375,
0.09512945264577866,
-0.003065334865823388,
0.1247669905424118,
0.15720099210739136,
0.014819260686635971,
0.1093737781047821,
0.12509053945541382,
-0.1051739975810051,
0.01790531352162361,
0.12988686561584473,
0.1646781712770462,
0.01671697199344635,
0.1358242630958557,
-0.06855105608701706,
-0.25993701815605164,
-0.004856702871620655,
0.05779449641704559,
-0.058361832052469254,
0.13131386041641235,
0.11140216141939163,
-0.11574908345937729,
0.09287762641906738,
0.01650150865316391,
-0.1979927122592926,
0.0038727051578462124,
0.007748198229819536,
-0.0682230070233345,
0.1342443972826004,
0.02915475144982338,
0.11438985168933868,
-0.003759731538593769,
0.06577876210212708,
-0.18901580572128296,
0.008525706827640533,
0.05932474136352539,
0.0005246849032118917,
0.09454575926065445,
0.04154572635889053,
0.0160653218626976,
0.13911603391170502,
-0.08459139615297318,
0.04823097959160805,
0.018408600240945816,
-0.10388068109750748,
-0.25746238231658936,
-0.07827635854482651,
0.048235949128866196,
0.0693022608757019,
0.10835829377174377,
-0.003936451859772205,
0.14568111300468445,
-0.06244703382253647,
0.1065765917301178,
0.2077585905790329,
-0.29930374026298523,
-0.06187231093645096,
0.03686806187033653,
0.003206562716513872,
0.04481332749128342,
-0.08877745270729065,
-0.03627109155058861,
0.04832904040813446,
0.060771457850933075,
0.14348381757736206,
-0.030308183282613754,
-0.10318541526794434,
-0.0020865588448941708,
-0.14790591597557068,
-0.03919351473450661,
0.1780187040567398,
0.040289994329214096,
-0.02904658578336239,
-0.06276993453502655,
-0.06828074902296066,
-0.12733641266822815,
-0.04040741175413132,
-0.018814735114574432,
0.043984729796648026,
-0.02478707768023014,
-0.028355807065963745,
-0.020001152530312538,
-0.10822728276252747,
-0.04445340484380722,
-0.08476538956165314,
0.115586057305336,
0.03922015801072121,
0.029852185398340225,
-0.04210149496793747,
0.09404022246599197,
-0.024852003902196884,
-0.12719585001468658,
0.018784616142511368,
0.010985801927745342,
0.03145045414566994,
-0.03605981916189194,
-0.05844790115952492,
-0.07217251509428024,
0.01470858883112669,
0.12766627967357635,
-0.08736921846866608,
0.04928255081176758,
0.03669619932770729,
0.045623622834682465,
-0.07648246735334396,
0.17523980140686035,
-0.04114501178264618,
-0.014168416149914265,
0.02477232739329338,
0.034544818103313446,
0.028209369629621506,
-0.01585005410015583,
-0.12699714303016663,
0.015706531703472137,
0.10109711438417435,
-0.0017797715263441205,
-0.06115017086267471,
0.08852759748697281,
-0.032349616289138794,
-0.030706737190485,
0.012949003838002682,
-0.09854428470134735,
0.026816805824637413,
0.0012309937737882137,
-0.08246472477912903,
-0.022865192964673042,
0.03647700697183609,
0.013990442268550396,
-0.04176129400730133,
0.0968250036239624,
-0.07641268521547318,
0.02463344670832157,
-0.08395158499479294,
-0.11545702069997787,
0.021551189944148064,
-0.08830283582210541,
0.02501392737030983,
-0.10790706425905228,
-0.17335820198059082,
-0.006669905502349138,
0.062387850135564804,
-0.024137940257787704,
-0.06518234312534332,
-0.04265695437788963,
-0.0865042582154274,
0.009156285785138607,
-0.017632286995649338,
0.12698838114738464,
-0.0660666972398758,
0.08387266099452972,
0.03743436560034752,
0.05658388510346413,
-0.06848257035017014,
0.050200145691633224,
-0.09195152670145035,
0.013190405443310738,
-0.1792236864566803,
0.04520026594400406,
-0.05468466132879257,
0.058493778109550476,
-0.079399473965168,
-0.11424562335014343,
0.02771143801510334,
-0.008864525705575943,
0.05954881012439728,
0.07509133219718933,
-0.14773103594779968,
-0.06853806227445602,
0.14635972678661346,
-0.07936394214630127,
-0.12363702803850174,
0.12627044320106506,
-0.0633881539106369,
0.03210955113172531,
0.05946922302246094,
0.1840452402830124,
0.08191072940826416,
-0.07619595527648926,
-0.00024150576791726053,
0.035252466797828674,
0.03499715402722359,
-0.07122062146663666,
0.05987473949790001,
0.024856816977262497,
0.034891482442617416,
0.037574294954538345,
-0.02263655513525009,
0.061248525977134705,
-0.07868243753910065,
-0.08805044740438461,
-0.04709918797016144,
-0.08671076595783234,
0.025309136137366295,
0.0700942724943161,
0.083065465092659,
-0.09565918147563934,
-0.07823575288057327,
0.050391800701618195,
0.0746447965502739,
-0.06368174403905869,
0.03222660720348358,
-0.06352562457323074,
0.09136843681335449,
-0.025265958160161972,
-0.014246711507439613,
-0.18371163308620453,
-0.032627083361148834,
0.014205286279320717,
0.0016399379819631577,
0.006114912685006857,
0.01740933023393154,
0.059542153030633926,
0.055963389575481415,
-0.055016353726387024,
0.0023560954723507166,
-0.012001898139715195,
0.004328112583607435,
-0.12324753403663635,
-0.18478065729141235,
-0.0336662158370018,
-0.024640899151563644,
0.14523087441921234,
-0.20531970262527466,
0.04802195727825165,
0.007260829675942659,
0.09133994579315186,
0.002927270019426942,
0.0007110468577593565,
-0.04424472153186798,
0.059639547020196915,
-0.040603142231702805,
-0.047593239694833755,
0.08031855523586273,
0.02992701530456543,
-0.07765638828277588,
-0.04063931852579117,
-0.10103578120470047,
0.17500773072242737,
0.1376151591539383,
-0.12657193839550018,
-0.06505127251148224,
-0.012219647876918316,
-0.06840192526578903,
-0.027870384976267815,
-0.03493181988596916,
0.028798134997487068,
0.19706763327121735,
-0.01458041463047266,
0.16647209227085114,
-0.08209680765867233,
-0.05784948170185089,
0.032615628093481064,
-0.03674580901861191,
0.01474632229655981,
0.1215551421046257,
0.10172944515943527,
-0.09988900274038315,
0.15515221655368805,
0.13784228265285492,
-0.08681371808052063,
0.12966054677963257,
-0.05156385526061058,
-0.05125381797552109,
-0.01754993386566639,
-0.022134356200695038,
-0.017823707312345505,
0.09421492367982864,
-0.15958133339881897,
0.003134101163595915,
0.03757501766085625,
0.023562928661704063,
0.033355507999658585,
-0.2084152102470398,
-0.022593621164560318,
0.035269055515527725,
-0.045329052954912186,
-0.00580307561904192,
-0.011826040223240852,
0.01568181999027729,
0.10432140529155731,
-0.0019289932679384947,
-0.10209478437900543,
0.05519438162446022,
-0.002293555997312069,
-0.08838749676942825,
0.21670860052108765,
-0.09239088743925095,
-0.18949273228645325,
-0.10426730662584305,
-0.075583815574646,
-0.0670001432299614,
-0.0037729968316853046,
0.057917334139347076,
-0.07278434932231903,
-0.0342622734606266,
-0.08709461987018585,
-0.002623616484925151,
-0.006207508500665426,
0.024962766095995903,
0.019852055236697197,
0.0011892204638570547,
0.07990993559360504,
-0.10965033620595932,
-0.02263079397380352,
-0.06664131581783295,
-0.028440099209547043,
0.04562980681657791,
0.03406325355172157,
0.09519621729850769,
0.14147649705410004,
-0.03038725070655346,
0.027144741266965866,
-0.027893392369151115,
0.2512238621711731,
-0.05921139195561409,
-0.016831308603286743,
0.15021313726902008,
-0.010622978210449219,
0.05199234187602997,
0.11375553160905838,
0.06421957910060883,
-0.07900237292051315,
0.0029531349427998066,
0.028535865247249603,
-0.040035001933574677,
-0.21894553303718567,
-0.06637489050626755,
-0.05583241209387779,
0.01487064827233553,
0.08067278563976288,
0.030623022466897964,
0.03454406186938286,
0.05771647393703461,
0.04508524388074875,
0.07552418857812881,
-0.026971518993377686,
0.05574258789420128,
0.1720634400844574,
0.03834931552410126,
0.13267982006072998,
-0.04087376967072487,
-0.05850344896316528,
0.04537014290690422,
-0.010815618559718132,
0.19896966218948364,
-0.017764857038855553,
0.1131005808711052,
0.05028647929430008,
0.17524059116840363,
-0.008845261298120022,
0.06706786155700684,
-0.021544305607676506,
-0.03201686590909958,
-0.012980715371668339,
-0.03330077975988388,
-0.030726173892617226,
0.017198627814650536,
-0.07334477454423904,
0.0482911616563797,
-0.11843068152666092,
0.015372583642601967,
0.058800581842660904,
0.2477571964263916,
0.03380488604307175,
-0.32424047589302063,
-0.09683819115161896,
-0.007774604018777609,
-0.044241391122341156,
-0.02778438851237297,
0.03309032693505287,
0.10417934507131577,
-0.09845829010009766,
0.047793999314308167,
-0.07568784803152084,
0.09309320896863937,
-0.04706546291708946,
0.05592344328761101,
0.09275910258293152,
0.09212326258420944,
0.02123185619711876,
0.09268396347761154,
-0.2914220690727234,
0.25891977548599243,
0.004228134173899889,
0.05892276018857956,
-0.08187481760978699,
0.014904065988957882,
0.028283890336751938,
0.06994342058897018,
0.07775736600160599,
-0.011895506642758846,
-0.044009629637002945,
-0.17269784212112427,
-0.06302323937416077,
0.027976326644420624,
0.07082512229681015,
-0.0011837704805657268,
0.07714352011680603,
-0.024835195392370224,
0.0076799169182777405,
0.07382135838270187,
-0.0016453050775453448,
-0.041140321642160416,
-0.1088331788778305,
-0.005208827089518309,
0.049144089221954346,
-0.08543877303600311,
-0.061183251440525055,
-0.10903105139732361,
-0.12609942257404327,
0.16371701657772064,
-0.018206670880317688,
-0.061221644282341,
-0.10899707674980164,
0.09358765184879303,
0.05264882370829582,
-0.08415330201387405,
0.04582873731851578,
-0.002150632906705141,
0.08258465677499771,
0.021668823435902596,
-0.0820770412683487,
0.10388047993183136,
-0.0748441070318222,
-0.15076611936092377,
-0.050239499658346176,
0.1122477799654007,
0.031452324241399765,
0.05578535050153732,
0.000581849948503077,
0.011316362768411636,
-0.043634865432977676,
-0.09546930342912674,
0.010381952859461308,
-0.016542639583349228,
0.10063199698925018,
0.012495787814259529,
-0.04976334050297737,
0.01473340205848217,
-0.057483356446027756,
-0.017827924340963364,
0.21181000769138336,
0.21221749484539032,
-0.10816640406847,
0.023791581392288208,
0.0230953898280859,
-0.062437716871500015,
-0.20183077454566956,
0.022502422332763672,
0.06667298823595047,
-0.0073587228544056416,
0.02168300561606884,
-0.16593994200229645,
0.13801442086696625,
0.09317358583211899,
-0.01634751819074154,
0.10123719274997711,
-0.2984839975833893,
-0.12198083102703094,
0.1368626058101654,
0.12732064723968506,
0.10460025817155838,
-0.13190113008022308,
-0.016833331435918808,
-0.01915566436946392,
-0.12713098526000977,
0.14362749457359314,
-0.11503395438194275,
0.10724924504756927,
-0.026324685662984848,
0.08466499298810959,
0.0038833869621157646,
-0.05768710747361183,
0.11118490993976593,
0.02210315689444542,
0.0894501581788063,
-0.06575895100831985,
-0.012948116287589073,
0.04429974779486656,
-0.03683861345052719,
0.051103368401527405,
-0.09431545436382294,
0.031746719032526016,
-0.1138581708073616,
-0.023863855749368668,
-0.08127841353416443,
0.04316647723317146,
-0.035768941044807434,
-0.06282056123018265,
-0.05115922912955284,
0.019959958270192146,
0.06679927557706833,
-0.006885247305035591,
0.1349387764930725,
0.03270858898758888,
0.14021867513656616,
0.1730620115995407,
0.08050861954689026,
-0.09682311117649078,
-0.07620519399642944,
-0.014986664988100529,
-0.01329360343515873,
0.05166109651327133,
-0.13897410035133362,
0.03210613504052162,
0.14939337968826294,
0.00653050048276782,
0.13511453568935394,
0.08364485204219818,
-0.025246642529964447,
-0.012206786312162876,
0.049290698021650314,
-0.17614205181598663,
-0.09459704905748367,
-0.01580153964459896,
-0.06113683432340622,
-0.10869400203227997,
0.05536152794957161,
0.10568933933973312,
-0.07989883422851562,
-0.003682148875668645,
-0.005677937995642424,
0.012828405946493149,
-0.05436980724334717,
0.18447230756282806,
0.07306311279535294,
0.04691896587610245,
-0.10561326891183853,
0.0796433687210083,
0.04686886444687843,
-0.06615936756134033,
0.01825185865163803,
0.08183855563402176,
-0.09784441441297531,
-0.053216081112623215,
0.05644121393561363,
0.16658157110214233,
-0.05047152563929558,
-0.06140705570578575,
-0.14625822007656097,
-0.1395581066608429,
0.08723899722099304,
0.14527234435081482,
0.1146295890212059,
0.0048127686604857445,
-0.06903456151485443,
-0.0020093952771276236,
-0.12212307006120682,
0.10608971118927002,
0.04618038982152939,
0.05523946136236191,
-0.14981551468372345,
0.1381446272134781,
0.0044164699502289295,
0.06445992738008499,
-0.02691238932311535,
0.028297582641243935,
-0.09044655412435532,
0.008752789348363876,
-0.11739658564329147,
-0.006538760848343372,
-0.018917301669716835,
0.01019852701574564,
-0.004398648161441088,
-0.04151419922709465,
-0.05624544247984886,
0.027534134685993195,
-0.10786119103431702,
-0.020445266738533974,
0.036384742707014084,
0.04365149885416031,
-0.11402267962694168,
-0.04841025173664093,
0.00970064103603363,
-0.06001601740717888,
0.08584681153297424,
0.051551930606365204,
0.004418548196554184,
0.048336461186409,
-0.11712591350078583,
0.0003040441661141813,
0.07435277849435806,
0.03268049657344818,
0.06797278672456741,
-0.10176217555999756,
-9.178542086374364e-7,
0.015004346147179604,
0.03171688690781593,
0.025623533874750137,
0.08581408858299255,
-0.13344013690948486,
-0.005912679713219404,
-0.023418787866830826,
-0.05936991795897484,
-0.06850530207157135,
0.02604781836271286,
0.08042415231466293,
0.029985297471284866,
0.20086462795734406,
-0.07570258527994156,
0.03648511320352554,
-0.21049068868160248,
0.006063299719244242,
-0.0004535551415756345,
-0.12867338955402374,
-0.08793282508850098,
-0.04498898237943649,
0.060817401856184006,
-0.056758251041173935,
0.16669900715351105,
0.05250585079193115,
0.010740572586655617,
0.03446506708860397,
-0.022191546857357025,
0.01781933754682541,
0.01640031859278679,
0.18934032320976257,
0.023245561867952347,
-0.03899821266531944,
0.06896817684173584,
0.03949577361345291,
0.11209707707166672,
0.13094016909599304,
0.19307085871696472,
0.1355232149362564,
0.0009488848736509681,
0.09569990634918213,
0.044620100408792496,
-0.048473093658685684,
-0.1554585099220276,
0.05598536506295204,
-0.04467339813709259,
0.12895017862319946,
-0.02971630170941353,
0.17981234192848206,
0.06735704839229584,
-0.1655966341495514,
0.04575532674789429,
-0.05950343236327171,
-0.09475401043891907,
-0.11735506355762482,
-0.06831412762403488,
-0.0774361714720726,
-0.13776525855064392,
-0.003766128560528159,
-0.12722247838974,
-0.00041125761345028877,
0.10296083986759186,
0.01259323488920927,
-0.03712952136993408,
0.13460665941238403,
-0.002584537724032998,
0.017914963886141777,
0.0751262903213501,
0.004339549690485001,
-0.03591177612543106,
-0.09981860220432281,
-0.05253158137202263,
-0.01963178440928459,
-0.006725162733346224,
0.03545477241277695,
-0.0516805499792099,
-0.02611083723604679,
0.040677759796381,
-0.02363325096666813,
-0.0899670198559761,
0.004122834652662277,
0.018914181739091873,
0.0635032057762146,
0.05918271094560623,
0.00532264169305563,
0.015121815726161003,
-0.009309454821050167,
0.19844715297222137,
-0.07516686618328094,
-0.0684531182050705,
-0.10911398380994797,
0.24448417127132416,
0.032753437757492065,
-0.016633272171020508,
0.03297156095504761,
-0.06548818200826645,
-0.01991979405283928,
0.24830777943134308,
0.2277710735797882,
-0.08171620965003967,
-0.0024547544308006763,
0.013445607386529446,
-0.011507655493915081,
-0.012147458270192146,
0.10888799279928207,
0.15053923428058624,
0.06152213364839554,
-0.0858483612537384,
-0.03787026181817055,
-0.05885991081595421,
-0.005216280464082956,
-0.053977303206920624,
0.072474904358387,
0.05398649349808693,
0.00975072756409645,
-0.029473964124917984,
0.042551543563604355,
-0.07279711216688156,
-0.097336046397686,
0.03348561376333237,
-0.20607492327690125,
-0.15776149928569794,
-0.0027424416039139032,
0.07768314331769943,
0.006029955577105284,
0.05825537443161011,
-0.026230378076434135,
0.01024033036082983,
0.09320371598005295,
-0.02758374810218811,
-0.09376383572816849,
-0.06231313198804855,
0.0939749926328659,
-0.1360514760017395,
0.21829846501350403,
-0.043917443603277206,
0.07810305804014206,
0.12809188663959503,
0.05089157074689865,
-0.06530290096998215,
0.060449711978435516,
0.040770523250103,
-0.03336869925260544,
0.01779806986451149,
0.07466362416744232,
-0.026095615699887276,
0.061446528881788254,
0.039817020297050476,
-0.13605281710624695,
0.0014098514802753925,
-0.05766027793288231,
-0.06557714939117432,
-0.0423487052321434,
-0.0335911400616169,
-0.0659194067120552,
0.12307722866535187,
0.21699008345603943,
-0.031558360904455185,
-0.007292434107512236,
-0.07748071104288101,
0.004243048373609781,
0.04589390754699707,
0.004072258714586496,
-0.05104261636734009,
-0.21407414972782135,
0.016236145049333572,
0.05227192863821983,
-0.021652353927493095,
-0.22077935934066772,
-0.08553380519151688,
-0.0022107886616140604,
-0.07395731657743454,
-0.09429778158664703,
0.06886898726224899,
0.08060850203037262,
0.04384591057896614,
-0.06514234095811844,
-0.012651219964027405,
-0.08636467158794403,
0.1397591382265091,
-0.1356358826160431,
-0.08479206264019012
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-cola
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6243
- Matthews Correlation: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.6195 | 1.0 | 2138 | 0.6527 | 0.0 |
| 0.6168 | 2.0 | 4276 | 0.6259 | 0.0 |
| 0.616 | 3.0 | 6414 | 0.6243 | 0.0 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
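
The card stops at the framework versions and does not include a usage snippet. As a minimal sketch (not part of the original card), the checkpoint can presumably be loaded with the generic `transformers` Auto classes for single-sentence acceptability scoring; the example sentence, the `torch.no_grad` wrapper, and the label comment are illustrative assumptions rather than documented behaviour.

```python
# Rough usage sketch (assumption, not from the original card): load the checkpoint
# with the generic Auto classes and score one sentence for linguistic acceptability.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gchhablani/fnet-large-finetuned-cola"  # repo id from this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Example sentence is an illustrative assumption.
inputs = tokenizer("The book were written by John.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class_id = int(logits.argmax(dim=-1))
print(predicted_class_id)  # index into the model's label set (CoLA: unacceptable vs. acceptable)
```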
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "fnet-large-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE COLA", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.0, "name": "Matthews Correlation"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-cola
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-cola
=========================
This model is a fine-tuned version of google/fnet-large on the GLUE COLA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6243
* Matthews Correlation: 0.0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09435097873210907,
0.08421194553375244,
-0.0027880712877959013,
0.12547148764133453,
0.15904685854911804,
0.02615882083773613,
0.12069503217935562,
0.1241118311882019,
-0.09248494356870651,
0.011560988612473011,
0.12044978886842728,
0.16161780059337616,
0.016701841726899147,
0.12821362912654877,
-0.05494244024157524,
-0.25881245732307434,
-0.007771983742713928,
0.05115014687180519,
-0.07189690321683884,
0.13537853956222534,
0.10689590126276016,
-0.1100701242685318,
0.08611386269330978,
0.012377075850963593,
-0.1874523162841797,
0.004920303355902433,
0.006109331268817186,
-0.061822667717933655,
0.14474309980869293,
0.026294700801372528,
0.11684288084506989,
-0.004879044834524393,
0.07424716651439667,
-0.19475165009498596,
0.010953853838145733,
0.061599865555763245,
-0.0006898609572090209,
0.09198741614818573,
0.04121238365769386,
0.016308827325701714,
0.1290297955274582,
-0.07783091813325882,
0.04760799556970596,
0.02466152422130108,
-0.10124235600233078,
-0.2494400143623352,
-0.06905415654182434,
0.04278074577450752,
0.07005875557661057,
0.11541929841041565,
-0.002867956180125475,
0.12875932455062866,
-0.0707789734005928,
0.10008362680673599,
0.20369721949100494,
-0.2828853130340576,
-0.0647275447845459,
0.05513303726911545,
-0.00020236789714545012,
0.05580358952283859,
-0.09518837183713913,
-0.035108208656311035,
0.047065891325473785,
0.05840061232447624,
0.13589154183864594,
-0.024001594632864,
-0.11207377165555954,
-0.0003246897831559181,
-0.14765164256095886,
-0.0339391864836216,
0.1837419718503952,
0.042182985693216324,
-0.03098602592945099,
-0.055377792567014694,
-0.0698784738779068,
-0.13246363401412964,
-0.03366117179393768,
-0.015889128670096397,
0.04205029830336571,
-0.01720787212252617,
-0.02871776558458805,
-0.019317759200930595,
-0.10914051532745361,
-0.06094752997159958,
-0.07925911992788315,
0.12776292860507965,
0.03751957416534424,
0.02563166245818138,
-0.036446068435907364,
0.10727158188819885,
-0.008296688087284565,
-0.1256551891565323,
0.023854248225688934,
0.015420990064740181,
0.023794319480657578,
-0.02945687063038349,
-0.057711854577064514,
-0.0770222619175911,
0.01329171285033226,
0.11630858480930328,
-0.06644808501005173,
0.045717719942331314,
0.037436164915561676,
0.052477072924375534,
-0.08913107961416245,
0.1716603934764862,
-0.03443245589733124,
-0.020622581243515015,
0.017747394740581512,
0.04704255610704422,
0.02674841694533825,
-0.01886206492781639,
-0.12413250654935837,
0.013097265735268593,
0.09388524293899536,
0.002535618608817458,
-0.059954434633255005,
0.08183281868696213,
-0.038194689899683,
-0.033323898911476135,
-0.004796795081347227,
-0.09764222055673599,
0.018554529175162315,
0.0063773770816624165,
-0.08251193910837173,
-0.019761210307478905,
0.03827506676316261,
0.014088655821979046,
-0.02665414661169052,
0.10024002194404602,
-0.08179781585931778,
0.021830689162015915,
-0.08678390830755234,
-0.10702545940876007,
0.017558949068188667,
-0.08509497344493866,
0.0191718228161335,
-0.10397984087467194,
-0.19083718955516815,
-0.007889867760241032,
0.06494387984275818,
-0.01861901767551899,
-0.06665554642677307,
-0.045192331075668335,
-0.07960332185029984,
0.009245702065527439,
-0.014210081659257412,
0.1256672590970993,
-0.06625761836767197,
0.08960525691509247,
0.029059095308184624,
0.05361932888627052,
-0.05511719360947609,
0.05413501709699631,
-0.0957532525062561,
0.010021492838859558,
-0.16411904990673065,
0.042935315519571304,
-0.05779934301972389,
0.056367989629507065,
-0.07770342379808426,
-0.10982010513544083,
0.0150128323584795,
-0.008366541936993599,
0.05230874568223953,
0.08241644501686096,
-0.17146196961402893,
-0.07396326959133148,
0.14204980432987213,
-0.08391902595758438,
-0.11846151947975159,
0.12392982840538025,
-0.05783189460635185,
0.03956197202205658,
0.05418945103883743,
0.1759374886751175,
0.09033717960119247,
-0.06612788885831833,
0.006188058760017157,
0.03327813372015953,
0.04193338379263878,
-0.06798793375492096,
0.07015901058912277,
0.01925581693649292,
0.01686580665409565,
0.03599608689546585,
-0.031133893877267838,
0.06564007699489594,
-0.07929158210754395,
-0.09648094326257706,
-0.045717786997556686,
-0.08445670455694199,
0.035287342965602875,
0.07255207747220993,
0.07468617707490921,
-0.08789877593517303,
-0.076981320977211,
0.037754978984594345,
0.08531701564788818,
-0.06047174707055092,
0.027737783268094063,
-0.052868492901325226,
0.07869548350572586,
-0.022249728441238403,
-0.011849423870444298,
-0.17947974801063538,
-0.05010382458567619,
0.01248926017433405,
-0.01671680435538292,
0.011573772877454758,
0.0340462364256382,
0.05847066640853882,
0.06170773133635521,
-0.05226020887494087,
-0.006742169614881277,
-0.03587859496474266,
-0.0023466311395168304,
-0.1176379919052124,
-0.1949375718832016,
-0.033513251692056656,
-0.022251050919294357,
0.15985170006752014,
-0.20768983662128448,
0.05144825577735901,
-0.00960769597440958,
0.08428152650594711,
0.002657628385350108,
-0.004502991680055857,
-0.044524095952510834,
0.060092028230428696,
-0.04211358726024628,
-0.04889362305402756,
0.07984418421983719,
0.021294519305229187,
-0.08888031542301178,
-0.04996927082538605,
-0.09011505544185638,
0.17323893308639526,
0.1292029470205307,
-0.12180405110120773,
-0.0646495670080185,
-0.005630622152239084,
-0.07315146178007126,
-0.032705649733543396,
-0.0447271466255188,
0.026152674108743668,
0.19703759253025055,
-0.009650024585425854,
0.1556764841079712,
-0.07343330979347229,
-0.047128260135650635,
0.025215646252036095,
-0.0313401073217392,
0.01606483943760395,
0.1185009628534317,
0.1145080178976059,
-0.08201359212398529,
0.15379412472248077,
0.14382275938987732,
-0.09326194226741791,
0.1439264416694641,
-0.04755913466215134,
-0.053455084562301636,
-0.012387750670313835,
-0.0256897434592247,
-0.019831068813800812,
0.10194675624370575,
-0.16055643558502197,
0.010317876935005188,
0.03571614995598793,
0.023414237424731255,
0.03399579972028732,
-0.2150789350271225,
-0.03453387692570686,
0.033867496997117996,
-0.050128690898418427,
-0.0031231536995619535,
-0.007135854568332434,
0.011046592146158218,
0.10478085279464722,
-0.0031779888086020947,
-0.10512698441743851,
0.052144378423690796,
-0.006299050059169531,
-0.08943869173526764,
0.21798454225063324,
-0.08710946142673492,
-0.18648970127105713,
-0.1189730167388916,
-0.06294762343168259,
-0.07567699253559113,
-0.004962088540196419,
0.05785910040140152,
-0.08131179213523865,
-0.03176778554916382,
-0.0806071013212204,
0.014716072008013725,
-0.0023878870997577906,
0.025631599128246307,
0.019848903641104698,
0.0016764418687671423,
0.08099467307329178,
-0.1167442798614502,
-0.016106992959976196,
-0.06341078877449036,
-0.04202750697731972,
0.04064585268497467,
0.041270118206739426,
0.0988377034664154,
0.15267616510391235,
-0.019211841747164726,
0.023092860355973244,
-0.021616701036691666,
0.24383974075317383,
-0.06418726593255997,
-0.019024580717086792,
0.15514269471168518,
-0.008728032931685448,
0.050296343863010406,
0.11488067358732224,
0.07195909321308136,
-0.07545755803585052,
0.005948644131422043,
0.03710569813847542,
-0.037614114582538605,
-0.222452774643898,
-0.06562527269124985,
-0.05583110451698303,
0.020605383440852165,
0.07728266716003418,
0.0375279039144516,
0.04231022298336029,
0.06005708873271942,
0.04346669837832451,
0.06828698515892029,
-0.028533868491649628,
0.05104093626141548,
0.15856194496154785,
0.03599826246500015,
0.12920033931732178,
-0.03730641305446625,
-0.05810714140534401,
0.044433075934648514,
-0.01565716788172722,
0.2037656158208847,
-0.010564416646957397,
0.10792942345142365,
0.05741063505411148,
0.17277804017066956,
-0.010603698901832104,
0.07187259942293167,
-0.0156282726675272,
-0.03243174031376839,
-0.015594683587551117,
-0.03912680223584175,
-0.03011733666062355,
0.018046243116259575,
-0.07835627347230911,
0.06085377186536789,
-0.11395221948623657,
0.011723238043487072,
0.059220366179943085,
0.23981203138828278,
0.031606465578079224,
-0.3190716505050659,
-0.09729049354791641,
-0.0049738045781850815,
-0.03747718408703804,
-0.021319236606359482,
0.03352221101522446,
0.09847861528396606,
-0.10294825583696365,
0.03968765586614609,
-0.07640799134969711,
0.09297142922878265,
-0.047262147068977356,
0.055051255971193314,
0.09929996728897095,
0.09606239944696426,
0.02009269781410694,
0.09647487848997116,
-0.2856273353099823,
0.2574538588523865,
-0.0015189285622909665,
0.05592987686395645,
-0.07764812558889389,
0.01195956114679575,
0.03816894814372063,
0.0747467428445816,
0.07086154818534851,
-0.010851860977709293,
-0.01917938143014908,
-0.17785249650478363,
-0.06546854972839355,
0.031339410692453384,
0.06611523777246475,
-0.026507152244448662,
0.07482331246137619,
-0.02870245650410652,
0.010354732163250446,
0.07601786404848099,
0.013834723271429539,
-0.05484699457883835,
-0.10062555968761444,
0.000409413012675941,
0.05253540724515915,
-0.06643876433372498,
-0.06450878828763962,
-0.11783771961927414,
-0.12188690900802612,
0.1601131409406662,
-0.019451530650258064,
-0.052096664905548096,
-0.10999548435211182,
0.08760962635278702,
0.05935923010110855,
-0.08340995758771896,
0.041804440319538116,
-0.0031501352787017822,
0.0827924981713295,
0.02227838709950447,
-0.083372101187706,
0.09709896892309189,
-0.08145904541015625,
-0.15635253489017487,
-0.056967683136463165,
0.10923594236373901,
0.031557079404592514,
0.058893609791994095,
-0.010793891735374928,
0.000662430131342262,
-0.043718304485082626,
-0.09768255800008774,
0.012185530737042427,
-0.010665759444236755,
0.07959064096212387,
0.024116169661283493,
-0.05656000226736069,
0.014147202484309673,
-0.061718329787254333,
-0.025965824723243713,
0.20178057253360748,
0.2106800675392151,
-0.1018415093421936,
0.01837395504117012,
0.028721075505018234,
-0.060019660741090775,
-0.19802434742450714,
0.02453368529677391,
0.06254914402961731,
-0.007151673547923565,
0.02866550162434578,
-0.17643575370311737,
0.14242802560329437,
0.09663483500480652,
-0.01912073977291584,
0.1173851490020752,
-0.3189247250556946,
-0.11955010890960693,
0.13650289177894592,
0.13054005801677704,
0.10922560840845108,
-0.12862245738506317,
-0.01757681742310524,
-0.016488945111632347,
-0.12940441071987152,
0.13275215029716492,
-0.11361365020275116,
0.11405333131551743,
-0.033281151205301285,
0.08113711327314377,
0.0011480002431198955,
-0.05719071999192238,
0.11574403196573257,
0.025619203224778175,
0.09182140231132507,
-0.06217515841126442,
-0.009550973773002625,
0.034517161548137665,
-0.03253617510199547,
0.045991234481334686,
-0.09658768028020859,
0.03295158967375755,
-0.10942183434963226,
-0.022280361503362656,
-0.07863087952136993,
0.050379473716020584,
-0.03720703721046448,
-0.07268305867910385,
-0.045583344995975494,
0.024610958993434906,
0.058386627584695816,
-0.007836049422621727,
0.1320382058620453,
0.03703298792243004,
0.1492828130722046,
0.14895401895046234,
0.0633077323436737,
-0.08276274800300598,
-0.09166482090950012,
-0.02376432716846466,
-0.008475062437355518,
0.05538862198591232,
-0.13468174636363983,
0.02516511268913746,
0.14910313487052917,
0.009283688850700855,
0.1434093564748764,
0.08868255466222763,
-0.024138309061527252,
-0.010903424583375454,
0.05391009524464607,
-0.17287656664848328,
-0.0912737175822258,
-0.022629106417298317,
-0.06910163909196854,
-0.11277623474597931,
0.0578177273273468,
0.10377898067235947,
-0.08135166764259338,
0.0027894554659724236,
-0.0035725918132811785,
0.013347047381103039,
-0.05685470253229141,
0.18538351356983185,
0.06988276541233063,
0.04949139058589935,
-0.10178396850824356,
0.08455973118543625,
0.04742245376110077,
-0.05938874930143356,
0.004325795918703079,
0.0802963599562645,
-0.09815271943807602,
-0.05330231040716171,
0.0575033500790596,
0.17558030784130096,
-0.05554148927330971,
-0.05840103700757027,
-0.14799155294895172,
-0.1337965577840805,
0.08001518994569778,
0.13224905729293823,
0.11734503507614136,
0.007621163967996836,
-0.0731721743941307,
0.0018782420083880424,
-0.11119019240140915,
0.10568799078464508,
0.04611429572105408,
0.054635100066661835,
-0.1548280417919159,
0.13684171438217163,
0.016183797270059586,
0.06361408531665802,
-0.02537795715034008,
0.022304683923721313,
-0.08460108935832977,
0.008917045779526234,
-0.10307580232620239,
-0.021674778312444687,
-0.016643773764371872,
0.009908930398523808,
-0.005108180455863476,
-0.048300500959157944,
-0.05661845579743385,
0.024906059727072716,
-0.10549948364496231,
-0.025202332064509392,
0.03343742713332176,
0.05218002200126648,
-0.11639901250600815,
-0.04493873566389084,
0.01591404154896736,
-0.06088894605636597,
0.08527985960245132,
0.053717825561761856,
0.011508917436003685,
0.05170179903507233,
-0.12246845662593842,
0.016817618161439896,
0.0738915354013443,
0.0340840183198452,
0.06104329973459244,
-0.09633228182792664,
-0.0042505268938839436,
0.000009833319381868932,
0.02962779812514782,
0.022913290187716484,
0.07663044333457947,
-0.1377095878124237,
-0.002321540145203471,
-0.02691950649023056,
-0.0646195262670517,
-0.06714501231908798,
0.020013272762298584,
0.08294350653886795,
0.03559565171599388,
0.2057630866765976,
-0.07553393393754959,
0.04691246524453163,
-0.2135545313358307,
0.007876020856201649,
-0.000216558575630188,
-0.11841729283332825,
-0.09525562822818756,
-0.05129927024245262,
0.054044146090745926,
-0.06362396478652954,
0.15451955795288086,
0.04878443479537964,
0.014883937314152718,
0.02846788428723812,
-0.017297670245170593,
0.02067759819328785,
0.007559371646493673,
0.18741635978221893,
0.03372693434357643,
-0.03029473125934601,
0.07180187851190567,
0.03866606578230858,
0.11274004727602005,
0.11382657289505005,
0.1963241696357727,
0.13170114159584045,
-0.0013890049885958433,
0.09478294849395752,
0.04671492055058479,
-0.060518838465213776,
-0.16289113461971283,
0.050670377910137177,
-0.03682221099734306,
0.11981121450662613,
-0.02724079228937626,
0.19231046736240387,
0.0680035874247551,
-0.16162194311618805,
0.04740280285477638,
-0.04798690602183342,
-0.08724562078714371,
-0.11724910885095596,
-0.06196066364645958,
-0.0703415796160698,
-0.14260202646255493,
-0.004314409103244543,
-0.11993777751922607,
-0.0011970066698268056,
0.11179531365633011,
0.011872668750584126,
-0.03381989151239395,
0.15067675709724426,
0.0049347300082445145,
0.01584346778690815,
0.06775166839361191,
0.006889836862683296,
-0.03921027481555939,
-0.10968586057424545,
-0.05533943325281143,
-0.022852586582303047,
-0.010463658720254898,
0.03234920650720596,
-0.05988675728440285,
-0.03473479300737381,
0.04022863507270813,
-0.022115882486104965,
-0.09211068600416183,
0.0050157164223492146,
0.008778516203165054,
0.05603949353098869,
0.058358147740364075,
0.006786048412322998,
0.01867271214723587,
-0.00904162973165512,
0.20554876327514648,
-0.07349023222923279,
-0.06961073726415634,
-0.0994342714548111,
0.23794832825660706,
0.03023211844265461,
-0.020472804084420204,
0.030288010835647583,
-0.06730993092060089,
-0.01924297772347927,
0.24743548035621643,
0.22831939160823822,
-0.08813437819480896,
-0.002375643001869321,
0.022723764181137085,
-0.012981930747628212,
-0.015660367906093597,
0.10397703945636749,
0.14611119031906128,
0.06380722671747208,
-0.08587156981229782,
-0.03012181632220745,
-0.061355117708444595,
-0.0057125394232571125,
-0.04494314268231392,
0.07262426614761353,
0.05102216824889183,
0.0090905437245965,
-0.03557809814810753,
0.0481061227619648,
-0.07406124472618103,
-0.09439066797494888,
0.03818196430802345,
-0.20141011476516724,
-0.15944211184978485,
-0.015332120470702648,
0.0906912311911583,
0.00081416848115623,
0.05876147747039795,
-0.029903501272201538,
0.004966472741216421,
0.0858391523361206,
-0.02451716549694538,
-0.09145668894052505,
-0.07586037367582321,
0.0924239307641983,
-0.11985260248184204,
0.2271886169910431,
-0.04932300001382828,
0.07011982798576355,
0.12035950273275375,
0.05313970148563385,
-0.06426870077848434,
0.06659746915102005,
0.04335322976112366,
-0.03714822977781296,
0.012033305130898952,
0.07828888297080994,
-0.027794616296887398,
0.06781288981437683,
0.04152531176805496,
-0.13622832298278809,
0.006067936774343252,
-0.04256989061832428,
-0.06817709654569626,
-0.039848849177360535,
-0.03317209705710411,
-0.06347433477640152,
0.1263163983821869,
0.21544258296489716,
-0.029096631333231926,
-0.012426361441612244,
-0.0822412520647049,
0.0072579337283968925,
0.04832668974995613,
0.01631281152367592,
-0.04694430157542229,
-0.21067176759243011,
0.017074674367904663,
0.04200243577361107,
-0.022419050335884094,
-0.22656740248203278,
-0.09428352117538452,
0.007315013557672501,
-0.07921107113361359,
-0.08809321373701096,
0.06588101387023926,
0.08122624456882477,
0.044515326619148254,
-0.05935506150126457,
-0.02902861498296261,
-0.08415134996175766,
0.13841569423675537,
-0.14506271481513977,
-0.08880157023668289
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-mrpc
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0872
- Accuracy: 0.8260
- F1: 0.8799
- Combined Score: 0.8529
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.5656 | 1.0 | 917 | 0.6999 | 0.7843 | 0.8581 | 0.8212 |
| 0.3874 | 2.0 | 1834 | 0.7280 | 0.8088 | 0.8691 | 0.8390 |
| 0.1627 | 3.0 | 2751 | 1.1274 | 0.8162 | 0.8780 | 0.8471 |
| 0.0751 | 4.0 | 3668 | 1.0289 | 0.8333 | 0.8870 | 0.8602 |
| 0.0339 | 5.0 | 4585 | 1.0872 | 0.8260 | 0.8799 | 0.8529 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
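
The card does not show an inference example. As a hedged sketch (not part of the original card), a sentence pair can presumably be scored for paraphrase equivalence with the generic `transformers` Auto classes; the two example sentences and the label comment are illustrative assumptions.

```python
# Rough usage sketch (assumption, not from the original card): score a sentence pair
# for paraphrase equivalence with the generic Auto classes.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gchhablani/fnet-large-finetuned-mrpc"  # repo id from this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Example sentence pair is an illustrative assumption.
inputs = tokenizer(
    "The company halted production last week.",
    "Production was stopped by the company a week ago.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

print(int(logits.argmax(dim=-1)))  # index into the model's label set (MRPC: not_equivalent vs. equivalent)
```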
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy", "f1"], "model-index": [{"name": "fnet-large-finetuned-mrpc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE MRPC", "type": "glue", "args": "mrpc"}, "metrics": [{"type": "accuracy", "value": 0.8259803921568627, "name": "Accuracy"}, {"type": "f1", "value": 0.8798646362098139, "name": "F1"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-mrpc
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-mrpc
=========================
This model is a fine-tuned version of google/fnet-large on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
* Loss: 1.0872
* Accuracy: 0.8260
* F1: 0.8799
* Combined Score: 0.8529
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09546191245317459,
0.08599092066287994,
-0.002819906221702695,
0.12520475685596466,
0.15879055857658386,
0.026399699971079826,
0.11949319392442703,
0.12398721277713776,
-0.09084320068359375,
0.011874276213347912,
0.11916262656450272,
0.15975700318813324,
0.017096880823373795,
0.1293363869190216,
-0.05514799430966377,
-0.2581601142883301,
-0.009066539816558361,
0.05219760909676552,
-0.0730782225728035,
0.13537991046905518,
0.10601045191287994,
-0.11119034886360168,
0.0858348160982132,
0.012250698171555996,
-0.19032633304595947,
0.005159530322998762,
0.007800718303769827,
-0.06162724271416664,
0.14478495717048645,
0.026664696633815765,
0.11614209413528442,
-0.005005354061722755,
0.07336033135652542,
-0.19494523108005524,
0.011148547753691673,
0.062217213213443756,
-0.0008450024761259556,
0.09304920583963394,
0.04146657511591911,
0.01810573972761631,
0.13009214401245117,
-0.07812145352363586,
0.04607972502708435,
0.024277985095977783,
-0.10066843777894974,
-0.25224390625953674,
-0.06780645996332169,
0.04294602945446968,
0.06958787888288498,
0.1144355908036232,
-0.003222366329282522,
0.12924230098724365,
-0.07112066447734833,
0.09992790967226028,
0.20228120684623718,
-0.2846123278141022,
-0.06414385139942169,
0.05315433070063591,
-0.0006625417736358941,
0.05653926730155945,
-0.09665023535490036,
-0.03479963168501854,
0.047205254435539246,
0.056852784007787704,
0.13653863966464996,
-0.02365931309759617,
-0.11345908790826797,
-0.00036166355130262673,
-0.14659838378429413,
-0.03392684459686279,
0.18420065939426422,
0.04295152425765991,
-0.029913606122136116,
-0.05532674118876457,
-0.07111954689025879,
-0.1298159509897232,
-0.033634983003139496,
-0.01663203164935112,
0.042924873530864716,
-0.017111431807279587,
-0.027516013011336327,
-0.018441833555698395,
-0.10968123376369476,
-0.061496250331401825,
-0.07938845455646515,
0.12648801505565643,
0.0373440757393837,
0.026029396802186966,
-0.03856557980179787,
0.10718294978141785,
-0.01039015781134367,
-0.12635262310504913,
0.02376563660800457,
0.014790150336921215,
0.023429078981280327,
-0.028944209218025208,
-0.05729011818766594,
-0.07710803300142288,
0.013970785774290562,
0.11589844524860382,
-0.06570203602313995,
0.04521610215306282,
0.03654171898961067,
0.05179869011044502,
-0.08960038423538208,
0.17069628834724426,
-0.033626291900873184,
-0.022550251334905624,
0.017764555290341377,
0.04625367373228073,
0.02610119618475437,
-0.018981298431754112,
-0.12368905544281006,
0.01198206003755331,
0.09392620623111725,
0.0013035978190600872,
-0.05933376029133797,
0.08104883879423141,
-0.038034044206142426,
-0.033083993941545486,
-0.005496066529303789,
-0.09730266779661179,
0.019998257979750633,
0.006821903865784407,
-0.08317822217941284,
-0.020987385883927345,
0.037898458540439606,
0.013462735339999199,
-0.027173368260264397,
0.10108006745576859,
-0.08223720639944077,
0.021280832588672638,
-0.0873541384935379,
-0.1072964146733284,
0.01761503331363201,
-0.08617527782917023,
0.019810689613223076,
-0.10400725156068802,
-0.18891578912734985,
-0.0077062081545591354,
0.06357033550739288,
-0.01852494478225708,
-0.06713766604661942,
-0.04489322751760483,
-0.08013653755187988,
0.009742194786667824,
-0.014273520559072495,
0.12686266005039215,
-0.06624139845371246,
0.09025576710700989,
0.02989248000085354,
0.05423325300216675,
-0.055293235927820206,
0.05391918867826462,
-0.09691685438156128,
0.008810688741505146,
-0.16241350769996643,
0.041948363184928894,
-0.05697361379861832,
0.05600197985768318,
-0.07802683860063553,
-0.10861963778734207,
0.014010758139193058,
-0.009118323214352131,
0.051885224878787994,
0.0832173079252243,
-0.1714225709438324,
-0.07527991384267807,
0.1451883614063263,
-0.08433152735233307,
-0.117959164083004,
0.12364167720079422,
-0.05751069262623787,
0.03996153548359871,
0.055080514401197433,
0.17722031474113464,
0.09120480716228485,
-0.06544680148363113,
0.005950626451522112,
0.03313767910003662,
0.04212610796093941,
-0.06831853836774826,
0.06972771137952805,
0.019984891638159752,
0.016782991588115692,
0.03638952597975731,
-0.03108343482017517,
0.06633422523736954,
-0.07898469269275665,
-0.09579379111528397,
-0.046762142330408096,
-0.08413276076316833,
0.03685159981250763,
0.07383585721254349,
0.07521867007017136,
-0.08728065341711044,
-0.07636827975511551,
0.03872332349419594,
0.08539536595344543,
-0.05908842012286186,
0.026852475479245186,
-0.053241580724716187,
0.07960749417543411,
-0.022701889276504517,
-0.010898844338953495,
-0.18033812940120697,
-0.0489765927195549,
0.013111790642142296,
-0.019065599888563156,
0.011413183994591236,
0.034898094832897186,
0.05896642804145813,
0.06169478967785835,
-0.05224358290433884,
-0.006700506899505854,
-0.03707636147737503,
-0.002736605005338788,
-0.11889437586069107,
-0.19659262895584106,
-0.03258073702454567,
-0.0219099298119545,
0.1598203033208847,
-0.20940329134464264,
0.051057182252407074,
-0.007849784567952156,
0.08635152131319046,
0.004428023472428322,
-0.004245837219059467,
-0.04512714967131615,
0.05986940115690231,
-0.04119914397597313,
-0.04754257947206497,
0.0797266736626625,
0.020951641723513603,
-0.08738018572330475,
-0.05052084103226662,
-0.08946986496448517,
0.17589566111564636,
0.13004666566848755,
-0.1203686073422432,
-0.06231275945901871,
-0.005228476598858833,
-0.07227027416229248,
-0.03346370905637741,
-0.04478921368718147,
0.027298297733068466,
0.19398172199726105,
-0.010374259203672409,
0.15592092275619507,
-0.07345370203256607,
-0.04698825627565384,
0.025998717173933983,
-0.029928049072623253,
0.01762254908680916,
0.11882616579532623,
0.11529583483934402,
-0.08247628062963486,
0.15382464230060577,
0.14487947523593903,
-0.09263201802968979,
0.14530399441719055,
-0.04800577461719513,
-0.05440796539187431,
-0.012026961892843246,
-0.02653445303440094,
-0.02075771987438202,
0.10357513278722763,
-0.1614956557750702,
0.010308556258678436,
0.03666393458843231,
0.022368796169757843,
0.033521343022584915,
-0.21500913798809052,
-0.033284418284893036,
0.03360406681895256,
-0.04915229231119156,
-0.005554342642426491,
-0.00600228039547801,
0.011097459122538567,
0.10555184632539749,
-0.0037921082694083452,
-0.1044158786535263,
0.05074157938361168,
-0.006137927062809467,
-0.08894295990467072,
0.21745462715625763,
-0.08632136881351471,
-0.18696673214435577,
-0.11720063537359238,
-0.05977078899741173,
-0.07440357655286789,
-0.005697417538613081,
0.05866274610161781,
-0.08173168450593948,
-0.03133115917444229,
-0.07904272526502609,
0.014714551158249378,
-0.0008177160052582622,
0.02673378773033619,
0.017737895250320435,
0.003071904182434082,
0.08112871646881104,
-0.11631204187870026,
-0.014812331646680832,
-0.06412842124700546,
-0.04106014594435692,
0.039631348103284836,
0.04294215887784958,
0.09911496192216873,
0.15236623585224152,
-0.01932796649634838,
0.022595083341002464,
-0.021412083879113197,
0.24335768818855286,
-0.06545376032590866,
-0.018363337963819504,
0.15558351576328278,
-0.00867502298206091,
0.04948771744966507,
0.11573895066976547,
0.0715351402759552,
-0.07415314763784409,
0.0050702085718512535,
0.03689371049404144,
-0.036363378167152405,
-0.22274886071681976,
-0.06535296142101288,
-0.05658477544784546,
0.02026711404323578,
0.07761484384536743,
0.03714492544531822,
0.04264776408672333,
0.05929701030254364,
0.04340836778283119,
0.0686696395277977,
-0.02882838249206543,
0.05045050010085106,
0.15857917070388794,
0.03623516485095024,
0.12905411422252655,
-0.0383649580180645,
-0.05830521136522293,
0.04401826858520508,
-0.01650388538837433,
0.2045838087797165,
-0.010560386814177036,
0.10752256959676743,
0.05580081418156624,
0.17469125986099243,
-0.009762926027178764,
0.07077998667955399,
-0.015204674564301968,
-0.032061636447906494,
-0.015053500421345234,
-0.038609445095062256,
-0.029359186068177223,
0.01937456801533699,
-0.07770969718694687,
0.060527801513671875,
-0.11411376297473907,
0.010677093639969826,
0.059778086841106415,
0.23909689486026764,
0.03223899379372597,
-0.3180825710296631,
-0.09792063385248184,
-0.005235941149294376,
-0.03779191896319389,
-0.022020915523171425,
0.032515332102775574,
0.10104595869779587,
-0.10247465968132019,
0.039183102548122406,
-0.0768279880285263,
0.09344609081745148,
-0.04620886594057083,
0.05521034076809883,
0.09871474653482437,
0.09597338736057281,
0.02048015221953392,
0.096494659781456,
-0.2858371138572693,
0.25810134410858154,
-0.0008917004452086985,
0.055689625442028046,
-0.07825443148612976,
0.011470974422991276,
0.03769155591726303,
0.07573051750659943,
0.07050973176956177,
-0.010998306795954704,
-0.02132379449903965,
-0.17557914555072784,
-0.06484248489141464,
0.03208262845873833,
0.06640531122684479,
-0.02824542112648487,
0.07367397099733353,
-0.02855999954044819,
0.009517839178442955,
0.07583609968423843,
0.010695524513721466,
-0.05560624599456787,
-0.09996896237134933,
0.0005532020586542785,
0.053071361035108566,
-0.06607106328010559,
-0.06424704939126968,
-0.1176302582025528,
-0.11913973838090897,
0.15881624817848206,
-0.024910470470786095,
-0.05115595459938049,
-0.10973544418811798,
0.0894245058298111,
0.05847169831395149,
-0.08369704335927963,
0.042348362505435944,
-0.003034900175407529,
0.08414874970912933,
0.022848933935165405,
-0.08384497463703156,
0.09918875247240067,
-0.08087218552827835,
-0.15695929527282715,
-0.056926511228084564,
0.11002043634653091,
0.03164108842611313,
0.058370400220155716,
-0.01081100944429636,
0.0006559081957675517,
-0.04384872689843178,
-0.09692469984292984,
0.013017899356782436,
-0.009734023362398148,
0.07791733741760254,
0.02332180365920067,
-0.055519409477710724,
0.010724114254117012,
-0.06127122417092323,
-0.025878654792904854,
0.1999949812889099,
0.20991948246955872,
-0.10273119807243347,
0.01914804056286812,
0.02892344444990158,
-0.06101452186703682,
-0.19818688929080963,
0.026290491223335266,
0.06219090148806572,
-0.006466645747423172,
0.028350498527288437,
-0.1772705316543579,
0.1424008160829544,
0.09497801214456558,
-0.019320080056786537,
0.11905593425035477,
-0.3191659450531006,
-0.11980247497558594,
0.13569417595863342,
0.12890441715717316,
0.11123138666152954,
-0.12752926349639893,
-0.018578607589006424,
-0.01675238274037838,
-0.12915006279945374,
0.1329793781042099,
-0.1172524243593216,
0.11238279938697815,
-0.03414227068424225,
0.08010146021842957,
0.001642079558223486,
-0.0570412315428257,
0.11512328684329987,
0.025005081668496132,
0.09225330501794815,
-0.06164787337183952,
-0.008495654910802841,
0.03452535346150398,
-0.03190139681100845,
0.04483703896403313,
-0.09700168669223785,
0.033334556967020035,
-0.10865794122219086,
-0.0214571263641119,
-0.07938719540834427,
0.05095565319061279,
-0.03664569556713104,
-0.07135503739118576,
-0.045485079288482666,
0.02403452806174755,
0.057519298046827316,
-0.00724073639139533,
0.13187195360660553,
0.037778809666633606,
0.14953835308551788,
0.1497054547071457,
0.06296437233686447,
-0.0816614180803299,
-0.09189815074205399,
-0.02339370734989643,
-0.007961608469486237,
0.05737365782260895,
-0.13682818412780762,
0.025487486273050308,
0.14920346438884735,
0.01067504845559597,
0.1419643610715866,
0.08810736984014511,
-0.024375448003411293,
-0.011216367594897747,
0.05464911088347435,
-0.17467109858989716,
-0.09439196437597275,
-0.02270573377609253,
-0.06931719183921814,
-0.11218071728944778,
0.05837605893611908,
0.10388980060815811,
-0.08238784968852997,
0.0025478980969637632,
-0.0036189118400216103,
0.01330843847244978,
-0.05678478628396988,
0.18629495799541473,
0.06895172595977783,
0.04923379421234131,
-0.10211517661809921,
0.0844893679022789,
0.04786021634936333,
-0.061863500624895096,
0.003803879488259554,
0.07889994233846664,
-0.09867862612009048,
-0.053205981850624084,
0.05900222063064575,
0.17500045895576477,
-0.05377772077918053,
-0.0596037320792675,
-0.14735089242458344,
-0.13265158236026764,
0.07988547533750534,
0.13361380994319916,
0.11714371293783188,
0.0073603312484920025,
-0.07273177057504654,
0.002185244346037507,
-0.11119045317173004,
0.10686644166707993,
0.0465870127081871,
0.055424951016902924,
-0.1553489863872528,
0.13742566108703613,
0.016396204009652138,
0.06254139542579651,
-0.02566162869334221,
0.02288627438247204,
-0.08485285937786102,
0.008428186178207397,
-0.10126855224370956,
-0.02166859805583954,
-0.015184788964688778,
0.009756003506481647,
-0.005930773913860321,
-0.04936628416180611,
-0.05678648129105568,
0.02513021230697632,
-0.10625382512807846,
-0.02489090897142887,
0.033812735229730606,
0.05252465978264809,
-0.1173783391714096,
-0.04442751407623291,
0.016967257484793663,
-0.06054551899433136,
0.08527270704507828,
0.053835052996873856,
0.012136760167777538,
0.05291440710425377,
-0.12204016745090485,
0.016342202201485634,
0.07364647090435028,
0.0349479541182518,
0.06090749427676201,
-0.09562382102012634,
-0.003282394725829363,
-0.00001865075682871975,
0.030004533007740974,
0.022373711690306664,
0.07710137963294983,
-0.1379968672990799,
-0.0034996774047613144,
-0.027388708665966988,
-0.06447653472423553,
-0.06685411930084229,
0.019427789375185966,
0.08218405395746231,
0.03522159904241562,
0.20542800426483154,
-0.07487155497074127,
0.045620422810316086,
-0.21430227160453796,
0.00823394488543272,
-0.0004625043657142669,
-0.11957801133394241,
-0.09532622992992401,
-0.051133062690496445,
0.05481477826833725,
-0.06431839615106583,
0.1534840315580368,
0.04941427335143089,
0.01592053472995758,
0.028808899223804474,
-0.01811968721449375,
0.018443580716848373,
0.0080113485455513,
0.18708765506744385,
0.03429579362273216,
-0.02985781617462635,
0.07225026190280914,
0.039080578833818436,
0.11195438355207443,
0.11603232473134995,
0.19704307615756989,
0.13246352970600128,
-0.0008535053348168731,
0.0951245054602623,
0.04693981632590294,
-0.06035655736923218,
-0.16422057151794434,
0.05283075198531151,
-0.03835199028253555,
0.12152128666639328,
-0.028580691665410995,
0.19253279268741608,
0.06901441514492035,
-0.16116325557231903,
0.04764442518353462,
-0.04824594035744667,
-0.08681273460388184,
-0.11683184653520584,
-0.06121692433953285,
-0.07114308327436447,
-0.1440211683511734,
-0.004636962898075581,
-0.11918430775403976,
-0.0008277559536509216,
0.1097910925745964,
0.010498684830963612,
-0.03498762845993042,
0.15053655207157135,
0.006402645725756884,
0.015238703228533268,
0.0695597305893898,
0.006615471560508013,
-0.04100614786148071,
-0.10920800268650055,
-0.05520232766866684,
-0.022179335355758667,
-0.010697388090193272,
0.03149294853210449,
-0.06045083701610565,
-0.03633178025484085,
0.04040398821234703,
-0.02195586822926998,
-0.09119494259357452,
0.005079627502709627,
0.00920786615461111,
0.05570284649729729,
0.05661286786198616,
0.007228526286780834,
0.018431752920150757,
-0.009099789895117283,
0.20788635313510895,
-0.0740276426076889,
-0.06805037707090378,
-0.09928686916828156,
0.23586858808994293,
0.031392164528369904,
-0.020330391824245453,
0.03078066185116768,
-0.06767775863409042,
-0.018910378217697144,
0.2457672357559204,
0.22598128020763397,
-0.08714233338832855,
-0.002273096702992916,
0.022417528554797173,
-0.01315001118928194,
-0.016113251447677612,
0.10431814938783646,
0.14606770873069763,
0.06461671739816666,
-0.08543983101844788,
-0.029877889901399612,
-0.06068196892738342,
-0.006037788465619087,
-0.04389375448226929,
0.07362931221723557,
0.049937255680561066,
0.007489452138543129,
-0.03518691286444664,
0.048443753272295,
-0.07235494256019592,
-0.09499064832925797,
0.038542162626981735,
-0.20198656618595123,
-0.16029119491577148,
-0.015390058979392052,
0.09282878786325455,
0.0010308976052328944,
0.057557959109544754,
-0.030114294961094856,
0.0048396289348602295,
0.08691193163394928,
-0.024416247382760048,
-0.09137684851884842,
-0.07855791598558426,
0.09151489287614822,
-0.12239585816860199,
0.22643736004829407,
-0.04916463419795036,
0.07056054472923279,
0.11964943259954453,
0.05358283966779709,
-0.06439071148633957,
0.06493911147117615,
0.04323429614305496,
-0.03877938538789749,
0.012006440199911594,
0.08003684133291245,
-0.027095813304185867,
0.06842819601297379,
0.041951268911361694,
-0.1349412053823471,
0.005635885521769524,
-0.04544267803430557,
-0.06833399087190628,
-0.040099576115608215,
-0.03195123374462128,
-0.0631319135427475,
0.12548556923866272,
0.2150724083185196,
-0.029118625447154045,
-0.012089218944311142,
-0.08287710696458817,
0.007270357105880976,
0.048678286373615265,
0.018202396109700203,
-0.046538036316633224,
-0.2092742621898651,
0.017721407115459442,
0.0428425669670105,
-0.02169046550989151,
-0.224505215883255,
-0.09362480044364929,
0.00570263434201479,
-0.07918917387723923,
-0.08765849471092224,
0.06609953939914703,
0.08304844796657562,
0.04343817010521889,
-0.059563811868429184,
-0.03427022323012352,
-0.08322528004646301,
0.1385328769683838,
-0.145302414894104,
-0.08898579329252243
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-qqp
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE QQP dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5515
- Accuracy: 0.8943
- F1: 0.8557
- Combined Score: 0.8750
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:------:|:--------------:|
| 0.4574 | 1.0 | 90962 | 0.4946 | 0.8694 | 0.8297 | 0.8496 |
| 0.3387 | 2.0 | 181924 | 0.4745 | 0.8874 | 0.8437 | 0.8655 |
| 0.2029 | 3.0 | 272886 | 0.5515 | 0.8943 | 0.8557 | 0.8750 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
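
The card does not show an inference example. As a hedged sketch (not part of the original card), two questions can presumably be checked for duplication with the generic `transformers` Auto classes; the example questions and the label comment are illustrative assumptions.

```python
# Rough usage sketch (assumption, not from the original card): check whether two
# questions are duplicates using the generic Auto classes.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gchhablani/fnet-large-finetuned-qqp"  # repo id from this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Example question pair is an illustrative assumption.
inputs = tokenizer(
    "How do I learn Python quickly?",
    "What is the fastest way to learn Python?",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

print(int(logits.argmax(dim=-1)))  # index into the model's label set (QQP: not_duplicate vs. duplicate)
```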
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy", "f1"], "model-index": [{"name": "fnet-large-finetuned-qqp", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE QQP", "type": "glue", "args": "qqp"}, "metrics": [{"type": "accuracy", "value": 0.8943111550828593, "name": "Accuracy"}, {"type": "f1", "value": 0.8556565212985171, "name": "F1"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-qqp
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-qqp
========================
This model is a fine-tuned version of google/fnet-large on the GLUE QQP dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5515
* Accuracy: 0.8943
* F1: 0.8557
* Combined Score: 0.8750
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09392090886831284,
0.086028553545475,
-0.002810414880514145,
0.12563712894916534,
0.15835294127464294,
0.02616388164460659,
0.12096844613552094,
0.12440948188304901,
-0.09253979474306107,
0.01095916610211134,
0.11961711943149567,
0.16139951348304749,
0.017056036740541458,
0.12809212505817413,
-0.055080391466617584,
-0.25876080989837646,
-0.008076829835772514,
0.051302410662174225,
-0.07248687744140625,
0.13545523583889008,
0.10696391016244888,
-0.10981715470552444,
0.08594673126935959,
0.012483679689466953,
-0.188598170876503,
0.0053088548593223095,
0.006561026908457279,
-0.06197373941540718,
0.14454741775989532,
0.02627757005393505,
0.11637239903211594,
-0.004603952635079622,
0.07419975847005844,
-0.1947721242904663,
0.010915560647845268,
0.06143670529127121,
-0.0008093014475889504,
0.0922541469335556,
0.04200132563710213,
0.016881262883543968,
0.12951737642288208,
-0.07753071933984756,
0.047175586223602295,
0.024195468053221703,
-0.10057501494884491,
-0.2488769292831421,
-0.06924816966056824,
0.04293599724769592,
0.0703204795718193,
0.114735446870327,
-0.0025232243351638317,
0.12956556677818298,
-0.07075148820877075,
0.09971623867750168,
0.2038886547088623,
-0.28335192799568176,
-0.06467258930206299,
0.053865209221839905,
0.00007339009607676417,
0.05563786253333092,
-0.09489043802022934,
-0.03489343822002411,
0.0464896559715271,
0.05855744704604149,
0.13576450943946838,
-0.024170638993382454,
-0.11294703930616379,
-0.00035783895873464644,
-0.14739225804805756,
-0.03345825895667076,
0.18343965709209442,
0.04216032102704048,
-0.031146908178925514,
-0.0553949810564518,
-0.07025839388370514,
-0.13019998371601105,
-0.03334832936525345,
-0.016549253836274147,
0.042229484766721725,
-0.017025606706738472,
-0.02913014031946659,
-0.02009456790983677,
-0.10942849516868591,
-0.06127852201461792,
-0.07925374060869217,
0.12757611274719238,
0.03740452602505684,
0.026067685335874557,
-0.036684900522232056,
0.1074313297867775,
-0.008039964362978935,
-0.12542767822742462,
0.02348942495882511,
0.014696755446493626,
0.02391907386481762,
-0.029253896325826645,
-0.05762016028165817,
-0.07737185806035995,
0.013022209517657757,
0.11649444699287415,
-0.06627753376960754,
0.045372430235147476,
0.03719029203057289,
0.052638933062553406,
-0.08948004990816116,
0.17119163274765015,
-0.03414352610707283,
-0.020914733409881592,
0.017712917178869247,
0.04670121893286705,
0.026985853910446167,
-0.018883170560002327,
-0.12442363798618317,
0.01235498022288084,
0.09396601468324661,
0.002359611913561821,
-0.05978254973888397,
0.08112926036119461,
-0.03828165680170059,
-0.03376328945159912,
-0.004834178369492292,
-0.09749481827020645,
0.019099729135632515,
0.006322276312857866,
-0.08288158476352692,
-0.019768934696912766,
0.03858035430312157,
0.014716193079948425,
-0.02650129422545433,
0.09978698194026947,
-0.08106034249067307,
0.02185039594769478,
-0.08661425113677979,
-0.10680398344993591,
0.017903147265315056,
-0.0847211480140686,
0.01903758943080902,
-0.10459426045417786,
-0.19062873721122742,
-0.0070509458892047405,
0.06475710868835449,
-0.01869357004761696,
-0.06680820137262344,
-0.0446598045527935,
-0.07923983782529831,
0.009182565845549107,
-0.014331415295600891,
0.1251339614391327,
-0.06613124161958694,
0.08929501473903656,
0.029013488441705704,
0.053821150213479996,
-0.056043438613414764,
0.053580932319164276,
-0.09583668410778046,
0.009967178106307983,
-0.16321894526481628,
0.04253964126110077,
-0.05672885477542877,
0.05521031841635704,
-0.07788251340389252,
-0.10970792174339294,
0.015046020038425922,
-0.008545617572963238,
0.05209499970078468,
0.0823349803686142,
-0.17201119661331177,
-0.07378672808408737,
0.1421811878681183,
-0.08395789563655853,
-0.11804282665252686,
0.12395995855331421,
-0.05760178342461586,
0.040096431970596313,
0.05442522466182709,
0.1759394258260727,
0.09057454019784927,
-0.06562479585409164,
0.006790442857891321,
0.033411525189876556,
0.04206949099898338,
-0.06806401163339615,
0.07068393379449844,
0.0191428791731596,
0.017440875992178917,
0.03589886054396629,
-0.031538426876068115,
0.06567616760730743,
-0.07883037626743317,
-0.09648585319519043,
-0.04541643336415291,
-0.0845101922750473,
0.03634423762559891,
0.07191202789545059,
0.07480383664369583,
-0.08777596801519394,
-0.07665324211120605,
0.039373867213726044,
0.08542460203170776,
-0.06039068102836609,
0.028123341500759125,
-0.05340079590678215,
0.07857871055603027,
-0.022135380655527115,
-0.012152368202805519,
-0.17895615100860596,
-0.049802783876657486,
0.01269685197621584,
-0.017068060114979744,
0.011587386019527912,
0.03449726849794388,
0.05857099965214729,
0.062182098627090454,
-0.052427344024181366,
-0.006858838256448507,
-0.036305077373981476,
-0.002348072361201048,
-0.11779595166444778,
-0.194713294506073,
-0.033400148153305054,
-0.021897103637456894,
0.16001500189304352,
-0.20776410400867462,
0.05139097571372986,
-0.009029606357216835,
0.08531299233436584,
0.003109408775344491,
-0.004246857482939959,
-0.0448564738035202,
0.05959697812795639,
-0.04208480939269066,
-0.048716846853494644,
0.08012443780899048,
0.02152360789477825,
-0.08858314156532288,
-0.049554068595170975,
-0.09015779942274094,
0.17343412339687347,
0.1291024088859558,
-0.1208619773387909,
-0.06354694813489914,
-0.005686518736183643,
-0.07318001985549927,
-0.032940808683633804,
-0.04457523673772812,
0.025874445214867592,
0.19627976417541504,
-0.00956712942570448,
0.1556379199028015,
-0.073637954890728,
-0.04745488986372948,
0.024473601952195168,
-0.031531598418951035,
0.015794798731803894,
0.1174120306968689,
0.1146426871418953,
-0.0827367752790451,
0.15341469645500183,
0.14470958709716797,
-0.09353531897068024,
0.14427484571933746,
-0.046857599169015884,
-0.05337967351078987,
-0.01239851489663124,
-0.025734765455126762,
-0.01986040733754635,
0.10185948759317398,
-0.1591099053621292,
0.010827534832060337,
0.03590237349271774,
0.02313799038529396,
0.03406902775168419,
-0.21511520445346832,
-0.034339725971221924,
0.03387860581278801,
-0.0499500147998333,
-0.002659587422385812,
-0.00680513633415103,
0.01076573971658945,
0.10498564690351486,
-0.0029810734558850527,
-0.10533218830823898,
0.05257336422801018,
-0.0062987240962684155,
-0.0899871438741684,
0.21725404262542725,
-0.08747546374797821,
-0.18725232779979706,
-0.11880498379468918,
-0.06220771372318268,
-0.07559692859649658,
-0.004758915398269892,
0.05759590119123459,
-0.08066269010305405,
-0.03158093988895416,
-0.0806441456079483,
0.01446909736841917,
-0.002284287940710783,
0.026104530319571495,
0.01951073668897152,
0.0022082950454205275,
0.08154694736003876,
-0.11645926535129547,
-0.015786167234182358,
-0.06321795284748077,
-0.04211984947323799,
0.04032707214355469,
0.04146743565797806,
0.0988055020570755,
0.15235096216201782,
-0.019287779927253723,
0.02282365784049034,
-0.02135869860649109,
0.24310164153575897,
-0.06365299969911575,
-0.018696533516049385,
0.1559664011001587,
-0.009423095732927322,
0.0503767766058445,
0.11493933945894241,
0.07142814248800278,
-0.07500948756933212,
0.006025242153555155,
0.03687376528978348,
-0.037410885095596313,
-0.2224632054567337,
-0.06534454971551895,
-0.05588307976722717,
0.020130153745412827,
0.07723241299390793,
0.03739004582166672,
0.04191461205482483,
0.059889692813158035,
0.04328415170311928,
0.06834448873996735,
-0.02811642736196518,
0.0510883592069149,
0.15787187218666077,
0.03660351783037186,
0.12926755845546722,
-0.03699519485235214,
-0.0581619068980217,
0.044791772961616516,
-0.015560021623969078,
0.2022491693496704,
-0.011031819507479668,
0.10647973418235779,
0.05714922025799751,
0.1732620894908905,
-0.01044081524014473,
0.07149824500083923,
-0.01537350844591856,
-0.03176766633987427,
-0.01585426926612854,
-0.03921106085181236,
-0.030264299362897873,
0.018982039764523506,
-0.0774182453751564,
0.060774918645620346,
-0.1135101318359375,
0.01231792476028204,
0.05960399657487869,
0.24015723168849945,
0.03204027935862541,
-0.3194168210029602,
-0.09705868363380432,
-0.004413720685988665,
-0.03755950182676315,
-0.021074343472719193,
0.033035874366760254,
0.0998033806681633,
-0.10339818149805069,
0.04004545137286186,
-0.07676628977060318,
0.09240566939115524,
-0.04698125272989273,
0.054886672645807266,
0.09951405227184296,
0.09603190422058105,
0.020772648975253105,
0.09640488028526306,
-0.2850910425186157,
0.25600531697273254,
-0.0014360295608639717,
0.05562611296772957,
-0.07749780267477036,
0.012023663148283958,
0.038247883319854736,
0.07425936311483383,
0.07044059038162231,
-0.010851399041712284,
-0.02008097432553768,
-0.1768847107887268,
-0.06550540030002594,
0.03150675445795059,
0.06631500273942947,
-0.02683223783969879,
0.07452084869146347,
-0.02882814034819603,
0.01022765226662159,
0.07610820233821869,
0.013070657849311829,
-0.05542498826980591,
-0.10068942606449127,
0.0006295169587247074,
0.052223872393369675,
-0.06668298691511154,
-0.06467423588037491,
-0.11798346787691116,
-0.1204783171415329,
0.1616269052028656,
-0.020681016147136688,
-0.052140481770038605,
-0.11009234935045242,
0.08729247003793716,
0.058916110545396805,
-0.08323338627815247,
0.04165558144450188,
-0.002712355460971594,
0.0844990462064743,
0.02244596555829048,
-0.08383654803037643,
0.09725132584571838,
-0.0815984308719635,
-0.15663672983646393,
-0.05686290189623833,
0.1098857969045639,
0.03203533589839935,
0.0587601438164711,
-0.010134214535355568,
0.0003890396619681269,
-0.04276092350482941,
-0.0972023457288742,
0.012183141894638538,
-0.009019515477120876,
0.07961194217205048,
0.023807894438505173,
-0.05628948658704758,
0.01429247297346592,
-0.06137530878186226,
-0.025577371940016747,
0.2017984837293625,
0.2101432830095291,
-0.10181240737438202,
0.019432980567216873,
0.02815667726099491,
-0.06017622724175453,
-0.19818763434886932,
0.025082634761929512,
0.06268273293972015,
-0.007134377025067806,
0.029622653499245644,
-0.17602014541625977,
0.1417960673570633,
0.09607643634080887,
-0.019160397350788116,
0.11625179648399353,
-0.3190588355064392,
-0.11957462131977081,
0.13578015565872192,
0.13038641214370728,
0.1082649901509285,
-0.1275387704372406,
-0.017696116119623184,
-0.017272336408495903,
-0.12869010865688324,
0.13378626108169556,
-0.11559057980775833,
0.11307959258556366,
-0.033666882663965225,
0.08151817321777344,
0.0013376958668231964,
-0.05681246146559715,
0.11611095815896988,
0.02491697669029236,
0.09116630256175995,
-0.062243860214948654,
-0.00852359551936388,
0.034516509622335434,
-0.03262631967663765,
0.04529916122555733,
-0.09698519855737686,
0.03298477455973625,
-0.10906574875116348,
-0.022121058776974678,
-0.07863457500934601,
0.05038856714963913,
-0.03678201884031296,
-0.07253073155879974,
-0.04535432532429695,
0.024469682946801186,
0.058498598635196686,
-0.007506924215704203,
0.13079454004764557,
0.037502311170101166,
0.1487056463956833,
0.14970152080059052,
0.06433167308568954,
-0.08174005895853043,
-0.0913252905011177,
-0.023997897282242775,
-0.008367595262825489,
0.05552287399768829,
-0.1339290738105774,
0.025475962087512016,
0.14876805245876312,
0.009036448784172535,
0.1429215669631958,
0.0885363295674324,
-0.024754323065280914,
-0.01073932833969593,
0.05416949465870857,
-0.1737722009420395,
-0.09068270772695541,
-0.022626888006925583,
-0.06832322478294373,
-0.11296725273132324,
0.05743652954697609,
0.10428833216428757,
-0.08115601539611816,
0.002589004347100854,
-0.0032136866357177496,
0.013669951818883419,
-0.057341303676366806,
0.1853988766670227,
0.069394551217556,
0.04892005771398544,
-0.1013467088341713,
0.08452849835157394,
0.04651455581188202,
-0.058266013860702515,
0.004656137898564339,
0.0800386294722557,
-0.0980244129896164,
-0.053028371185064316,
0.05821079760789871,
0.17423874139785767,
-0.0555633120238781,
-0.05913792923092842,
-0.1475023776292801,
-0.13321906328201294,
0.07928811013698578,
0.1311376690864563,
0.11735476553440094,
0.007435607723891735,
-0.07321388274431229,
0.0015353669878095388,
-0.11065926402807236,
0.10537150502204895,
0.0464034266769886,
0.054856207221746445,
-0.15510469675064087,
0.13737551867961884,
0.015811048448085785,
0.06363701820373535,
-0.025411484763026237,
0.022845277562737465,
-0.08434955775737762,
0.008706839755177498,
-0.10348886996507645,
-0.0216924250125885,
-0.016252512112259865,
0.009890284389257431,
-0.0053891371935606,
-0.048445265740156174,
-0.057025883346796036,
0.024999424815177917,
-0.10565505921840668,
-0.025227516889572144,
0.03321393206715584,
0.05191047862172127,
-0.11662878841161728,
-0.04512174427509308,
0.01624988205730915,
-0.06121593341231346,
0.08542851358652115,
0.05370628833770752,
0.01167554035782814,
0.0522066168487072,
-0.12207209318876266,
0.015587622299790382,
0.0742800235748291,
0.03451160714030266,
0.060766711831092834,
-0.09618820250034332,
-0.004330980591475964,
0.000620808859821409,
0.029649745672941208,
0.022910641506314278,
0.0777667984366417,
-0.13724935054779053,
-0.002710487926378846,
-0.02783370390534401,
-0.06454020738601685,
-0.06714898347854614,
0.020097440108656883,
0.08303855359554291,
0.03637102618813515,
0.20598363876342773,
-0.07561089843511581,
0.046382516622543335,
-0.2131723165512085,
0.007946843281388283,
-0.00018174469005316496,
-0.11797532439231873,
-0.09552253782749176,
-0.05074607580900192,
0.05392405018210411,
-0.06373390555381775,
0.15394343435764313,
0.04863494262099266,
0.014815868809819221,
0.02855541929602623,
-0.017866145819425583,
0.0212584026157856,
0.007505857385694981,
0.1868983954191208,
0.03341247886419296,
-0.030030297115445137,
0.07202771306037903,
0.038279302418231964,
0.11288387328386307,
0.11314141005277634,
0.19690823554992676,
0.13149374723434448,
-0.0017040353268384933,
0.09504646062850952,
0.04710931330919266,
-0.060035157948732376,
-0.16356506943702698,
0.05060194432735443,
-0.03684914484620094,
0.11947114020586014,
-0.02734866924583912,
0.1927868127822876,
0.06847987323999405,
-0.16221268475055695,
0.046570226550102234,
-0.04790783300995827,
-0.08729487657546997,
-0.11747626960277557,
-0.06214311346411705,
-0.07060721516609192,
-0.14227285981178284,
-0.004479394759982824,
-0.11955714225769043,
-0.0012408922193571925,
0.11118984967470169,
0.01135226245969534,
-0.034017469733953476,
0.14965499937534332,
0.004561042878776789,
0.015436336398124695,
0.06754950433969498,
0.006787540391087532,
-0.03938409313559532,
-0.10845154523849487,
-0.05584019422531128,
-0.02278803661465645,
-0.009399445727467537,
0.0328291691839695,
-0.06014551594853401,
-0.03446387127041817,
0.03999267518520355,
-0.022419964894652367,
-0.091934934258461,
0.004938766360282898,
0.008898281492292881,
0.05622522905468941,
0.05688990280032158,
0.007466019131243229,
0.018164541572332382,
-0.008829513564705849,
0.20546086132526398,
-0.07387928664684296,
-0.06927040219306946,
-0.09953052550554276,
0.2376701980829239,
0.03024473413825035,
-0.020175063982605934,
0.030588623136281967,
-0.06754070520401001,
-0.019916238263249397,
0.24630585312843323,
0.22834931313991547,
-0.08888491988182068,
-0.0023557317908853292,
0.022025488317012787,
-0.012935125268995762,
-0.015982933342456818,
0.10426705330610275,
0.14598897099494934,
0.06348761916160583,
-0.0856001153588295,
-0.030391816049814224,
-0.061101991683244705,
-0.0055675022304058075,
-0.0447319820523262,
0.0732647031545639,
0.0502418652176857,
0.009062495082616806,
-0.03542236611247063,
0.04821065813302994,
-0.07359969615936279,
-0.09477446228265762,
0.037203386425971985,
-0.20104920864105225,
-0.1598745882511139,
-0.01462660450488329,
0.09054913371801376,
0.0001615012442925945,
0.0589400939643383,
-0.030530771240592003,
0.004999163560569286,
0.08607053011655807,
-0.024409007281064987,
-0.09203089773654938,
-0.0751798152923584,
0.09285302460193634,
-0.1202358677983284,
0.22683995962142944,
-0.049215227365493774,
0.07103195041418076,
0.12016253918409348,
0.053361181169748306,
-0.06377849727869034,
0.06605267524719238,
0.04349633678793907,
-0.03749271482229233,
0.012211945839226246,
0.07883132994174957,
-0.02810114249587059,
0.0679730772972107,
0.042277514934539795,
-0.1358688473701477,
0.0054893274791538715,
-0.043648626655340195,
-0.06824120879173279,
-0.039956267923116684,
-0.03296131268143654,
-0.06381493806838989,
0.1260761022567749,
0.214870885014534,
-0.02928794175386429,
-0.012312759645283222,
-0.08210945874452591,
0.007474972400814295,
0.048175226897001266,
0.016146136447787285,
-0.04649002104997635,
-0.21020174026489258,
0.016797954216599464,
0.041385382413864136,
-0.022210748866200447,
-0.2261059582233429,
-0.09453583508729935,
0.006697536446154118,
-0.07942066341638565,
-0.08833836019039154,
0.06651146709918976,
0.08081953972578049,
0.043811965733766556,
-0.05939081683754921,
-0.029742924496531487,
-0.0846235603094101,
0.13799507915973663,
-0.1453099101781845,
-0.08864983171224594
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-rte
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7528
- Accuracy: 0.6426
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7105 | 1.0 | 623 | 0.6887 | 0.5740 |
| 0.6714 | 2.0 | 1246 | 0.6742 | 0.6209 |
| 0.509 | 3.0 | 1869 | 0.7528 | 0.6426 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
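The hyperparameters listed above map directly onto `TrainingArguments`. The sketch below is a hedged illustration of that mapping: the values are taken from this card, while the output directory name is an assumption.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
# Only the hyperparameter values come from this card; the output directory name is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="fnet-large-finetuned-rte",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```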
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "fnet-large-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE RTE", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.6425992779783394, "name": "Accuracy"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-rte
========================
This model is a fine-tuned version of google/fnet-large on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
* Loss: 0.7528
* Accuracy: 0.6426
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
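A minimal inference sketch follows, assuming the standard Transformers sequence-classification API; the premise/hypothesis pair is invented, and only the model id comes from this card.

```python
# Minimal sketch: entailment inference with this checkpoint.
# The premise/hypothesis pair is invented; only the model id comes from this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gchhablani/fnet-large-finetuned-rte"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

premise = "A man is playing a guitar on stage."
hypothesis = "A man is performing music."

# RTE is a sentence-pair task, so premise and hypothesis are encoded together.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    pred = model(**inputs).logits.argmax(dim=-1).item()

# The label names (e.g. entailment / not_entailment) depend on how the head was exported.
print(model.config.id2label[pred])
```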
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09392090886831284,
0.086028553545475,
-0.002810414880514145,
0.12563712894916534,
0.15835294127464294,
0.02616388164460659,
0.12096844613552094,
0.12440948188304901,
-0.09253979474306107,
0.01095916610211134,
0.11961711943149567,
0.16139951348304749,
0.017056036740541458,
0.12809212505817413,
-0.055080391466617584,
-0.25876080989837646,
-0.008076829835772514,
0.051302410662174225,
-0.07248687744140625,
0.13545523583889008,
0.10696391016244888,
-0.10981715470552444,
0.08594673126935959,
0.012483679689466953,
-0.188598170876503,
0.0053088548593223095,
0.006561026908457279,
-0.06197373941540718,
0.14454741775989532,
0.02627757005393505,
0.11637239903211594,
-0.004603952635079622,
0.07419975847005844,
-0.1947721242904663,
0.010915560647845268,
0.06143670529127121,
-0.0008093014475889504,
0.0922541469335556,
0.04200132563710213,
0.016881262883543968,
0.12951737642288208,
-0.07753071933984756,
0.047175586223602295,
0.024195468053221703,
-0.10057501494884491,
-0.2488769292831421,
-0.06924816966056824,
0.04293599724769592,
0.0703204795718193,
0.114735446870327,
-0.0025232243351638317,
0.12956556677818298,
-0.07075148820877075,
0.09971623867750168,
0.2038886547088623,
-0.28335192799568176,
-0.06467258930206299,
0.053865209221839905,
0.00007339009607676417,
0.05563786253333092,
-0.09489043802022934,
-0.03489343822002411,
0.0464896559715271,
0.05855744704604149,
0.13576450943946838,
-0.024170638993382454,
-0.11294703930616379,
-0.00035783895873464644,
-0.14739225804805756,
-0.03345825895667076,
0.18343965709209442,
0.04216032102704048,
-0.031146908178925514,
-0.0553949810564518,
-0.07025839388370514,
-0.13019998371601105,
-0.03334832936525345,
-0.016549253836274147,
0.042229484766721725,
-0.017025606706738472,
-0.02913014031946659,
-0.02009456790983677,
-0.10942849516868591,
-0.06127852201461792,
-0.07925374060869217,
0.12757611274719238,
0.03740452602505684,
0.026067685335874557,
-0.036684900522232056,
0.1074313297867775,
-0.008039964362978935,
-0.12542767822742462,
0.02348942495882511,
0.014696755446493626,
0.02391907386481762,
-0.029253896325826645,
-0.05762016028165817,
-0.07737185806035995,
0.013022209517657757,
0.11649444699287415,
-0.06627753376960754,
0.045372430235147476,
0.03719029203057289,
0.052638933062553406,
-0.08948004990816116,
0.17119163274765015,
-0.03414352610707283,
-0.020914733409881592,
0.017712917178869247,
0.04670121893286705,
0.026985853910446167,
-0.018883170560002327,
-0.12442363798618317,
0.01235498022288084,
0.09396601468324661,
0.002359611913561821,
-0.05978254973888397,
0.08112926036119461,
-0.03828165680170059,
-0.03376328945159912,
-0.004834178369492292,
-0.09749481827020645,
0.019099729135632515,
0.006322276312857866,
-0.08288158476352692,
-0.019768934696912766,
0.03858035430312157,
0.014716193079948425,
-0.02650129422545433,
0.09978698194026947,
-0.08106034249067307,
0.02185039594769478,
-0.08661425113677979,
-0.10680398344993591,
0.017903147265315056,
-0.0847211480140686,
0.01903758943080902,
-0.10459426045417786,
-0.19062873721122742,
-0.0070509458892047405,
0.06475710868835449,
-0.01869357004761696,
-0.06680820137262344,
-0.0446598045527935,
-0.07923983782529831,
0.009182565845549107,
-0.014331415295600891,
0.1251339614391327,
-0.06613124161958694,
0.08929501473903656,
0.029013488441705704,
0.053821150213479996,
-0.056043438613414764,
0.053580932319164276,
-0.09583668410778046,
0.009967178106307983,
-0.16321894526481628,
0.04253964126110077,
-0.05672885477542877,
0.05521031841635704,
-0.07788251340389252,
-0.10970792174339294,
0.015046020038425922,
-0.008545617572963238,
0.05209499970078468,
0.0823349803686142,
-0.17201119661331177,
-0.07378672808408737,
0.1421811878681183,
-0.08395789563655853,
-0.11804282665252686,
0.12395995855331421,
-0.05760178342461586,
0.040096431970596313,
0.05442522466182709,
0.1759394258260727,
0.09057454019784927,
-0.06562479585409164,
0.006790442857891321,
0.033411525189876556,
0.04206949099898338,
-0.06806401163339615,
0.07068393379449844,
0.0191428791731596,
0.017440875992178917,
0.03589886054396629,
-0.031538426876068115,
0.06567616760730743,
-0.07883037626743317,
-0.09648585319519043,
-0.04541643336415291,
-0.0845101922750473,
0.03634423762559891,
0.07191202789545059,
0.07480383664369583,
-0.08777596801519394,
-0.07665324211120605,
0.039373867213726044,
0.08542460203170776,
-0.06039068102836609,
0.028123341500759125,
-0.05340079590678215,
0.07857871055603027,
-0.022135380655527115,
-0.012152368202805519,
-0.17895615100860596,
-0.049802783876657486,
0.01269685197621584,
-0.017068060114979744,
0.011587386019527912,
0.03449726849794388,
0.05857099965214729,
0.062182098627090454,
-0.052427344024181366,
-0.006858838256448507,
-0.036305077373981476,
-0.002348072361201048,
-0.11779595166444778,
-0.194713294506073,
-0.033400148153305054,
-0.021897103637456894,
0.16001500189304352,
-0.20776410400867462,
0.05139097571372986,
-0.009029606357216835,
0.08531299233436584,
0.003109408775344491,
-0.004246857482939959,
-0.0448564738035202,
0.05959697812795639,
-0.04208480939269066,
-0.048716846853494644,
0.08012443780899048,
0.02152360789477825,
-0.08858314156532288,
-0.049554068595170975,
-0.09015779942274094,
0.17343412339687347,
0.1291024088859558,
-0.1208619773387909,
-0.06354694813489914,
-0.005686518736183643,
-0.07318001985549927,
-0.032940808683633804,
-0.04457523673772812,
0.025874445214867592,
0.19627976417541504,
-0.00956712942570448,
0.1556379199028015,
-0.073637954890728,
-0.04745488986372948,
0.024473601952195168,
-0.031531598418951035,
0.015794798731803894,
0.1174120306968689,
0.1146426871418953,
-0.0827367752790451,
0.15341469645500183,
0.14470958709716797,
-0.09353531897068024,
0.14427484571933746,
-0.046857599169015884,
-0.05337967351078987,
-0.01239851489663124,
-0.025734765455126762,
-0.01986040733754635,
0.10185948759317398,
-0.1591099053621292,
0.010827534832060337,
0.03590237349271774,
0.02313799038529396,
0.03406902775168419,
-0.21511520445346832,
-0.034339725971221924,
0.03387860581278801,
-0.0499500147998333,
-0.002659587422385812,
-0.00680513633415103,
0.01076573971658945,
0.10498564690351486,
-0.0029810734558850527,
-0.10533218830823898,
0.05257336422801018,
-0.0062987240962684155,
-0.0899871438741684,
0.21725404262542725,
-0.08747546374797821,
-0.18725232779979706,
-0.11880498379468918,
-0.06220771372318268,
-0.07559692859649658,
-0.004758915398269892,
0.05759590119123459,
-0.08066269010305405,
-0.03158093988895416,
-0.0806441456079483,
0.01446909736841917,
-0.002284287940710783,
0.026104530319571495,
0.01951073668897152,
0.0022082950454205275,
0.08154694736003876,
-0.11645926535129547,
-0.015786167234182358,
-0.06321795284748077,
-0.04211984947323799,
0.04032707214355469,
0.04146743565797806,
0.0988055020570755,
0.15235096216201782,
-0.019287779927253723,
0.02282365784049034,
-0.02135869860649109,
0.24310164153575897,
-0.06365299969911575,
-0.018696533516049385,
0.1559664011001587,
-0.009423095732927322,
0.0503767766058445,
0.11493933945894241,
0.07142814248800278,
-0.07500948756933212,
0.006025242153555155,
0.03687376528978348,
-0.037410885095596313,
-0.2224632054567337,
-0.06534454971551895,
-0.05588307976722717,
0.020130153745412827,
0.07723241299390793,
0.03739004582166672,
0.04191461205482483,
0.059889692813158035,
0.04328415170311928,
0.06834448873996735,
-0.02811642736196518,
0.0510883592069149,
0.15787187218666077,
0.03660351783037186,
0.12926755845546722,
-0.03699519485235214,
-0.0581619068980217,
0.044791772961616516,
-0.015560021623969078,
0.2022491693496704,
-0.011031819507479668,
0.10647973418235779,
0.05714922025799751,
0.1732620894908905,
-0.01044081524014473,
0.07149824500083923,
-0.01537350844591856,
-0.03176766633987427,
-0.01585426926612854,
-0.03921106085181236,
-0.030264299362897873,
0.018982039764523506,
-0.0774182453751564,
0.060774918645620346,
-0.1135101318359375,
0.01231792476028204,
0.05960399657487869,
0.24015723168849945,
0.03204027935862541,
-0.3194168210029602,
-0.09705868363380432,
-0.004413720685988665,
-0.03755950182676315,
-0.021074343472719193,
0.033035874366760254,
0.0998033806681633,
-0.10339818149805069,
0.04004545137286186,
-0.07676628977060318,
0.09240566939115524,
-0.04698125272989273,
0.054886672645807266,
0.09951405227184296,
0.09603190422058105,
0.020772648975253105,
0.09640488028526306,
-0.2850910425186157,
0.25600531697273254,
-0.0014360295608639717,
0.05562611296772957,
-0.07749780267477036,
0.012023663148283958,
0.038247883319854736,
0.07425936311483383,
0.07044059038162231,
-0.010851399041712284,
-0.02008097432553768,
-0.1768847107887268,
-0.06550540030002594,
0.03150675445795059,
0.06631500273942947,
-0.02683223783969879,
0.07452084869146347,
-0.02882814034819603,
0.01022765226662159,
0.07610820233821869,
0.013070657849311829,
-0.05542498826980591,
-0.10068942606449127,
0.0006295169587247074,
0.052223872393369675,
-0.06668298691511154,
-0.06467423588037491,
-0.11798346787691116,
-0.1204783171415329,
0.1616269052028656,
-0.020681016147136688,
-0.052140481770038605,
-0.11009234935045242,
0.08729247003793716,
0.058916110545396805,
-0.08323338627815247,
0.04165558144450188,
-0.002712355460971594,
0.0844990462064743,
0.02244596555829048,
-0.08383654803037643,
0.09725132584571838,
-0.0815984308719635,
-0.15663672983646393,
-0.05686290189623833,
0.1098857969045639,
0.03203533589839935,
0.0587601438164711,
-0.010134214535355568,
0.0003890396619681269,
-0.04276092350482941,
-0.0972023457288742,
0.012183141894638538,
-0.009019515477120876,
0.07961194217205048,
0.023807894438505173,
-0.05628948658704758,
0.01429247297346592,
-0.06137530878186226,
-0.025577371940016747,
0.2017984837293625,
0.2101432830095291,
-0.10181240737438202,
0.019432980567216873,
0.02815667726099491,
-0.06017622724175453,
-0.19818763434886932,
0.025082634761929512,
0.06268273293972015,
-0.007134377025067806,
0.029622653499245644,
-0.17602014541625977,
0.1417960673570633,
0.09607643634080887,
-0.019160397350788116,
0.11625179648399353,
-0.3190588355064392,
-0.11957462131977081,
0.13578015565872192,
0.13038641214370728,
0.1082649901509285,
-0.1275387704372406,
-0.017696116119623184,
-0.017272336408495903,
-0.12869010865688324,
0.13378626108169556,
-0.11559057980775833,
0.11307959258556366,
-0.033666882663965225,
0.08151817321777344,
0.0013376958668231964,
-0.05681246146559715,
0.11611095815896988,
0.02491697669029236,
0.09116630256175995,
-0.062243860214948654,
-0.00852359551936388,
0.034516509622335434,
-0.03262631967663765,
0.04529916122555733,
-0.09698519855737686,
0.03298477455973625,
-0.10906574875116348,
-0.022121058776974678,
-0.07863457500934601,
0.05038856714963913,
-0.03678201884031296,
-0.07253073155879974,
-0.04535432532429695,
0.024469682946801186,
0.058498598635196686,
-0.007506924215704203,
0.13079454004764557,
0.037502311170101166,
0.1487056463956833,
0.14970152080059052,
0.06433167308568954,
-0.08174005895853043,
-0.0913252905011177,
-0.023997897282242775,
-0.008367595262825489,
0.05552287399768829,
-0.1339290738105774,
0.025475962087512016,
0.14876805245876312,
0.009036448784172535,
0.1429215669631958,
0.0885363295674324,
-0.024754323065280914,
-0.01073932833969593,
0.05416949465870857,
-0.1737722009420395,
-0.09068270772695541,
-0.022626888006925583,
-0.06832322478294373,
-0.11296725273132324,
0.05743652954697609,
0.10428833216428757,
-0.08115601539611816,
0.002589004347100854,
-0.0032136866357177496,
0.013669951818883419,
-0.057341303676366806,
0.1853988766670227,
0.069394551217556,
0.04892005771398544,
-0.1013467088341713,
0.08452849835157394,
0.04651455581188202,
-0.058266013860702515,
0.004656137898564339,
0.0800386294722557,
-0.0980244129896164,
-0.053028371185064316,
0.05821079760789871,
0.17423874139785767,
-0.0555633120238781,
-0.05913792923092842,
-0.1475023776292801,
-0.13321906328201294,
0.07928811013698578,
0.1311376690864563,
0.11735476553440094,
0.007435607723891735,
-0.07321388274431229,
0.0015353669878095388,
-0.11065926402807236,
0.10537150502204895,
0.0464034266769886,
0.054856207221746445,
-0.15510469675064087,
0.13737551867961884,
0.015811048448085785,
0.06363701820373535,
-0.025411484763026237,
0.022845277562737465,
-0.08434955775737762,
0.008706839755177498,
-0.10348886996507645,
-0.0216924250125885,
-0.016252512112259865,
0.009890284389257431,
-0.0053891371935606,
-0.048445265740156174,
-0.057025883346796036,
0.024999424815177917,
-0.10565505921840668,
-0.025227516889572144,
0.03321393206715584,
0.05191047862172127,
-0.11662878841161728,
-0.04512174427509308,
0.01624988205730915,
-0.06121593341231346,
0.08542851358652115,
0.05370628833770752,
0.01167554035782814,
0.0522066168487072,
-0.12207209318876266,
0.015587622299790382,
0.0742800235748291,
0.03451160714030266,
0.060766711831092834,
-0.09618820250034332,
-0.004330980591475964,
0.000620808859821409,
0.029649745672941208,
0.022910641506314278,
0.0777667984366417,
-0.13724935054779053,
-0.002710487926378846,
-0.02783370390534401,
-0.06454020738601685,
-0.06714898347854614,
0.020097440108656883,
0.08303855359554291,
0.03637102618813515,
0.20598363876342773,
-0.07561089843511581,
0.046382516622543335,
-0.2131723165512085,
0.007946843281388283,
-0.00018174469005316496,
-0.11797532439231873,
-0.09552253782749176,
-0.05074607580900192,
0.05392405018210411,
-0.06373390555381775,
0.15394343435764313,
0.04863494262099266,
0.014815868809819221,
0.02855541929602623,
-0.017866145819425583,
0.0212584026157856,
0.007505857385694981,
0.1868983954191208,
0.03341247886419296,
-0.030030297115445137,
0.07202771306037903,
0.038279302418231964,
0.11288387328386307,
0.11314141005277634,
0.19690823554992676,
0.13149374723434448,
-0.0017040353268384933,
0.09504646062850952,
0.04710931330919266,
-0.060035157948732376,
-0.16356506943702698,
0.05060194432735443,
-0.03684914484620094,
0.11947114020586014,
-0.02734866924583912,
0.1927868127822876,
0.06847987323999405,
-0.16221268475055695,
0.046570226550102234,
-0.04790783300995827,
-0.08729487657546997,
-0.11747626960277557,
-0.06214311346411705,
-0.07060721516609192,
-0.14227285981178284,
-0.004479394759982824,
-0.11955714225769043,
-0.0012408922193571925,
0.11118984967470169,
0.01135226245969534,
-0.034017469733953476,
0.14965499937534332,
0.004561042878776789,
0.015436336398124695,
0.06754950433969498,
0.006787540391087532,
-0.03938409313559532,
-0.10845154523849487,
-0.05584019422531128,
-0.02278803661465645,
-0.009399445727467537,
0.0328291691839695,
-0.06014551594853401,
-0.03446387127041817,
0.03999267518520355,
-0.022419964894652367,
-0.091934934258461,
0.004938766360282898,
0.008898281492292881,
0.05622522905468941,
0.05688990280032158,
0.007466019131243229,
0.018164541572332382,
-0.008829513564705849,
0.20546086132526398,
-0.07387928664684296,
-0.06927040219306946,
-0.09953052550554276,
0.2376701980829239,
0.03024473413825035,
-0.020175063982605934,
0.030588623136281967,
-0.06754070520401001,
-0.019916238263249397,
0.24630585312843323,
0.22834931313991547,
-0.08888491988182068,
-0.0023557317908853292,
0.022025488317012787,
-0.012935125268995762,
-0.015982933342456818,
0.10426705330610275,
0.14598897099494934,
0.06348761916160583,
-0.0856001153588295,
-0.030391816049814224,
-0.061101991683244705,
-0.0055675022304058075,
-0.0447319820523262,
0.0732647031545639,
0.0502418652176857,
0.009062495082616806,
-0.03542236611247063,
0.04821065813302994,
-0.07359969615936279,
-0.09477446228265762,
0.037203386425971985,
-0.20104920864105225,
-0.1598745882511139,
-0.01462660450488329,
0.09054913371801376,
0.0001615012442925945,
0.0589400939643383,
-0.030530771240592003,
0.004999163560569286,
0.08607053011655807,
-0.024409007281064987,
-0.09203089773654938,
-0.0751798152923584,
0.09285302460193634,
-0.1202358677983284,
0.22683995962142944,
-0.049215227365493774,
0.07103195041418076,
0.12016253918409348,
0.053361181169748306,
-0.06377849727869034,
0.06605267524719238,
0.04349633678793907,
-0.03749271482229233,
0.012211945839226246,
0.07883132994174957,
-0.02810114249587059,
0.0679730772972107,
0.042277514934539795,
-0.1358688473701477,
0.0054893274791538715,
-0.043648626655340195,
-0.06824120879173279,
-0.039956267923116684,
-0.03296131268143654,
-0.06381493806838989,
0.1260761022567749,
0.214870885014534,
-0.02928794175386429,
-0.012312759645283222,
-0.08210945874452591,
0.007474972400814295,
0.048175226897001266,
0.016146136447787285,
-0.04649002104997635,
-0.21020174026489258,
0.016797954216599464,
0.041385382413864136,
-0.022210748866200447,
-0.2261059582233429,
-0.09453583508729935,
0.006697536446154118,
-0.07942066341638565,
-0.08833836019039154,
0.06651146709918976,
0.08081953972578049,
0.043811965733766556,
-0.05939081683754921,
-0.029742924496531487,
-0.0846235603094101,
0.13799507915973663,
-0.1453099101781845,
-0.08864983171224594
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-sst2
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5240
- Accuracy: 0.9048
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.394 | 1.0 | 16838 | 0.3896 | 0.8968 |
| 0.2076 | 2.0 | 33676 | 0.5100 | 0.8956 |
| 0.1148 | 3.0 | 50514 | 0.5240 | 0.9048 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
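A minimal usage sketch, assuming the standard Transformers pipeline API; the example sentence is invented, and only the model id comes from this card.

```python
# Minimal sketch: single-sentence sentiment inference with this checkpoint.
# The example sentence is illustrative; only the model id comes from this card.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="gchhablani/fnet-large-finetuned-sst2",
)

print(classifier("A charming and often affecting journey."))
# -> a list with one dict containing the predicted label and its score
```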
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "fnet-large-finetuned-sst2", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE SST2", "type": "glue", "args": "sst2"}, "metrics": [{"type": "accuracy", "value": 0.9048165137614679, "name": "Accuracy"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-sst2
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-sst2
=========================
This model is a fine-tuned version of google/fnet-large on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5240
* Accuracy: 0.9048
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
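To connect the reported accuracy to something runnable, the hedged sketch below recomputes it on the GLUE SST-2 validation split with the Datasets version listed above; the unbatched loop is a simplification for readability, not the original evaluation script.

```python
# Minimal sketch: recomputing validation accuracy for this checkpoint.
# The unbatched loop is a simplification for readability, not the original evaluation script.
import torch
from datasets import load_dataset, load_metric
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gchhablani/fnet-large-finetuned-sst2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

dataset = load_dataset("glue", "sst2", split="validation")
metric = load_metric("glue", "sst2")

for example in dataset:
    inputs = tokenizer(example["sentence"], return_tensors="pt", truncation=True)
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    metric.add(prediction=pred, reference=example["label"])

print(metric.compute())  # expected to be close to the accuracy reported above
```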
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09392090886831284,
0.086028553545475,
-0.002810414880514145,
0.12563712894916534,
0.15835294127464294,
0.02616388164460659,
0.12096844613552094,
0.12440948188304901,
-0.09253979474306107,
0.01095916610211134,
0.11961711943149567,
0.16139951348304749,
0.017056036740541458,
0.12809212505817413,
-0.055080391466617584,
-0.25876080989837646,
-0.008076829835772514,
0.051302410662174225,
-0.07248687744140625,
0.13545523583889008,
0.10696391016244888,
-0.10981715470552444,
0.08594673126935959,
0.012483679689466953,
-0.188598170876503,
0.0053088548593223095,
0.006561026908457279,
-0.06197373941540718,
0.14454741775989532,
0.02627757005393505,
0.11637239903211594,
-0.004603952635079622,
0.07419975847005844,
-0.1947721242904663,
0.010915560647845268,
0.06143670529127121,
-0.0008093014475889504,
0.0922541469335556,
0.04200132563710213,
0.016881262883543968,
0.12951737642288208,
-0.07753071933984756,
0.047175586223602295,
0.024195468053221703,
-0.10057501494884491,
-0.2488769292831421,
-0.06924816966056824,
0.04293599724769592,
0.0703204795718193,
0.114735446870327,
-0.0025232243351638317,
0.12956556677818298,
-0.07075148820877075,
0.09971623867750168,
0.2038886547088623,
-0.28335192799568176,
-0.06467258930206299,
0.053865209221839905,
0.00007339009607676417,
0.05563786253333092,
-0.09489043802022934,
-0.03489343822002411,
0.0464896559715271,
0.05855744704604149,
0.13576450943946838,
-0.024170638993382454,
-0.11294703930616379,
-0.00035783895873464644,
-0.14739225804805756,
-0.03345825895667076,
0.18343965709209442,
0.04216032102704048,
-0.031146908178925514,
-0.0553949810564518,
-0.07025839388370514,
-0.13019998371601105,
-0.03334832936525345,
-0.016549253836274147,
0.042229484766721725,
-0.017025606706738472,
-0.02913014031946659,
-0.02009456790983677,
-0.10942849516868591,
-0.06127852201461792,
-0.07925374060869217,
0.12757611274719238,
0.03740452602505684,
0.026067685335874557,
-0.036684900522232056,
0.1074313297867775,
-0.008039964362978935,
-0.12542767822742462,
0.02348942495882511,
0.014696755446493626,
0.02391907386481762,
-0.029253896325826645,
-0.05762016028165817,
-0.07737185806035995,
0.013022209517657757,
0.11649444699287415,
-0.06627753376960754,
0.045372430235147476,
0.03719029203057289,
0.052638933062553406,
-0.08948004990816116,
0.17119163274765015,
-0.03414352610707283,
-0.020914733409881592,
0.017712917178869247,
0.04670121893286705,
0.026985853910446167,
-0.018883170560002327,
-0.12442363798618317,
0.01235498022288084,
0.09396601468324661,
0.002359611913561821,
-0.05978254973888397,
0.08112926036119461,
-0.03828165680170059,
-0.03376328945159912,
-0.004834178369492292,
-0.09749481827020645,
0.019099729135632515,
0.006322276312857866,
-0.08288158476352692,
-0.019768934696912766,
0.03858035430312157,
0.014716193079948425,
-0.02650129422545433,
0.09978698194026947,
-0.08106034249067307,
0.02185039594769478,
-0.08661425113677979,
-0.10680398344993591,
0.017903147265315056,
-0.0847211480140686,
0.01903758943080902,
-0.10459426045417786,
-0.19062873721122742,
-0.0070509458892047405,
0.06475710868835449,
-0.01869357004761696,
-0.06680820137262344,
-0.0446598045527935,
-0.07923983782529831,
0.009182565845549107,
-0.014331415295600891,
0.1251339614391327,
-0.06613124161958694,
0.08929501473903656,
0.029013488441705704,
0.053821150213479996,
-0.056043438613414764,
0.053580932319164276,
-0.09583668410778046,
0.009967178106307983,
-0.16321894526481628,
0.04253964126110077,
-0.05672885477542877,
0.05521031841635704,
-0.07788251340389252,
-0.10970792174339294,
0.015046020038425922,
-0.008545617572963238,
0.05209499970078468,
0.0823349803686142,
-0.17201119661331177,
-0.07378672808408737,
0.1421811878681183,
-0.08395789563655853,
-0.11804282665252686,
0.12395995855331421,
-0.05760178342461586,
0.040096431970596313,
0.05442522466182709,
0.1759394258260727,
0.09057454019784927,
-0.06562479585409164,
0.006790442857891321,
0.033411525189876556,
0.04206949099898338,
-0.06806401163339615,
0.07068393379449844,
0.0191428791731596,
0.017440875992178917,
0.03589886054396629,
-0.031538426876068115,
0.06567616760730743,
-0.07883037626743317,
-0.09648585319519043,
-0.04541643336415291,
-0.0845101922750473,
0.03634423762559891,
0.07191202789545059,
0.07480383664369583,
-0.08777596801519394,
-0.07665324211120605,
0.039373867213726044,
0.08542460203170776,
-0.06039068102836609,
0.028123341500759125,
-0.05340079590678215,
0.07857871055603027,
-0.022135380655527115,
-0.012152368202805519,
-0.17895615100860596,
-0.049802783876657486,
0.01269685197621584,
-0.017068060114979744,
0.011587386019527912,
0.03449726849794388,
0.05857099965214729,
0.062182098627090454,
-0.052427344024181366,
-0.006858838256448507,
-0.036305077373981476,
-0.002348072361201048,
-0.11779595166444778,
-0.194713294506073,
-0.033400148153305054,
-0.021897103637456894,
0.16001500189304352,
-0.20776410400867462,
0.05139097571372986,
-0.009029606357216835,
0.08531299233436584,
0.003109408775344491,
-0.004246857482939959,
-0.0448564738035202,
0.05959697812795639,
-0.04208480939269066,
-0.048716846853494644,
0.08012443780899048,
0.02152360789477825,
-0.08858314156532288,
-0.049554068595170975,
-0.09015779942274094,
0.17343412339687347,
0.1291024088859558,
-0.1208619773387909,
-0.06354694813489914,
-0.005686518736183643,
-0.07318001985549927,
-0.032940808683633804,
-0.04457523673772812,
0.025874445214867592,
0.19627976417541504,
-0.00956712942570448,
0.1556379199028015,
-0.073637954890728,
-0.04745488986372948,
0.024473601952195168,
-0.031531598418951035,
0.015794798731803894,
0.1174120306968689,
0.1146426871418953,
-0.0827367752790451,
0.15341469645500183,
0.14470958709716797,
-0.09353531897068024,
0.14427484571933746,
-0.046857599169015884,
-0.05337967351078987,
-0.01239851489663124,
-0.025734765455126762,
-0.01986040733754635,
0.10185948759317398,
-0.1591099053621292,
0.010827534832060337,
0.03590237349271774,
0.02313799038529396,
0.03406902775168419,
-0.21511520445346832,
-0.034339725971221924,
0.03387860581278801,
-0.0499500147998333,
-0.002659587422385812,
-0.00680513633415103,
0.01076573971658945,
0.10498564690351486,
-0.0029810734558850527,
-0.10533218830823898,
0.05257336422801018,
-0.0062987240962684155,
-0.0899871438741684,
0.21725404262542725,
-0.08747546374797821,
-0.18725232779979706,
-0.11880498379468918,
-0.06220771372318268,
-0.07559692859649658,
-0.004758915398269892,
0.05759590119123459,
-0.08066269010305405,
-0.03158093988895416,
-0.0806441456079483,
0.01446909736841917,
-0.002284287940710783,
0.026104530319571495,
0.01951073668897152,
0.0022082950454205275,
0.08154694736003876,
-0.11645926535129547,
-0.015786167234182358,
-0.06321795284748077,
-0.04211984947323799,
0.04032707214355469,
0.04146743565797806,
0.0988055020570755,
0.15235096216201782,
-0.019287779927253723,
0.02282365784049034,
-0.02135869860649109,
0.24310164153575897,
-0.06365299969911575,
-0.018696533516049385,
0.1559664011001587,
-0.009423095732927322,
0.0503767766058445,
0.11493933945894241,
0.07142814248800278,
-0.07500948756933212,
0.006025242153555155,
0.03687376528978348,
-0.037410885095596313,
-0.2224632054567337,
-0.06534454971551895,
-0.05588307976722717,
0.020130153745412827,
0.07723241299390793,
0.03739004582166672,
0.04191461205482483,
0.059889692813158035,
0.04328415170311928,
0.06834448873996735,
-0.02811642736196518,
0.0510883592069149,
0.15787187218666077,
0.03660351783037186,
0.12926755845546722,
-0.03699519485235214,
-0.0581619068980217,
0.044791772961616516,
-0.015560021623969078,
0.2022491693496704,
-0.011031819507479668,
0.10647973418235779,
0.05714922025799751,
0.1732620894908905,
-0.01044081524014473,
0.07149824500083923,
-0.01537350844591856,
-0.03176766633987427,
-0.01585426926612854,
-0.03921106085181236,
-0.030264299362897873,
0.018982039764523506,
-0.0774182453751564,
0.060774918645620346,
-0.1135101318359375,
0.01231792476028204,
0.05960399657487869,
0.24015723168849945,
0.03204027935862541,
-0.3194168210029602,
-0.09705868363380432,
-0.004413720685988665,
-0.03755950182676315,
-0.021074343472719193,
0.033035874366760254,
0.0998033806681633,
-0.10339818149805069,
0.04004545137286186,
-0.07676628977060318,
0.09240566939115524,
-0.04698125272989273,
0.054886672645807266,
0.09951405227184296,
0.09603190422058105,
0.020772648975253105,
0.09640488028526306,
-0.2850910425186157,
0.25600531697273254,
-0.0014360295608639717,
0.05562611296772957,
-0.07749780267477036,
0.012023663148283958,
0.038247883319854736,
0.07425936311483383,
0.07044059038162231,
-0.010851399041712284,
-0.02008097432553768,
-0.1768847107887268,
-0.06550540030002594,
0.03150675445795059,
0.06631500273942947,
-0.02683223783969879,
0.07452084869146347,
-0.02882814034819603,
0.01022765226662159,
0.07610820233821869,
0.013070657849311829,
-0.05542498826980591,
-0.10068942606449127,
0.0006295169587247074,
0.052223872393369675,
-0.06668298691511154,
-0.06467423588037491,
-0.11798346787691116,
-0.1204783171415329,
0.1616269052028656,
-0.020681016147136688,
-0.052140481770038605,
-0.11009234935045242,
0.08729247003793716,
0.058916110545396805,
-0.08323338627815247,
0.04165558144450188,
-0.002712355460971594,
0.0844990462064743,
0.02244596555829048,
-0.08383654803037643,
0.09725132584571838,
-0.0815984308719635,
-0.15663672983646393,
-0.05686290189623833,
0.1098857969045639,
0.03203533589839935,
0.0587601438164711,
-0.010134214535355568,
0.0003890396619681269,
-0.04276092350482941,
-0.0972023457288742,
0.012183141894638538,
-0.009019515477120876,
0.07961194217205048,
0.023807894438505173,
-0.05628948658704758,
0.01429247297346592,
-0.06137530878186226,
-0.025577371940016747,
0.2017984837293625,
0.2101432830095291,
-0.10181240737438202,
0.019432980567216873,
0.02815667726099491,
-0.06017622724175453,
-0.19818763434886932,
0.025082634761929512,
0.06268273293972015,
-0.007134377025067806,
0.029622653499245644,
-0.17602014541625977,
0.1417960673570633,
0.09607643634080887,
-0.019160397350788116,
0.11625179648399353,
-0.3190588355064392,
-0.11957462131977081,
0.13578015565872192,
0.13038641214370728,
0.1082649901509285,
-0.1275387704372406,
-0.017696116119623184,
-0.017272336408495903,
-0.12869010865688324,
0.13378626108169556,
-0.11559057980775833,
0.11307959258556366,
-0.033666882663965225,
0.08151817321777344,
0.0013376958668231964,
-0.05681246146559715,
0.11611095815896988,
0.02491697669029236,
0.09116630256175995,
-0.062243860214948654,
-0.00852359551936388,
0.034516509622335434,
-0.03262631967663765,
0.04529916122555733,
-0.09698519855737686,
0.03298477455973625,
-0.10906574875116348,
-0.022121058776974678,
-0.07863457500934601,
0.05038856714963913,
-0.03678201884031296,
-0.07253073155879974,
-0.04535432532429695,
0.024469682946801186,
0.058498598635196686,
-0.007506924215704203,
0.13079454004764557,
0.037502311170101166,
0.1487056463956833,
0.14970152080059052,
0.06433167308568954,
-0.08174005895853043,
-0.0913252905011177,
-0.023997897282242775,
-0.008367595262825489,
0.05552287399768829,
-0.1339290738105774,
0.025475962087512016,
0.14876805245876312,
0.009036448784172535,
0.1429215669631958,
0.0885363295674324,
-0.024754323065280914,
-0.01073932833969593,
0.05416949465870857,
-0.1737722009420395,
-0.09068270772695541,
-0.022626888006925583,
-0.06832322478294373,
-0.11296725273132324,
0.05743652954697609,
0.10428833216428757,
-0.08115601539611816,
0.002589004347100854,
-0.0032136866357177496,
0.013669951818883419,
-0.057341303676366806,
0.1853988766670227,
0.069394551217556,
0.04892005771398544,
-0.1013467088341713,
0.08452849835157394,
0.04651455581188202,
-0.058266013860702515,
0.004656137898564339,
0.0800386294722557,
-0.0980244129896164,
-0.053028371185064316,
0.05821079760789871,
0.17423874139785767,
-0.0555633120238781,
-0.05913792923092842,
-0.1475023776292801,
-0.13321906328201294,
0.07928811013698578,
0.1311376690864563,
0.11735476553440094,
0.007435607723891735,
-0.07321388274431229,
0.0015353669878095388,
-0.11065926402807236,
0.10537150502204895,
0.0464034266769886,
0.054856207221746445,
-0.15510469675064087,
0.13737551867961884,
0.015811048448085785,
0.06363701820373535,
-0.025411484763026237,
0.022845277562737465,
-0.08434955775737762,
0.008706839755177498,
-0.10348886996507645,
-0.0216924250125885,
-0.016252512112259865,
0.009890284389257431,
-0.0053891371935606,
-0.048445265740156174,
-0.057025883346796036,
0.024999424815177917,
-0.10565505921840668,
-0.025227516889572144,
0.03321393206715584,
0.05191047862172127,
-0.11662878841161728,
-0.04512174427509308,
0.01624988205730915,
-0.06121593341231346,
0.08542851358652115,
0.05370628833770752,
0.01167554035782814,
0.0522066168487072,
-0.12207209318876266,
0.015587622299790382,
0.0742800235748291,
0.03451160714030266,
0.060766711831092834,
-0.09618820250034332,
-0.004330980591475964,
0.000620808859821409,
0.029649745672941208,
0.022910641506314278,
0.0777667984366417,
-0.13724935054779053,
-0.002710487926378846,
-0.02783370390534401,
-0.06454020738601685,
-0.06714898347854614,
0.020097440108656883,
0.08303855359554291,
0.03637102618813515,
0.20598363876342773,
-0.07561089843511581,
0.046382516622543335,
-0.2131723165512085,
0.007946843281388283,
-0.00018174469005316496,
-0.11797532439231873,
-0.09552253782749176,
-0.05074607580900192,
0.05392405018210411,
-0.06373390555381775,
0.15394343435764313,
0.04863494262099266,
0.014815868809819221,
0.02855541929602623,
-0.017866145819425583,
0.0212584026157856,
0.007505857385694981,
0.1868983954191208,
0.03341247886419296,
-0.030030297115445137,
0.07202771306037903,
0.038279302418231964,
0.11288387328386307,
0.11314141005277634,
0.19690823554992676,
0.13149374723434448,
-0.0017040353268384933,
0.09504646062850952,
0.04710931330919266,
-0.060035157948732376,
-0.16356506943702698,
0.05060194432735443,
-0.03684914484620094,
0.11947114020586014,
-0.02734866924583912,
0.1927868127822876,
0.06847987323999405,
-0.16221268475055695,
0.046570226550102234,
-0.04790783300995827,
-0.08729487657546997,
-0.11747626960277557,
-0.06214311346411705,
-0.07060721516609192,
-0.14227285981178284,
-0.004479394759982824,
-0.11955714225769043,
-0.0012408922193571925,
0.11118984967470169,
0.01135226245969534,
-0.034017469733953476,
0.14965499937534332,
0.004561042878776789,
0.015436336398124695,
0.06754950433969498,
0.006787540391087532,
-0.03938409313559532,
-0.10845154523849487,
-0.05584019422531128,
-0.02278803661465645,
-0.009399445727467537,
0.0328291691839695,
-0.06014551594853401,
-0.03446387127041817,
0.03999267518520355,
-0.022419964894652367,
-0.091934934258461,
0.004938766360282898,
0.008898281492292881,
0.05622522905468941,
0.05688990280032158,
0.007466019131243229,
0.018164541572332382,
-0.008829513564705849,
0.20546086132526398,
-0.07387928664684296,
-0.06927040219306946,
-0.09953052550554276,
0.2376701980829239,
0.03024473413825035,
-0.020175063982605934,
0.030588623136281967,
-0.06754070520401001,
-0.019916238263249397,
0.24630585312843323,
0.22834931313991547,
-0.08888491988182068,
-0.0023557317908853292,
0.022025488317012787,
-0.012935125268995762,
-0.015982933342456818,
0.10426705330610275,
0.14598897099494934,
0.06348761916160583,
-0.0856001153588295,
-0.030391816049814224,
-0.061101991683244705,
-0.0055675022304058075,
-0.0447319820523262,
0.0732647031545639,
0.0502418652176857,
0.009062495082616806,
-0.03542236611247063,
0.04821065813302994,
-0.07359969615936279,
-0.09477446228265762,
0.037203386425971985,
-0.20104920864105225,
-0.1598745882511139,
-0.01462660450488329,
0.09054913371801376,
0.0001615012442925945,
0.0589400939643383,
-0.030530771240592003,
0.004999163560569286,
0.08607053011655807,
-0.024409007281064987,
-0.09203089773654938,
-0.0751798152923584,
0.09285302460193634,
-0.1202358677983284,
0.22683995962142944,
-0.049215227365493774,
0.07103195041418076,
0.12016253918409348,
0.053361181169748306,
-0.06377849727869034,
0.06605267524719238,
0.04349633678793907,
-0.03749271482229233,
0.012211945839226246,
0.07883132994174957,
-0.02810114249587059,
0.0679730772972107,
0.042277514934539795,
-0.1358688473701477,
0.0054893274791538715,
-0.043648626655340195,
-0.06824120879173279,
-0.039956267923116684,
-0.03296131268143654,
-0.06381493806838989,
0.1260761022567749,
0.214870885014534,
-0.02928794175386429,
-0.012312759645283222,
-0.08210945874452591,
0.007474972400814295,
0.048175226897001266,
0.016146136447787285,
-0.04649002104997635,
-0.21020174026489258,
0.016797954216599464,
0.041385382413864136,
-0.022210748866200447,
-0.2261059582233429,
-0.09453583508729935,
0.006697536446154118,
-0.07942066341638565,
-0.08833836019039154,
0.06651146709918976,
0.08081953972578049,
0.043811965733766556,
-0.05939081683754921,
-0.029742924496531487,
-0.0846235603094101,
0.13799507915973663,
-0.1453099101781845,
-0.08864983171224594
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-stsb
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE STSB dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6250
- Pearson: 0.8554
- Spearmanr: 0.8533
- Combined Score: 0.8543
## Model description
More information needed
## Intended uses & limitations
More information needed
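A minimal inference sketch for the sentence-similarity regression head (the repo id is taken from the card metadata; the sentence pair is illustrative):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id from the card metadata; the example sentences are placeholders.
model_id = "gchhablani/fnet-large-finetuned-stsb"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# STS-B is a regression task: the single logit is a similarity score on a roughly 0-5 scale.
inputs = tokenizer("A man is playing a guitar.", "A person plays an instrument.", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"similarity: {score:.2f}")
```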
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
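Expressed with the `transformers` `TrainingArguments` API, these settings correspond roughly to the sketch below (the `output_dir` name is a placeholder; Adam with the listed betas/epsilon is the default optimizer):
```python
from transformers import TrainingArguments

# Hedged re-creation of the listed hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="fnet-large-finetuned-stsb",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```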
### Training results
| Training Loss | Epoch | Step | Validation Loss | Pearson | Spearmanr | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:---------:|:--------------:|
| 1.0727 | 1.0 | 1438 | 0.7718 | 0.8187 | 0.8240 | 0.8214 |
| 0.4619 | 2.0 | 2876 | 0.7704 | 0.8472 | 0.8500 | 0.8486 |
| 0.2401 | 3.0 | 4314 | 0.6250 | 0.8554 | 0.8533 | 0.8543 |
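The Pearson and Spearman columns are the standard GLUE STS-B metrics; they can be computed with the `datasets` library as in this sketch (toy values, not taken from this run):
```python
from datasets import load_metric

# Toy predictions/references only; a real evaluation uses the model outputs on the validation set.
stsb_metric = load_metric("glue", "stsb")
print(stsb_metric.compute(predictions=[2.5, 0.0, 4.8], references=[2.4, 0.2, 5.0]))
# -> {'pearson': ..., 'spearmanr': ...}
```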
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["spearmanr"], "model-index": [{"name": "fnet-large-finetuned-stsb", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE STSB", "type": "glue", "args": "stsb"}, "metrics": [{"type": "spearmanr", "value": 0.8532669137129205, "name": "Spearmanr"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-stsb
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-stsb
=========================
This model is a fine-tuned version of google/fnet-large on the GLUE STSB dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6250
* Pearson: 0.8554
* Spearmanr: 0.8533
* Combined Score: 0.8543
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09392090886831284,
0.086028553545475,
-0.002810414880514145,
0.12563712894916534,
0.15835294127464294,
0.02616388164460659,
0.12096844613552094,
0.12440948188304901,
-0.09253979474306107,
0.01095916610211134,
0.11961711943149567,
0.16139951348304749,
0.017056036740541458,
0.12809212505817413,
-0.055080391466617584,
-0.25876080989837646,
-0.008076829835772514,
0.051302410662174225,
-0.07248687744140625,
0.13545523583889008,
0.10696391016244888,
-0.10981715470552444,
0.08594673126935959,
0.012483679689466953,
-0.188598170876503,
0.0053088548593223095,
0.006561026908457279,
-0.06197373941540718,
0.14454741775989532,
0.02627757005393505,
0.11637239903211594,
-0.004603952635079622,
0.07419975847005844,
-0.1947721242904663,
0.010915560647845268,
0.06143670529127121,
-0.0008093014475889504,
0.0922541469335556,
0.04200132563710213,
0.016881262883543968,
0.12951737642288208,
-0.07753071933984756,
0.047175586223602295,
0.024195468053221703,
-0.10057501494884491,
-0.2488769292831421,
-0.06924816966056824,
0.04293599724769592,
0.0703204795718193,
0.114735446870327,
-0.0025232243351638317,
0.12956556677818298,
-0.07075148820877075,
0.09971623867750168,
0.2038886547088623,
-0.28335192799568176,
-0.06467258930206299,
0.053865209221839905,
0.00007339009607676417,
0.05563786253333092,
-0.09489043802022934,
-0.03489343822002411,
0.0464896559715271,
0.05855744704604149,
0.13576450943946838,
-0.024170638993382454,
-0.11294703930616379,
-0.00035783895873464644,
-0.14739225804805756,
-0.03345825895667076,
0.18343965709209442,
0.04216032102704048,
-0.031146908178925514,
-0.0553949810564518,
-0.07025839388370514,
-0.13019998371601105,
-0.03334832936525345,
-0.016549253836274147,
0.042229484766721725,
-0.017025606706738472,
-0.02913014031946659,
-0.02009456790983677,
-0.10942849516868591,
-0.06127852201461792,
-0.07925374060869217,
0.12757611274719238,
0.03740452602505684,
0.026067685335874557,
-0.036684900522232056,
0.1074313297867775,
-0.008039964362978935,
-0.12542767822742462,
0.02348942495882511,
0.014696755446493626,
0.02391907386481762,
-0.029253896325826645,
-0.05762016028165817,
-0.07737185806035995,
0.013022209517657757,
0.11649444699287415,
-0.06627753376960754,
0.045372430235147476,
0.03719029203057289,
0.052638933062553406,
-0.08948004990816116,
0.17119163274765015,
-0.03414352610707283,
-0.020914733409881592,
0.017712917178869247,
0.04670121893286705,
0.026985853910446167,
-0.018883170560002327,
-0.12442363798618317,
0.01235498022288084,
0.09396601468324661,
0.002359611913561821,
-0.05978254973888397,
0.08112926036119461,
-0.03828165680170059,
-0.03376328945159912,
-0.004834178369492292,
-0.09749481827020645,
0.019099729135632515,
0.006322276312857866,
-0.08288158476352692,
-0.019768934696912766,
0.03858035430312157,
0.014716193079948425,
-0.02650129422545433,
0.09978698194026947,
-0.08106034249067307,
0.02185039594769478,
-0.08661425113677979,
-0.10680398344993591,
0.017903147265315056,
-0.0847211480140686,
0.01903758943080902,
-0.10459426045417786,
-0.19062873721122742,
-0.0070509458892047405,
0.06475710868835449,
-0.01869357004761696,
-0.06680820137262344,
-0.0446598045527935,
-0.07923983782529831,
0.009182565845549107,
-0.014331415295600891,
0.1251339614391327,
-0.06613124161958694,
0.08929501473903656,
0.029013488441705704,
0.053821150213479996,
-0.056043438613414764,
0.053580932319164276,
-0.09583668410778046,
0.009967178106307983,
-0.16321894526481628,
0.04253964126110077,
-0.05672885477542877,
0.05521031841635704,
-0.07788251340389252,
-0.10970792174339294,
0.015046020038425922,
-0.008545617572963238,
0.05209499970078468,
0.0823349803686142,
-0.17201119661331177,
-0.07378672808408737,
0.1421811878681183,
-0.08395789563655853,
-0.11804282665252686,
0.12395995855331421,
-0.05760178342461586,
0.040096431970596313,
0.05442522466182709,
0.1759394258260727,
0.09057454019784927,
-0.06562479585409164,
0.006790442857891321,
0.033411525189876556,
0.04206949099898338,
-0.06806401163339615,
0.07068393379449844,
0.0191428791731596,
0.017440875992178917,
0.03589886054396629,
-0.031538426876068115,
0.06567616760730743,
-0.07883037626743317,
-0.09648585319519043,
-0.04541643336415291,
-0.0845101922750473,
0.03634423762559891,
0.07191202789545059,
0.07480383664369583,
-0.08777596801519394,
-0.07665324211120605,
0.039373867213726044,
0.08542460203170776,
-0.06039068102836609,
0.028123341500759125,
-0.05340079590678215,
0.07857871055603027,
-0.022135380655527115,
-0.012152368202805519,
-0.17895615100860596,
-0.049802783876657486,
0.01269685197621584,
-0.017068060114979744,
0.011587386019527912,
0.03449726849794388,
0.05857099965214729,
0.062182098627090454,
-0.052427344024181366,
-0.006858838256448507,
-0.036305077373981476,
-0.002348072361201048,
-0.11779595166444778,
-0.194713294506073,
-0.033400148153305054,
-0.021897103637456894,
0.16001500189304352,
-0.20776410400867462,
0.05139097571372986,
-0.009029606357216835,
0.08531299233436584,
0.003109408775344491,
-0.004246857482939959,
-0.0448564738035202,
0.05959697812795639,
-0.04208480939269066,
-0.048716846853494644,
0.08012443780899048,
0.02152360789477825,
-0.08858314156532288,
-0.049554068595170975,
-0.09015779942274094,
0.17343412339687347,
0.1291024088859558,
-0.1208619773387909,
-0.06354694813489914,
-0.005686518736183643,
-0.07318001985549927,
-0.032940808683633804,
-0.04457523673772812,
0.025874445214867592,
0.19627976417541504,
-0.00956712942570448,
0.1556379199028015,
-0.073637954890728,
-0.04745488986372948,
0.024473601952195168,
-0.031531598418951035,
0.015794798731803894,
0.1174120306968689,
0.1146426871418953,
-0.0827367752790451,
0.15341469645500183,
0.14470958709716797,
-0.09353531897068024,
0.14427484571933746,
-0.046857599169015884,
-0.05337967351078987,
-0.01239851489663124,
-0.025734765455126762,
-0.01986040733754635,
0.10185948759317398,
-0.1591099053621292,
0.010827534832060337,
0.03590237349271774,
0.02313799038529396,
0.03406902775168419,
-0.21511520445346832,
-0.034339725971221924,
0.03387860581278801,
-0.0499500147998333,
-0.002659587422385812,
-0.00680513633415103,
0.01076573971658945,
0.10498564690351486,
-0.0029810734558850527,
-0.10533218830823898,
0.05257336422801018,
-0.0062987240962684155,
-0.0899871438741684,
0.21725404262542725,
-0.08747546374797821,
-0.18725232779979706,
-0.11880498379468918,
-0.06220771372318268,
-0.07559692859649658,
-0.004758915398269892,
0.05759590119123459,
-0.08066269010305405,
-0.03158093988895416,
-0.0806441456079483,
0.01446909736841917,
-0.002284287940710783,
0.026104530319571495,
0.01951073668897152,
0.0022082950454205275,
0.08154694736003876,
-0.11645926535129547,
-0.015786167234182358,
-0.06321795284748077,
-0.04211984947323799,
0.04032707214355469,
0.04146743565797806,
0.0988055020570755,
0.15235096216201782,
-0.019287779927253723,
0.02282365784049034,
-0.02135869860649109,
0.24310164153575897,
-0.06365299969911575,
-0.018696533516049385,
0.1559664011001587,
-0.009423095732927322,
0.0503767766058445,
0.11493933945894241,
0.07142814248800278,
-0.07500948756933212,
0.006025242153555155,
0.03687376528978348,
-0.037410885095596313,
-0.2224632054567337,
-0.06534454971551895,
-0.05588307976722717,
0.020130153745412827,
0.07723241299390793,
0.03739004582166672,
0.04191461205482483,
0.059889692813158035,
0.04328415170311928,
0.06834448873996735,
-0.02811642736196518,
0.0510883592069149,
0.15787187218666077,
0.03660351783037186,
0.12926755845546722,
-0.03699519485235214,
-0.0581619068980217,
0.044791772961616516,
-0.015560021623969078,
0.2022491693496704,
-0.011031819507479668,
0.10647973418235779,
0.05714922025799751,
0.1732620894908905,
-0.01044081524014473,
0.07149824500083923,
-0.01537350844591856,
-0.03176766633987427,
-0.01585426926612854,
-0.03921106085181236,
-0.030264299362897873,
0.018982039764523506,
-0.0774182453751564,
0.060774918645620346,
-0.1135101318359375,
0.01231792476028204,
0.05960399657487869,
0.24015723168849945,
0.03204027935862541,
-0.3194168210029602,
-0.09705868363380432,
-0.004413720685988665,
-0.03755950182676315,
-0.021074343472719193,
0.033035874366760254,
0.0998033806681633,
-0.10339818149805069,
0.04004545137286186,
-0.07676628977060318,
0.09240566939115524,
-0.04698125272989273,
0.054886672645807266,
0.09951405227184296,
0.09603190422058105,
0.020772648975253105,
0.09640488028526306,
-0.2850910425186157,
0.25600531697273254,
-0.0014360295608639717,
0.05562611296772957,
-0.07749780267477036,
0.012023663148283958,
0.038247883319854736,
0.07425936311483383,
0.07044059038162231,
-0.010851399041712284,
-0.02008097432553768,
-0.1768847107887268,
-0.06550540030002594,
0.03150675445795059,
0.06631500273942947,
-0.02683223783969879,
0.07452084869146347,
-0.02882814034819603,
0.01022765226662159,
0.07610820233821869,
0.013070657849311829,
-0.05542498826980591,
-0.10068942606449127,
0.0006295169587247074,
0.052223872393369675,
-0.06668298691511154,
-0.06467423588037491,
-0.11798346787691116,
-0.1204783171415329,
0.1616269052028656,
-0.020681016147136688,
-0.052140481770038605,
-0.11009234935045242,
0.08729247003793716,
0.058916110545396805,
-0.08323338627815247,
0.04165558144450188,
-0.002712355460971594,
0.0844990462064743,
0.02244596555829048,
-0.08383654803037643,
0.09725132584571838,
-0.0815984308719635,
-0.15663672983646393,
-0.05686290189623833,
0.1098857969045639,
0.03203533589839935,
0.0587601438164711,
-0.010134214535355568,
0.0003890396619681269,
-0.04276092350482941,
-0.0972023457288742,
0.012183141894638538,
-0.009019515477120876,
0.07961194217205048,
0.023807894438505173,
-0.05628948658704758,
0.01429247297346592,
-0.06137530878186226,
-0.025577371940016747,
0.2017984837293625,
0.2101432830095291,
-0.10181240737438202,
0.019432980567216873,
0.02815667726099491,
-0.06017622724175453,
-0.19818763434886932,
0.025082634761929512,
0.06268273293972015,
-0.007134377025067806,
0.029622653499245644,
-0.17602014541625977,
0.1417960673570633,
0.09607643634080887,
-0.019160397350788116,
0.11625179648399353,
-0.3190588355064392,
-0.11957462131977081,
0.13578015565872192,
0.13038641214370728,
0.1082649901509285,
-0.1275387704372406,
-0.017696116119623184,
-0.017272336408495903,
-0.12869010865688324,
0.13378626108169556,
-0.11559057980775833,
0.11307959258556366,
-0.033666882663965225,
0.08151817321777344,
0.0013376958668231964,
-0.05681246146559715,
0.11611095815896988,
0.02491697669029236,
0.09116630256175995,
-0.062243860214948654,
-0.00852359551936388,
0.034516509622335434,
-0.03262631967663765,
0.04529916122555733,
-0.09698519855737686,
0.03298477455973625,
-0.10906574875116348,
-0.022121058776974678,
-0.07863457500934601,
0.05038856714963913,
-0.03678201884031296,
-0.07253073155879974,
-0.04535432532429695,
0.024469682946801186,
0.058498598635196686,
-0.007506924215704203,
0.13079454004764557,
0.037502311170101166,
0.1487056463956833,
0.14970152080059052,
0.06433167308568954,
-0.08174005895853043,
-0.0913252905011177,
-0.023997897282242775,
-0.008367595262825489,
0.05552287399768829,
-0.1339290738105774,
0.025475962087512016,
0.14876805245876312,
0.009036448784172535,
0.1429215669631958,
0.0885363295674324,
-0.024754323065280914,
-0.01073932833969593,
0.05416949465870857,
-0.1737722009420395,
-0.09068270772695541,
-0.022626888006925583,
-0.06832322478294373,
-0.11296725273132324,
0.05743652954697609,
0.10428833216428757,
-0.08115601539611816,
0.002589004347100854,
-0.0032136866357177496,
0.013669951818883419,
-0.057341303676366806,
0.1853988766670227,
0.069394551217556,
0.04892005771398544,
-0.1013467088341713,
0.08452849835157394,
0.04651455581188202,
-0.058266013860702515,
0.004656137898564339,
0.0800386294722557,
-0.0980244129896164,
-0.053028371185064316,
0.05821079760789871,
0.17423874139785767,
-0.0555633120238781,
-0.05913792923092842,
-0.1475023776292801,
-0.13321906328201294,
0.07928811013698578,
0.1311376690864563,
0.11735476553440094,
0.007435607723891735,
-0.07321388274431229,
0.0015353669878095388,
-0.11065926402807236,
0.10537150502204895,
0.0464034266769886,
0.054856207221746445,
-0.15510469675064087,
0.13737551867961884,
0.015811048448085785,
0.06363701820373535,
-0.025411484763026237,
0.022845277562737465,
-0.08434955775737762,
0.008706839755177498,
-0.10348886996507645,
-0.0216924250125885,
-0.016252512112259865,
0.009890284389257431,
-0.0053891371935606,
-0.048445265740156174,
-0.057025883346796036,
0.024999424815177917,
-0.10565505921840668,
-0.025227516889572144,
0.03321393206715584,
0.05191047862172127,
-0.11662878841161728,
-0.04512174427509308,
0.01624988205730915,
-0.06121593341231346,
0.08542851358652115,
0.05370628833770752,
0.01167554035782814,
0.0522066168487072,
-0.12207209318876266,
0.015587622299790382,
0.0742800235748291,
0.03451160714030266,
0.060766711831092834,
-0.09618820250034332,
-0.004330980591475964,
0.000620808859821409,
0.029649745672941208,
0.022910641506314278,
0.0777667984366417,
-0.13724935054779053,
-0.002710487926378846,
-0.02783370390534401,
-0.06454020738601685,
-0.06714898347854614,
0.020097440108656883,
0.08303855359554291,
0.03637102618813515,
0.20598363876342773,
-0.07561089843511581,
0.046382516622543335,
-0.2131723165512085,
0.007946843281388283,
-0.00018174469005316496,
-0.11797532439231873,
-0.09552253782749176,
-0.05074607580900192,
0.05392405018210411,
-0.06373390555381775,
0.15394343435764313,
0.04863494262099266,
0.014815868809819221,
0.02855541929602623,
-0.017866145819425583,
0.0212584026157856,
0.007505857385694981,
0.1868983954191208,
0.03341247886419296,
-0.030030297115445137,
0.07202771306037903,
0.038279302418231964,
0.11288387328386307,
0.11314141005277634,
0.19690823554992676,
0.13149374723434448,
-0.0017040353268384933,
0.09504646062850952,
0.04710931330919266,
-0.060035157948732376,
-0.16356506943702698,
0.05060194432735443,
-0.03684914484620094,
0.11947114020586014,
-0.02734866924583912,
0.1927868127822876,
0.06847987323999405,
-0.16221268475055695,
0.046570226550102234,
-0.04790783300995827,
-0.08729487657546997,
-0.11747626960277557,
-0.06214311346411705,
-0.07060721516609192,
-0.14227285981178284,
-0.004479394759982824,
-0.11955714225769043,
-0.0012408922193571925,
0.11118984967470169,
0.01135226245969534,
-0.034017469733953476,
0.14965499937534332,
0.004561042878776789,
0.015436336398124695,
0.06754950433969498,
0.006787540391087532,
-0.03938409313559532,
-0.10845154523849487,
-0.05584019422531128,
-0.02278803661465645,
-0.009399445727467537,
0.0328291691839695,
-0.06014551594853401,
-0.03446387127041817,
0.03999267518520355,
-0.022419964894652367,
-0.091934934258461,
0.004938766360282898,
0.008898281492292881,
0.05622522905468941,
0.05688990280032158,
0.007466019131243229,
0.018164541572332382,
-0.008829513564705849,
0.20546086132526398,
-0.07387928664684296,
-0.06927040219306946,
-0.09953052550554276,
0.2376701980829239,
0.03024473413825035,
-0.020175063982605934,
0.030588623136281967,
-0.06754070520401001,
-0.019916238263249397,
0.24630585312843323,
0.22834931313991547,
-0.08888491988182068,
-0.0023557317908853292,
0.022025488317012787,
-0.012935125268995762,
-0.015982933342456818,
0.10426705330610275,
0.14598897099494934,
0.06348761916160583,
-0.0856001153588295,
-0.030391816049814224,
-0.061101991683244705,
-0.0055675022304058075,
-0.0447319820523262,
0.0732647031545639,
0.0502418652176857,
0.009062495082616806,
-0.03542236611247063,
0.04821065813302994,
-0.07359969615936279,
-0.09477446228265762,
0.037203386425971985,
-0.20104920864105225,
-0.1598745882511139,
-0.01462660450488329,
0.09054913371801376,
0.0001615012442925945,
0.0589400939643383,
-0.030530771240592003,
0.004999163560569286,
0.08607053011655807,
-0.024409007281064987,
-0.09203089773654938,
-0.0751798152923584,
0.09285302460193634,
-0.1202358677983284,
0.22683995962142944,
-0.049215227365493774,
0.07103195041418076,
0.12016253918409348,
0.053361181169748306,
-0.06377849727869034,
0.06605267524719238,
0.04349633678793907,
-0.03749271482229233,
0.012211945839226246,
0.07883132994174957,
-0.02810114249587059,
0.0679730772972107,
0.042277514934539795,
-0.1358688473701477,
0.0054893274791538715,
-0.043648626655340195,
-0.06824120879173279,
-0.039956267923116684,
-0.03296131268143654,
-0.06381493806838989,
0.1260761022567749,
0.214870885014534,
-0.02928794175386429,
-0.012312759645283222,
-0.08210945874452591,
0.007474972400814295,
0.048175226897001266,
0.016146136447787285,
-0.04649002104997635,
-0.21020174026489258,
0.016797954216599464,
0.041385382413864136,
-0.022210748866200447,
-0.2261059582233429,
-0.09453583508729935,
0.006697536446154118,
-0.07942066341638565,
-0.08833836019039154,
0.06651146709918976,
0.08081953972578049,
0.043811965733766556,
-0.05939081683754921,
-0.029742924496531487,
-0.0846235603094101,
0.13799507915973663,
-0.1453099101781845,
-0.08864983171224594
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fnet-large-finetuned-wnli
This model is a fine-tuned version of [google/fnet-large](https://huggingface.co/google/fnet-large) on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6953
- Accuracy: 0.3803
## Model description
More information needed
## Intended uses & limitations
More information needed
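A minimal inference sketch (the repo id is taken from the card metadata; the sentence pair is illustrative and the label names follow the GLUE WNLI convention):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id from the card metadata; the example pair is a placeholder.
model_id = "gchhablani/fnet-large-finetuned-wnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer(
    "The trophy doesn't fit in the suitcase because it is too big.",
    "The trophy is too big.",
    return_tensors="pt",
)
with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1).squeeze()
# GLUE WNLI labels: 0 = not_entailment, 1 = entailment
print({name: round(p.item(), 3) for name, p in zip(["not_entailment", "entailment"], probs)})
```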
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.7217 | 1.0 | 159 | 0.6864 | 0.5634 |
| 0.7056 | 2.0 | 318 | 0.6869 | 0.5634 |
| 0.706 | 3.0 | 477 | 0.6875 | 0.5634 |
| 0.7032 | 4.0 | 636 | 0.6931 | 0.5634 |
| 0.7025 | 5.0 | 795 | 0.6953 | 0.3803 |
### Framework versions
- Transformers 4.11.0.dev0
- Pytorch 1.9.0
- Datasets 1.12.1
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "fnet-large-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "GLUE WNLI", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.38028169014084506, "name": "Accuracy"}]}]}]}
|
text-classification
|
gchhablani/fnet-large-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"fnet",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
fnet-large-finetuned-wnli
=========================
This model is a fine-tuned version of google/fnet-large on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6953
* Accuracy: 0.3803
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5.0
### Training results
### Framework versions
* Transformers 4.11.0.dev0
* Pytorch 1.9.0
* Datasets 1.12.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
68,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #fnet #text-classification #generated_from_trainer #en #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5.0### Training results### Framework versions\n\n\n* Transformers 4.11.0.dev0\n* Pytorch 1.9.0\n* Datasets 1.12.1\n* Tokenizers 0.10.3"
] |
[
-0.09546191245317459,
0.08599092066287994,
-0.002819906221702695,
0.12520475685596466,
0.15879055857658386,
0.026399699971079826,
0.11949319392442703,
0.12398721277713776,
-0.09084320068359375,
0.011874276213347912,
0.11916262656450272,
0.15975700318813324,
0.017096880823373795,
0.1293363869190216,
-0.05514799430966377,
-0.2581601142883301,
-0.009066539816558361,
0.05219760909676552,
-0.0730782225728035,
0.13537991046905518,
0.10601045191287994,
-0.11119034886360168,
0.0858348160982132,
0.012250698171555996,
-0.19032633304595947,
0.005159530322998762,
0.007800718303769827,
-0.06162724271416664,
0.14478495717048645,
0.026664696633815765,
0.11614209413528442,
-0.005005354061722755,
0.07336033135652542,
-0.19494523108005524,
0.011148547753691673,
0.062217213213443756,
-0.0008450024761259556,
0.09304920583963394,
0.04146657511591911,
0.01810573972761631,
0.13009214401245117,
-0.07812145352363586,
0.04607972502708435,
0.024277985095977783,
-0.10066843777894974,
-0.25224390625953674,
-0.06780645996332169,
0.04294602945446968,
0.06958787888288498,
0.1144355908036232,
-0.003222366329282522,
0.12924230098724365,
-0.07112066447734833,
0.09992790967226028,
0.20228120684623718,
-0.2846123278141022,
-0.06414385139942169,
0.05315433070063591,
-0.0006625417736358941,
0.05653926730155945,
-0.09665023535490036,
-0.03479963168501854,
0.047205254435539246,
0.056852784007787704,
0.13653863966464996,
-0.02365931309759617,
-0.11345908790826797,
-0.00036166355130262673,
-0.14659838378429413,
-0.03392684459686279,
0.18420065939426422,
0.04295152425765991,
-0.029913606122136116,
-0.05532674118876457,
-0.07111954689025879,
-0.1298159509897232,
-0.033634983003139496,
-0.01663203164935112,
0.042924873530864716,
-0.017111431807279587,
-0.027516013011336327,
-0.018441833555698395,
-0.10968123376369476,
-0.061496250331401825,
-0.07938845455646515,
0.12648801505565643,
0.0373440757393837,
0.026029396802186966,
-0.03856557980179787,
0.10718294978141785,
-0.01039015781134367,
-0.12635262310504913,
0.02376563660800457,
0.014790150336921215,
0.023429078981280327,
-0.028944209218025208,
-0.05729011818766594,
-0.07710803300142288,
0.013970785774290562,
0.11589844524860382,
-0.06570203602313995,
0.04521610215306282,
0.03654171898961067,
0.05179869011044502,
-0.08960038423538208,
0.17069628834724426,
-0.033626291900873184,
-0.022550251334905624,
0.017764555290341377,
0.04625367373228073,
0.02610119618475437,
-0.018981298431754112,
-0.12368905544281006,
0.01198206003755331,
0.09392620623111725,
0.0013035978190600872,
-0.05933376029133797,
0.08104883879423141,
-0.038034044206142426,
-0.033083993941545486,
-0.005496066529303789,
-0.09730266779661179,
0.019998257979750633,
0.006821903865784407,
-0.08317822217941284,
-0.020987385883927345,
0.037898458540439606,
0.013462735339999199,
-0.027173368260264397,
0.10108006745576859,
-0.08223720639944077,
0.021280832588672638,
-0.0873541384935379,
-0.1072964146733284,
0.01761503331363201,
-0.08617527782917023,
0.019810689613223076,
-0.10400725156068802,
-0.18891578912734985,
-0.0077062081545591354,
0.06357033550739288,
-0.01852494478225708,
-0.06713766604661942,
-0.04489322751760483,
-0.08013653755187988,
0.009742194786667824,
-0.014273520559072495,
0.12686266005039215,
-0.06624139845371246,
0.09025576710700989,
0.02989248000085354,
0.05423325300216675,
-0.055293235927820206,
0.05391918867826462,
-0.09691685438156128,
0.008810688741505146,
-0.16241350769996643,
0.041948363184928894,
-0.05697361379861832,
0.05600197985768318,
-0.07802683860063553,
-0.10861963778734207,
0.014010758139193058,
-0.009118323214352131,
0.051885224878787994,
0.0832173079252243,
-0.1714225709438324,
-0.07527991384267807,
0.1451883614063263,
-0.08433152735233307,
-0.117959164083004,
0.12364167720079422,
-0.05751069262623787,
0.03996153548359871,
0.055080514401197433,
0.17722031474113464,
0.09120480716228485,
-0.06544680148363113,
0.005950626451522112,
0.03313767910003662,
0.04212610796093941,
-0.06831853836774826,
0.06972771137952805,
0.019984891638159752,
0.016782991588115692,
0.03638952597975731,
-0.03108343482017517,
0.06633422523736954,
-0.07898469269275665,
-0.09579379111528397,
-0.046762142330408096,
-0.08413276076316833,
0.03685159981250763,
0.07383585721254349,
0.07521867007017136,
-0.08728065341711044,
-0.07636827975511551,
0.03872332349419594,
0.08539536595344543,
-0.05908842012286186,
0.026852475479245186,
-0.053241580724716187,
0.07960749417543411,
-0.022701889276504517,
-0.010898844338953495,
-0.18033812940120697,
-0.0489765927195549,
0.013111790642142296,
-0.019065599888563156,
0.011413183994591236,
0.034898094832897186,
0.05896642804145813,
0.06169478967785835,
-0.05224358290433884,
-0.006700506899505854,
-0.03707636147737503,
-0.002736605005338788,
-0.11889437586069107,
-0.19659262895584106,
-0.03258073702454567,
-0.0219099298119545,
0.1598203033208847,
-0.20940329134464264,
0.051057182252407074,
-0.007849784567952156,
0.08635152131319046,
0.004428023472428322,
-0.004245837219059467,
-0.04512714967131615,
0.05986940115690231,
-0.04119914397597313,
-0.04754257947206497,
0.0797266736626625,
0.020951641723513603,
-0.08738018572330475,
-0.05052084103226662,
-0.08946986496448517,
0.17589566111564636,
0.13004666566848755,
-0.1203686073422432,
-0.06231275945901871,
-0.005228476598858833,
-0.07227027416229248,
-0.03346370905637741,
-0.04478921368718147,
0.027298297733068466,
0.19398172199726105,
-0.010374259203672409,
0.15592092275619507,
-0.07345370203256607,
-0.04698825627565384,
0.025998717173933983,
-0.029928049072623253,
0.01762254908680916,
0.11882616579532623,
0.11529583483934402,
-0.08247628062963486,
0.15382464230060577,
0.14487947523593903,
-0.09263201802968979,
0.14530399441719055,
-0.04800577461719513,
-0.05440796539187431,
-0.012026961892843246,
-0.02653445303440094,
-0.02075771987438202,
0.10357513278722763,
-0.1614956557750702,
0.010308556258678436,
0.03666393458843231,
0.022368796169757843,
0.033521343022584915,
-0.21500913798809052,
-0.033284418284893036,
0.03360406681895256,
-0.04915229231119156,
-0.005554342642426491,
-0.00600228039547801,
0.011097459122538567,
0.10555184632539749,
-0.0037921082694083452,
-0.1044158786535263,
0.05074157938361168,
-0.006137927062809467,
-0.08894295990467072,
0.21745462715625763,
-0.08632136881351471,
-0.18696673214435577,
-0.11720063537359238,
-0.05977078899741173,
-0.07440357655286789,
-0.005697417538613081,
0.05866274610161781,
-0.08173168450593948,
-0.03133115917444229,
-0.07904272526502609,
0.014714551158249378,
-0.0008177160052582622,
0.02673378773033619,
0.017737895250320435,
0.003071904182434082,
0.08112871646881104,
-0.11631204187870026,
-0.014812331646680832,
-0.06412842124700546,
-0.04106014594435692,
0.039631348103284836,
0.04294215887784958,
0.09911496192216873,
0.15236623585224152,
-0.01932796649634838,
0.022595083341002464,
-0.021412083879113197,
0.24335768818855286,
-0.06545376032590866,
-0.018363337963819504,
0.15558351576328278,
-0.00867502298206091,
0.04948771744966507,
0.11573895066976547,
0.0715351402759552,
-0.07415314763784409,
0.0050702085718512535,
0.03689371049404144,
-0.036363378167152405,
-0.22274886071681976,
-0.06535296142101288,
-0.05658477544784546,
0.02026711404323578,
0.07761484384536743,
0.03714492544531822,
0.04264776408672333,
0.05929701030254364,
0.04340836778283119,
0.0686696395277977,
-0.02882838249206543,
0.05045050010085106,
0.15857917070388794,
0.03623516485095024,
0.12905411422252655,
-0.0383649580180645,
-0.05830521136522293,
0.04401826858520508,
-0.01650388538837433,
0.2045838087797165,
-0.010560386814177036,
0.10752256959676743,
0.05580081418156624,
0.17469125986099243,
-0.009762926027178764,
0.07077998667955399,
-0.015204674564301968,
-0.032061636447906494,
-0.015053500421345234,
-0.038609445095062256,
-0.029359186068177223,
0.01937456801533699,
-0.07770969718694687,
0.060527801513671875,
-0.11411376297473907,
0.010677093639969826,
0.059778086841106415,
0.23909689486026764,
0.03223899379372597,
-0.3180825710296631,
-0.09792063385248184,
-0.005235941149294376,
-0.03779191896319389,
-0.022020915523171425,
0.032515332102775574,
0.10104595869779587,
-0.10247465968132019,
0.039183102548122406,
-0.0768279880285263,
0.09344609081745148,
-0.04620886594057083,
0.05521034076809883,
0.09871474653482437,
0.09597338736057281,
0.02048015221953392,
0.096494659781456,
-0.2858371138572693,
0.25810134410858154,
-0.0008917004452086985,
0.055689625442028046,
-0.07825443148612976,
0.011470974422991276,
0.03769155591726303,
0.07573051750659943,
0.07050973176956177,
-0.010998306795954704,
-0.02132379449903965,
-0.17557914555072784,
-0.06484248489141464,
0.03208262845873833,
0.06640531122684479,
-0.02824542112648487,
0.07367397099733353,
-0.02855999954044819,
0.009517839178442955,
0.07583609968423843,
0.010695524513721466,
-0.05560624599456787,
-0.09996896237134933,
0.0005532020586542785,
0.053071361035108566,
-0.06607106328010559,
-0.06424704939126968,
-0.1176302582025528,
-0.11913973838090897,
0.15881624817848206,
-0.024910470470786095,
-0.05115595459938049,
-0.10973544418811798,
0.0894245058298111,
0.05847169831395149,
-0.08369704335927963,
0.042348362505435944,
-0.003034900175407529,
0.08414874970912933,
0.022848933935165405,
-0.08384497463703156,
0.09918875247240067,
-0.08087218552827835,
-0.15695929527282715,
-0.056926511228084564,
0.11002043634653091,
0.03164108842611313,
0.058370400220155716,
-0.01081100944429636,
0.0006559081957675517,
-0.04384872689843178,
-0.09692469984292984,
0.013017899356782436,
-0.009734023362398148,
0.07791733741760254,
0.02332180365920067,
-0.055519409477710724,
0.010724114254117012,
-0.06127122417092323,
-0.025878654792904854,
0.1999949812889099,
0.20991948246955872,
-0.10273119807243347,
0.01914804056286812,
0.02892344444990158,
-0.06101452186703682,
-0.19818688929080963,
0.026290491223335266,
0.06219090148806572,
-0.006466645747423172,
0.028350498527288437,
-0.1772705316543579,
0.1424008160829544,
0.09497801214456558,
-0.019320080056786537,
0.11905593425035477,
-0.3191659450531006,
-0.11980247497558594,
0.13569417595863342,
0.12890441715717316,
0.11123138666152954,
-0.12752926349639893,
-0.018578607589006424,
-0.01675238274037838,
-0.12915006279945374,
0.1329793781042099,
-0.1172524243593216,
0.11238279938697815,
-0.03414227068424225,
0.08010146021842957,
0.001642079558223486,
-0.0570412315428257,
0.11512328684329987,
0.025005081668496132,
0.09225330501794815,
-0.06164787337183952,
-0.008495654910802841,
0.03452535346150398,
-0.03190139681100845,
0.04483703896403313,
-0.09700168669223785,
0.033334556967020035,
-0.10865794122219086,
-0.0214571263641119,
-0.07938719540834427,
0.05095565319061279,
-0.03664569556713104,
-0.07135503739118576,
-0.045485079288482666,
0.02403452806174755,
0.057519298046827316,
-0.00724073639139533,
0.13187195360660553,
0.037778809666633606,
0.14953835308551788,
0.1497054547071457,
0.06296437233686447,
-0.0816614180803299,
-0.09189815074205399,
-0.02339370734989643,
-0.007961608469486237,
0.05737365782260895,
-0.13682818412780762,
0.025487486273050308,
0.14920346438884735,
0.01067504845559597,
0.1419643610715866,
0.08810736984014511,
-0.024375448003411293,
-0.011216367594897747,
0.05464911088347435,
-0.17467109858989716,
-0.09439196437597275,
-0.02270573377609253,
-0.06931719183921814,
-0.11218071728944778,
0.05837605893611908,
0.10388980060815811,
-0.08238784968852997,
0.0025478980969637632,
-0.0036189118400216103,
0.01330843847244978,
-0.05678478628396988,
0.18629495799541473,
0.06895172595977783,
0.04923379421234131,
-0.10211517661809921,
0.0844893679022789,
0.04786021634936333,
-0.061863500624895096,
0.003803879488259554,
0.07889994233846664,
-0.09867862612009048,
-0.053205981850624084,
0.05900222063064575,
0.17500045895576477,
-0.05377772077918053,
-0.0596037320792675,
-0.14735089242458344,
-0.13265158236026764,
0.07988547533750534,
0.13361380994319916,
0.11714371293783188,
0.0073603312484920025,
-0.07273177057504654,
0.002185244346037507,
-0.11119045317173004,
0.10686644166707993,
0.0465870127081871,
0.055424951016902924,
-0.1553489863872528,
0.13742566108703613,
0.016396204009652138,
0.06254139542579651,
-0.02566162869334221,
0.02288627438247204,
-0.08485285937786102,
0.008428186178207397,
-0.10126855224370956,
-0.02166859805583954,
-0.015184788964688778,
0.009756003506481647,
-0.005930773913860321,
-0.04936628416180611,
-0.05678648129105568,
0.02513021230697632,
-0.10625382512807846,
-0.02489090897142887,
0.033812735229730606,
0.05252465978264809,
-0.1173783391714096,
-0.04442751407623291,
0.016967257484793663,
-0.06054551899433136,
0.08527270704507828,
0.053835052996873856,
0.012136760167777538,
0.05291440710425377,
-0.12204016745090485,
0.016342202201485634,
0.07364647090435028,
0.0349479541182518,
0.06090749427676201,
-0.09562382102012634,
-0.003282394725829363,
-0.00001865075682871975,
0.030004533007740974,
0.022373711690306664,
0.07710137963294983,
-0.1379968672990799,
-0.0034996774047613144,
-0.027388708665966988,
-0.06447653472423553,
-0.06685411930084229,
0.019427789375185966,
0.08218405395746231,
0.03522159904241562,
0.20542800426483154,
-0.07487155497074127,
0.045620422810316086,
-0.21430227160453796,
0.00823394488543272,
-0.0004625043657142669,
-0.11957801133394241,
-0.09532622992992401,
-0.051133062690496445,
0.05481477826833725,
-0.06431839615106583,
0.1534840315580368,
0.04941427335143089,
0.01592053472995758,
0.028808899223804474,
-0.01811968721449375,
0.018443580716848373,
0.0080113485455513,
0.18708765506744385,
0.03429579362273216,
-0.02985781617462635,
0.07225026190280914,
0.039080578833818436,
0.11195438355207443,
0.11603232473134995,
0.19704307615756989,
0.13246352970600128,
-0.0008535053348168731,
0.0951245054602623,
0.04693981632590294,
-0.06035655736923218,
-0.16422057151794434,
0.05283075198531151,
-0.03835199028253555,
0.12152128666639328,
-0.028580691665410995,
0.19253279268741608,
0.06901441514492035,
-0.16116325557231903,
0.04764442518353462,
-0.04824594035744667,
-0.08681273460388184,
-0.11683184653520584,
-0.06121692433953285,
-0.07114308327436447,
-0.1440211683511734,
-0.004636962898075581,
-0.11918430775403976,
-0.0008277559536509216,
0.1097910925745964,
0.010498684830963612,
-0.03498762845993042,
0.15053655207157135,
0.006402645725756884,
0.015238703228533268,
0.0695597305893898,
0.006615471560508013,
-0.04100614786148071,
-0.10920800268650055,
-0.05520232766866684,
-0.022179335355758667,
-0.010697388090193272,
0.03149294853210449,
-0.06045083701610565,
-0.03633178025484085,
0.04040398821234703,
-0.02195586822926998,
-0.09119494259357452,
0.005079627502709627,
0.00920786615461111,
0.05570284649729729,
0.05661286786198616,
0.007228526286780834,
0.018431752920150757,
-0.009099789895117283,
0.20788635313510895,
-0.0740276426076889,
-0.06805037707090378,
-0.09928686916828156,
0.23586858808994293,
0.031392164528369904,
-0.020330391824245453,
0.03078066185116768,
-0.06767775863409042,
-0.018910378217697144,
0.2457672357559204,
0.22598128020763397,
-0.08714233338832855,
-0.002273096702992916,
0.022417528554797173,
-0.01315001118928194,
-0.016113251447677612,
0.10431814938783646,
0.14606770873069763,
0.06461671739816666,
-0.08543983101844788,
-0.029877889901399612,
-0.06068196892738342,
-0.006037788465619087,
-0.04389375448226929,
0.07362931221723557,
0.049937255680561066,
0.007489452138543129,
-0.03518691286444664,
0.048443753272295,
-0.07235494256019592,
-0.09499064832925797,
0.038542162626981735,
-0.20198656618595123,
-0.16029119491577148,
-0.015390058979392052,
0.09282878786325455,
0.0010308976052328944,
0.057557959109544754,
-0.030114294961094856,
0.0048396289348602295,
0.08691193163394928,
-0.024416247382760048,
-0.09137684851884842,
-0.07855791598558426,
0.09151489287614822,
-0.12239585816860199,
0.22643736004829407,
-0.04916463419795036,
0.07056054472923279,
0.11964943259954453,
0.05358283966779709,
-0.06439071148633957,
0.06493911147117615,
0.04323429614305496,
-0.03877938538789749,
0.012006440199911594,
0.08003684133291245,
-0.027095813304185867,
0.06842819601297379,
0.041951268911361694,
-0.1349412053823471,
0.005635885521769524,
-0.04544267803430557,
-0.06833399087190628,
-0.040099576115608215,
-0.03195123374462128,
-0.0631319135427475,
0.12548556923866272,
0.2150724083185196,
-0.029118625447154045,
-0.012089218944311142,
-0.08287710696458817,
0.007270357105880976,
0.048678286373615265,
0.018202396109700203,
-0.046538036316633224,
-0.2092742621898651,
0.017721407115459442,
0.0428425669670105,
-0.02169046550989151,
-0.224505215883255,
-0.09362480044364929,
0.00570263434201479,
-0.07918917387723923,
-0.08765849471092224,
0.06609953939914703,
0.08304844796657562,
0.04343817010521889,
-0.059563811868429184,
-0.03427022323012352,
-0.08322528004646301,
0.1385328769683838,
-0.145302414894104,
-0.08898579329252243
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Hakha-Chin
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Hakha Chin using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "cnh", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-cnh")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-cnh")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
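For a single local recording rather than a Common Voice split, a hedged variant of the same pipeline (the file path is a placeholder; any input sample rate works because the audio is resampled to 16 kHz):
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-cnh")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-cnh")

# "recording.wav" is a placeholder path to a local speech file.
speech_array, sampling_rate = torchaudio.load("recording.wav")
speech = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array).squeeze()

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits
print("Prediction:", processor.batch_decode(torch.argmax(logits, dim=-1))[0])
```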
## Evaluation
The model can be evaluated as follows on the Hakha Chin (cnh) test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "cnh", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-cnh")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-cnh")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\/]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the preprocessed test set.
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 31.38 %
## Training
The Common Voice `train` and `validation` datasets were used for training. The script used for training can be found [here](https://colab.research.google.com/drive/1pejk9gv9vMcUOjyVQ_vsV2ngW4NiWLWy?usp=sharing).
|
{"language": "cnh", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Wav2Vec2 Large 53 Hakha Chin by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice cnh", "type": "common_voice", "args": "cnh"}, "metrics": [{"type": "wer", "value": 31.38, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-cnh
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"cnh",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"cnh"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cnh #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Hakha-Chin
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Hakha Chin using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Hakha Chin (cnh) test data of Common Voice.
Test Result: 31.38 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Hakha-Chin\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hakha Chin using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 31.38 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cnh #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Hakha-Chin\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hakha Chin using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 31.38 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here."
] |
[
81,
69,
20,
29,
33
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #cnh #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Hakha-Chin\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hakha Chin using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 31.38 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here."
] |
[
-0.17322692275047302,
0.026642607524991035,
-0.001407379051670432,
-0.028769079595804214,
0.07448530942201614,
-0.04717632010579109,
0.16471755504608154,
0.10739834606647491,
-0.03518256917595863,
-0.009502695873379707,
0.02058388851583004,
-0.04933178052306175,
0.09283628314733505,
0.10286643356084824,
0.020424045622348785,
-0.1659296452999115,
0.014492196962237358,
0.01124205719679594,
0.024113643914461136,
0.10492001473903656,
0.09913039207458496,
-0.06427212804555893,
0.008795860223472118,
0.09508918970823288,
-0.14312990009784698,
0.04421079531311989,
0.01630224473774433,
-0.13442295789718628,
0.13827218115329742,
0.06420675665140152,
0.08869800716638565,
0.05046094208955765,
0.10574987530708313,
-0.20224490761756897,
0.03003830276429653,
0.0366012342274189,
0.04029448330402374,
0.02580735832452774,
0.0633237287402153,
0.008389294147491455,
0.10528723895549774,
0.08368510752916336,
-0.04153637960553169,
0.09648877382278442,
-0.061981067061424255,
-0.215006485581398,
-0.018450064584612846,
-0.010075218975543976,
0.09313930571079254,
0.14287987351417542,
-0.062264204025268555,
0.10462997853755951,
-0.15406310558319092,
0.07052694261074066,
0.1264723241329193,
-0.1596112698316574,
0.012309481389820576,
0.11768221110105515,
0.07919475436210632,
0.03421720862388611,
-0.05785197392106056,
0.031727589666843414,
0.0706111490726471,
-0.0022643092088401318,
0.015856212005019188,
-0.030110467225313187,
-0.21387872099876404,
-0.017880812287330627,
-0.1148061603307724,
-0.04289346560835838,
0.2295471578836441,
-0.014947899617254734,
-0.06459517031908035,
-0.13447324931621552,
-0.0006170012056827545,
0.01239338144659996,
0.012797126546502113,
-0.03539958596229553,
-0.011332059279084206,
0.029492823407053947,
-0.006659031845629215,
-0.017320696264505386,
-0.1132884994149208,
-0.14523862302303314,
0.05043921247124672,
0.001253026770427823,
0.02742503397166729,
0.0074493237771093845,
-0.14368341863155365,
0.09195590019226074,
-0.046402204781770706,
-0.09584619104862213,
-0.01964699849486351,
0.03737711161375046,
-0.06282849609851837,
-0.002770415274426341,
-0.04412195831537247,
-0.11872558295726776,
0.06724199652671814,
-0.04934082552790642,
0.041195012629032135,
0.026600195094943047,
-0.07473471015691757,
0.06147437170147896,
0.03761840984225273,
0.10304342955350876,
-0.07764057070016861,
0.0005568036576732993,
0.04384225979447365,
0.007772029843181372,
-0.049682218581438065,
-0.011019003577530384,
-0.0686541423201561,
-0.05894146114587784,
0.015800246968865395,
0.0943262055516243,
-0.05157903581857681,
-0.025709128007292747,
-0.03395957499742508,
-0.03249015286564827,
0.042210884392261505,
-0.10418824106454849,
-0.03862040862441063,
0.08322373032569885,
-0.013814299367368221,
0.11648305505514145,
0.0559234656393528,
0.04780561849474907,
-0.09787382930517197,
-0.04632234200835228,
0.006716195493936539,
0.06909965723752975,
-0.02322281338274479,
-0.08581022173166275,
0.037588704377412796,
-0.006228119134902954,
-0.03580303490161896,
-0.09800078719854355,
-0.12207367271184921,
-0.08836726099252701,
-0.029233096167445183,
0.02778594382107258,
0.032193247228860855,
-0.10925088077783585,
-0.0042365845292806625,
-0.042329393327236176,
-0.044270046055316925,
0.08718077093362808,
-0.05791154131293297,
0.05915110558271408,
0.04016123712062836,
0.05934121459722519,
0.061905138194561005,
0.06040153652429581,
-0.07539457827806473,
-0.04388820379972458,
0.01689731702208519,
0.13616400957107544,
-0.024735350161790848,
-0.04817402735352516,
-0.09630066156387329,
-0.07479012757539749,
-0.07120057195425034,
0.06260403990745544,
0.05664403364062309,
0.13473038375377655,
-0.2730613946914673,
-0.09293102473020554,
0.26349732279777527,
-0.13016390800476074,
-0.07018495351076126,
0.1971711814403534,
-0.021319489926099777,
0.13235415518283844,
0.1207074299454689,
0.20182472467422485,
0.10031650960445404,
-0.22007784247398376,
0.0511934831738472,
-0.01510687731206417,
0.009456872008740902,
-0.07235071808099747,
0.08758478611707687,
-0.0714401826262474,
0.0028502722270786762,
0.022747015580534935,
-0.11016403138637543,
0.09156617522239685,
-0.026222940534353256,
-0.06741217523813248,
-0.016108956187963486,
-0.07007147371768951,
0.0019362042658030987,
0.02714536525309086,
0.02520902268588543,
-0.021723225712776184,
-0.042662911117076874,
-0.01316136959940195,
0.13454362750053406,
-0.1457686424255371,
0.058825962245464325,
-0.09755044430494308,
0.07153275609016418,
-0.05365287512540817,
-0.010259023867547512,
-0.13033892214298248,
0.16708555817604065,
-0.01820565015077591,
0.03505448251962662,
0.0288616381585598,
0.12289149314165115,
0.007107111159712076,
0.044503819197416306,
-0.030426884070038795,
-0.002919259248301387,
-0.000685110513586551,
-0.02745380997657776,
-0.005514070857316256,
-0.0929194763302803,
-0.020651815459132195,
-0.047784022986888885,
0.16181223094463348,
-0.19909843802452087,
0.006240827962756157,
0.007480686996132135,
0.036436792463064194,
-0.005244039930403233,
-0.010279817506670952,
0.08500085771083832,
0.0847698450088501,
-0.012508708983659744,
-0.015646064653992653,
0.026232542470097542,
0.00961363036185503,
-0.06413371115922928,
0.12487418204545975,
-0.1299906224012375,
-0.028807783499360085,
0.10473006963729858,
-0.03596942126750946,
0.0011027406435459852,
-0.004422035999596119,
-0.005357890855520964,
-0.020822109654545784,
-0.14031103253364563,
0.0027154888957738876,
0.24953849613666534,
-0.018056627362966537,
0.13640935719013214,
-0.12483344972133636,
0.0006267203716561198,
0.015084685757756233,
-0.07618892937898636,
0.049962177872657776,
0.036724213510751724,
0.0020203960593789816,
0.0328219048678875,
0.0138484425842762,
-0.04978096857666969,
-0.08211816102266312,
0.2610783874988556,
-0.03926945477724075,
-0.10529587417840958,
0.020550070330500603,
0.01033087819814682,
-0.010965175926685333,
0.07686008512973785,
-0.17541304230690002,
-0.05869787558913231,
0.03220631554722786,
0.03656180948019028,
0.07059520483016968,
-0.1569889336824417,
0.017589686438441277,
0.007218093145638704,
-0.1350087970495224,
-0.1684732288122177,
0.06287939846515656,
-0.04811922088265419,
0.052073124796152115,
-0.10302755236625671,
-0.025029528886079788,
-0.03498302027583122,
-0.049332182854413986,
-0.18245916068553925,
0.1436934620141983,
-0.0698196068406105,
-0.20582234859466553,
-0.13921384513378143,
0.07900837063789368,
0.04312336817383766,
0.027471177279949188,
0.09369044750928879,
-0.1295296996831894,
0.006699814461171627,
-0.029526669532060623,
0.06045905873179436,
0.014148376882076263,
-0.05078712850809097,
-0.010596479289233685,
0.047003667801618576,
0.052869342267513275,
-0.1492263376712799,
0.018848663195967674,
-0.04274969547986984,
-0.036173995584249496,
-0.004347376059740782,
-0.0335901640355587,
0.031745050102472305,
0.18359939754009247,
0.040942635387182236,
0.031775325536727905,
-0.03891444578766823,
0.12457271665334702,
-0.09189506620168686,
-0.0299921166151762,
0.2168426662683487,
0.0023466108832508326,
-0.03358753025531769,
0.05647744610905647,
0.014147632755339146,
-0.06317576766014099,
-0.004816947039216757,
-0.044247716665267944,
-0.09170389920473099,
-0.2563210129737854,
-0.09446245431900024,
-0.06999242305755615,
-0.05461108312010765,
-0.01621728204190731,
-0.00727958744391799,
0.06177227944135666,
0.015563114546239376,
-0.0329924151301384,
-0.06802777200937271,
0.08695174008607864,
0.00435239402577281,
0.03404028341174126,
0.021821442991495132,
0.07745073735713959,
-0.046219032257795334,
-0.030835028737783432,
0.018319301307201385,
0.029601147398352623,
0.1454564929008484,
0.02261548489332199,
0.10976757854223251,
0.10024453699588776,
0.11630946397781372,
0.10128370672464371,
0.06920596957206726,
-0.012421466410160065,
-0.008686103858053684,
0.018401194363832474,
-0.0795874074101448,
-0.044517748057842255,
0.03196386992931366,
0.12184163928031921,
-0.04475034773349762,
-0.0536300390958786,
-0.005188970826566219,
0.021479392424225807,
0.1621111035346985,
0.09162687510251999,
-0.22287796437740326,
-0.07093272358179092,
-0.023275675252079964,
-0.015784071758389473,
-0.0017822540830820799,
0.050826720893383026,
0.16917379200458527,
-0.14794060587882996,
0.022971443831920624,
0.015932977199554443,
0.09716799110174179,
-0.021460220217704773,
0.016416041180491447,
-0.09824144095182419,
0.019173014909029007,
-0.006980908568948507,
0.0987473726272583,
-0.28614702820777893,
0.2074272185564041,
0.0006737062940374017,
0.132294699549675,
-0.06386313587427139,
0.0037035439163446426,
0.022970689460635185,
0.026142805814743042,
0.12505070865154266,
0.015947457402944565,
-0.0030395223293453455,
-0.084376759827137,
-0.07030807435512543,
0.06560228765010834,
-0.028884703293442726,
0.02275286242365837,
0.03985162824392319,
0.02351733110845089,
0.0016806276980787516,
-0.005647759884595871,
-0.045051459223032,
-0.1958828717470169,
-0.042453642934560776,
0.009533635340631008,
0.16777415573596954,
0.10955499112606049,
-0.036156877875328064,
-0.08102498948574066,
-0.07248008251190186,
0.061031196266412735,
-0.11524558812379837,
-0.05675237253308296,
-0.06336378306150436,
-0.007389596663415432,
0.08202018588781357,
-0.06188604235649109,
-0.017890485003590584,
0.09095213562250137,
0.1364801824092865,
-0.0349825844168663,
-0.021280543878674507,
0.06165294721722603,
-0.12339839339256287,
-0.1062508299946785,
-0.0254368893802166,
0.20791584253311157,
0.10637198388576508,
0.059800200164318085,
0.07576793432235718,
-0.013960366137325764,
0.004649462644010782,
-0.044868648052215576,
-0.009616067633032799,
0.14367890357971191,
-0.07959830015897751,
0.035288967192173004,
-0.020408323034644127,
-0.18724346160888672,
-0.11382506787776947,
-0.08352551609277725,
0.15838809311389923,
0.06687845289707184,
-0.04213012382388115,
0.14341233670711517,
0.2225625067949295,
-0.09501513838768005,
-0.1922202706336975,
-0.014732295647263527,
0.08912266790866852,
0.11967228353023529,
-0.030527757480740547,
-0.17951373755931854,
0.053662724792957306,
0.01746886782348156,
-0.023439472541213036,
-0.056780554354190826,
-0.3268935978412628,
-0.15245503187179565,
0.12986081838607788,
-0.024983512237668037,
0.12163729220628738,
-0.005944661796092987,
-0.027759965509176254,
-0.04453793168067932,
-0.06126886233687401,
-0.002958844415843487,
-0.18103523552417755,
0.0944969579577446,
0.02003627084195614,
0.047536056488752365,
0.029239479452371597,
-0.04468071088194847,
0.09480889141559601,
0.11758555471897125,
-0.012122008018195629,
0.010186568833887577,
0.10205520689487457,
0.06348980963230133,
-0.008932762779295444,
0.14624406397342682,
-0.07073856145143509,
0.022565461695194244,
-0.1117113009095192,
-0.07819148153066635,
-0.08264622092247009,
0.07318692654371262,
0.014582712203264236,
-0.013352898880839348,
0.01567402295768261,
-0.04440842941403389,
0.006261094473302364,
0.010670664720237255,
-0.05873145908117294,
-0.10551067441701889,
0.07886047661304474,
0.12781669199466705,
0.18251933157444,
-0.02432030625641346,
-0.050377629697322845,
-0.009738151915371418,
-0.012767884880304337,
0.13121286034584045,
-0.08960671722888947,
0.03985253721475601,
0.06639483571052551,
0.04834795370697975,
0.12310951203107834,
0.027056967839598656,
-0.11244285106658936,
0.08023395389318466,
0.04213714972138405,
-0.0412273146212101,
-0.07218044251203537,
-0.03926276043057442,
-0.06341873854398727,
-0.009640823118388653,
0.025329746305942535,
0.10987760126590729,
-0.07832277566194534,
-0.025966744869947433,
-0.02051674947142601,
0.020240113139152527,
-0.11309017986059189,
0.22747360169887543,
0.009200741536915302,
0.08001285046339035,
-0.11892415583133698,
0.04377765953540802,
0.0050275493413209915,
-0.023082418367266655,
0.042022593319416046,
-0.00586611358448863,
-0.10766841471195221,
-0.06271304190158844,
-0.025777872651815414,
0.14090459048748016,
0.03328073024749756,
-0.16014431416988373,
-0.06985026597976685,
-0.0972115695476532,
-0.01035535428673029,
0.06890140473842621,
0.04529767110943794,
0.018608467653393745,
-0.06738455593585968,
-0.06964292377233505,
-0.11865691840648651,
0.051997680217027664,
0.09536318480968475,
-0.016092492267489433,
-0.0723503977060318,
0.19356614351272583,
0.09286556392908096,
0.02889465168118477,
-0.03402938321232796,
-0.07776699215173721,
-0.02620876021683216,
0.09803039580583572,
-0.08514729142189026,
0.024378035217523575,
-0.08042803406715393,
-0.009450572542846203,
-0.011592443101108074,
-0.0540720634162426,
0.006327721755951643,
0.10642624646425247,
-0.07298164069652557,
0.060402221977710724,
-0.029027801007032394,
0.09034138917922974,
-0.09625118225812912,
0.021300844848155975,
0.007130071986466646,
-0.07522166520357132,
0.07581505924463272,
0.11596356332302094,
-0.09962832182645798,
0.12903623282909393,
-0.21125571429729462,
-0.0661977156996727,
0.07577034085988998,
0.0488484688103199,
-0.02983417548239231,
-0.13245880603790283,
0.05010940879583359,
0.08273618668317795,
0.06118643656373024,
-0.011929739266633987,
0.07873028516769409,
-0.03146923705935478,
-0.02333858795464039,
-0.06121954694390297,
0.0007124226540327072,
-0.0057089743204414845,
0.058688174933195114,
0.08521649986505508,
0.12620079517364502,
0.12235458940267563,
-0.11069963872432709,
0.10232628881931305,
-0.13461415469646454,
-0.00827618408948183,
-0.05240807309746742,
-0.003053412539884448,
-0.1435060352087021,
-0.07277190685272217,
0.0688076838850975,
-0.057375468313694,
0.1163543313741684,
0.005240899045020342,
0.0978146642446518,
-0.030537942424416542,
-0.08043797314167023,
0.0015168729005381465,
-0.006901290733367205,
0.2309749275445938,
0.04286229982972145,
0.029952900484204292,
-0.005635898094624281,
0.007355386391282082,
0.030973311513662338,
0.11940113455057144,
0.03689318522810936,
0.11686863750219345,
0.010885344818234444,
0.0969182699918747,
0.06692425161600113,
-0.07294543832540512,
-0.03833971172571182,
-0.03589889779686928,
-0.10987012088298798,
0.03051847591996193,
-0.0547674298286438,
0.10695300251245499,
0.16774360835552216,
-0.10546240210533142,
0.05167404189705849,
0.027308695018291473,
-0.101619191467762,
-0.15474192798137665,
-0.11396848410367966,
-0.04986292123794556,
-0.1349383443593979,
0.021748436614871025,
-0.08451327681541443,
0.014131163246929646,
0.05377573147416115,
0.05929993838071823,
-0.028654037043452263,
0.17646750807762146,
0.05278172716498375,
-0.12763667106628418,
0.07870878279209137,
-0.07120858877897263,
0.02356782741844654,
-0.0710408166050911,
0.05454280972480774,
0.1601179540157318,
-0.008142187260091305,
0.06844121217727661,
0.011717296205461025,
-0.025187432765960693,
0.03634122014045715,
-0.08085165917873383,
-0.05425870418548584,
-0.006681079510599375,
-0.01203702762722969,
0.0760202631354332,
0.14087340235710144,
0.1347752809524536,
-0.06680314987897873,
0.030697818845510483,
0.10770655423402786,
-0.03735479339957237,
-0.1385401338338852,
-0.1728842556476593,
0.13790136575698853,
0.055231716483831406,
-0.0031563681550323963,
-0.014399093575775623,
-0.03687063232064247,
-0.00417690584436059,
0.25209298729896545,
0.1960555613040924,
0.048955079168081284,
0.025241918861865997,
-0.03632012754678726,
-0.010345184244215488,
-0.03410746529698372,
0.06750142574310303,
0.0842655599117279,
0.17420728504657745,
-0.007058215793222189,
0.010857030749320984,
-0.06836897879838943,
-0.10246943682432175,
-0.03707767277956009,
0.03195972368121147,
-0.0887630358338356,
-0.09893450886011124,
0.022252043709158897,
0.13479799032211304,
-0.048284828662872314,
-0.11998921632766724,
-0.09067314118146896,
-0.059146132320165634,
-0.07754801213741302,
-0.011878021992743015,
-0.00014870884479023516,
0.11788775771856308,
0.007864887826144695,
-0.0664629191160202,
0.023032519966363907,
0.1835976541042328,
0.0063480413518846035,
-0.05402221158146858,
-0.07769936323165894,
0.025646498426795006,
-0.11522536724805832,
0.03335707634687424,
-0.0037840993609279394,
0.17032891511917114,
0.015825198963284492,
0.09229398518800735,
-0.03106898069381714,
0.1351260542869568,
-0.013094357214868069,
-0.006851257756352425,
0.010962991043925285,
0.12245772033929825,
-0.042394425719976425,
0.0787353590130806,
0.018009889870882034,
-0.1394142210483551,
0.05124864727258682,
-0.0980096086859703,
0.014403847977519035,
-0.10578920692205429,
0.06285757571458817,
-0.03143720328807831,
0.09781688451766968,
0.07094354182481766,
-0.0790608823299408,
-0.03180736303329468,
-0.04736855626106262,
0.0723169669508934,
0.017636002972722054,
-0.030962307006120682,
-0.03957940638065338,
-0.23717020452022552,
-0.0006466420600190759,
-0.10590814799070358,
0.0005784790264442563,
-0.1932273507118225,
0.008203865960240364,
-0.012413966469466686,
-0.09754195064306259,
0.010266942903399467,
0.06166744977235794,
0.08089647442102432,
0.04243773967027664,
-0.0009809014154598117,
-0.02256365492939949,
0.05540349334478378,
0.11687643826007843,
-0.1618070751428604,
-0.11055975407361984
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Esperanto
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Esperanto using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "eo", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained('gchhablani/wav2vec2-large-xlsr-eo')
model = Wav2Vec2ForCTC.from_pretrained('gchhablani/wav2vec2-large-xlsr-eo')
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
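If your own recording is not already sampled at 16 kHz, a minimal sketch for transcribing a single local file could look like the following (the file name `my_recording.wav` is only a placeholder, not part of this repository):

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained('gchhablani/wav2vec2-large-xlsr-eo')
model = Wav2Vec2ForCTC.from_pretrained('gchhablani/wav2vec2-large-xlsr-eo')

# "my_recording.wav" is a hypothetical path; replace it with your own mono audio file.
speech_array, sampling_rate = torchaudio.load("my_recording.wav")
if sampling_rate != 16_000:
    # Resample whatever rate the file uses to the 16 kHz the model expects.
    speech_array = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array)

inputs = processor(speech_array.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
print("Prediction:", processor.batch_decode(torch.argmax(logits, dim=-1)))
```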
## Evaluation
The model can be evaluated as follows on the Esperanto test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
import jiwer
# Compute WER chunk by chunk so very long test sets do not require one huge alignment in memory.
def chunked_wer(targets, predictions, chunk_size=None):
if chunk_size is None: return jiwer.wer(targets, predictions)
start = 0
end = chunk_size
H, S, D, I = 0, 0, 0, 0
while start < len(targets):
chunk_metrics = jiwer.compute_measures(targets[start:end], predictions[start:end])
H = H + chunk_metrics["hits"]
S = S + chunk_metrics["substitutions"]
D = D + chunk_metrics["deletions"]
I = I + chunk_metrics["insertions"]
start += chunk_size
end += chunk_size
return float(S + D + I) / float(H + S + D)
test_dataset = load_dataset("common_voice", "eo", split="test")  # "eo" is the ISO language code for Esperanto; see [this](https://huggingface.co/languages) site for other codes.
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained('gchhablani/wav2vec2-large-xlsr-eo')
model = Wav2Vec2ForCTC.from_pretrained('gchhablani/wav2vec2-large-xlsr-eo')
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\„\«\(\»\)\’\']'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower().replace('—',' ').replace('–',' ')
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * chunked_wer(predictions=result["pred_strings"], targets=result["sentence"],chunk_size=5000)))
```
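To see what the `chunked_wer` helper above computes, here is a tiny, made-up check (it reuses the function and the `jiwer` import from the evaluation snippet; the sentences are invented for illustration):

```python
refs  = ["la hundo kuras rapide", "mi amas kafon"]
preds = ["la hundo kuris rapide", "mi amas kafon"]

# With a small chunk_size the error counts are accumulated chunk by chunk,
# which keeps memory bounded on very long test sets; the result matches plain WER.
print(chunked_wer(refs, preds, chunk_size=1))  # 1 substitution over 7 reference words ≈ 0.14
```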
**Test Result**: 10.13 %
## Training
The Common Voice `train` and `validation` datasets were used for training. The code can be found [here](https://github.com/gchhablani/wav2vec2-week/blob/main/fine-tune-xlsr-wav2vec2-on-esperanto-asr-with-transformers-final.ipynb).
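For reference, the combined training split can be loaded with the `datasets` library roughly as follows (this is only a sketch of the data-loading step, not the author's full training script):

```python
from datasets import load_dataset

# The Common Voice Esperanto train and validation splits were concatenated for fine-tuning.
train_dataset = load_dataset("common_voice", "eo", split="train+validation")
print(train_dataset)
```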
|
{"language": "eo", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Wav2Vec2 Large 53 Esperanto by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice eo", "type": "common_voice", "args": "eo"}, "metrics": [{"type": "wer", "value": 10.13, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-eo
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"eo",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"eo"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #eo #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Esperanto
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Esperanto using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Esperanto test data of Common Voice.
Test Result: 10.13 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training. The code can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Esperanto\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Esperanto using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 10.13 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #eo #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Esperanto\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Esperanto using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 10.13 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
80,
63,
20,
29,
30
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #eo #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Esperanto\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Esperanto using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 10.13 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
-0.16733308136463165,
0.08048710227012634,
-0.0028541393112391233,
-0.004466724116355181,
0.11843447387218475,
-0.04720776528120041,
0.13254572451114655,
0.09248120337724686,
0.018039386719465256,
-0.022427255287766457,
0.005296843592077494,
0.07294576615095139,
0.04451729729771614,
0.13246631622314453,
0.011050814762711525,
-0.2017822265625,
-0.01702948659658432,
0.0445227324962616,
0.0319436676800251,
0.09422668814659119,
0.11503401398658752,
-0.07787379622459412,
-0.0008387087727896869,
0.09301043301820755,
-0.13465313613414764,
0.09376562386751175,
0.006119367200881243,
-0.1536330282688141,
0.12721996009349823,
0.052317772060632706,
0.10187164694070816,
0.05528587847948074,
0.07087895274162292,
-0.14222055673599243,
0.016554497182369232,
0.02217200957238674,
0.0008558914414606988,
0.0115056112408638,
0.07006758451461792,
-0.04706012085080147,
0.03327442705631256,
0.09605143219232559,
-0.027211733162403107,
0.07266094535589218,
-0.06624405086040497,
-0.23351514339447021,
0.030247975140810013,
-0.06475111097097397,
0.12043943256139755,
0.1673537641763687,
-0.04346451908349991,
0.09491774439811707,
-0.11182766407728195,
0.07815307378768921,
0.11216992884874344,
-0.13539265096187592,
0.02352028712630272,
0.030584074556827545,
0.050348542630672455,
0.08882716298103333,
-0.022103326395154,
-0.008579778485000134,
0.0687599703669548,
0.0047493730671703815,
0.012020167894661427,
-0.0657973438501358,
-0.1967495083808899,
-0.024195775389671326,
-0.1254095882177353,
-0.03871814161539078,
0.19538342952728271,
-0.017825040966272354,
-0.07234404236078262,
-0.09490086883306503,
-0.0269970390945673,
-0.003928863909095526,
0.024456096813082695,
-0.07292915880680084,
0.02539895847439766,
0.02842552401125431,
0.02275754138827324,
-0.04409938305616379,
-0.10523773729801178,
-0.1346408575773239,
0.006010523531585932,
0.11996952444314957,
0.02177366241812706,
-0.0064644538797438145,
-0.12566301226615906,
0.056804556399583817,
-0.05478079244494438,
-0.0655163824558258,
-0.01564270257949829,
0.04582147300243378,
-0.018576137721538544,
0.04803977534174919,
-0.07244843244552612,
-0.16410306096076965,
0.050070565193891525,
-0.04522065073251724,
0.0032324367202818394,
0.005090727470815182,
0.0070380219258368015,
0.0657215267419815,
0.045677561312913895,
0.11470691859722137,
-0.0070977830328047276,
-0.03482446447014809,
0.03481860086321831,
-0.03132788464426994,
-0.04598168656229973,
-0.02675020694732666,
-0.12089209258556366,
-0.049267806112766266,
-0.01296920608729124,
0.06616372615098953,
-0.04354209825396538,
-0.04116496071219444,
-0.05431045964360237,
-0.06768617033958435,
0.04167467728257179,
-0.11112090200185776,
-0.012293720617890358,
0.0742620974779129,
-0.00937349908053875,
0.10449688136577606,
0.03191518783569336,
0.030056528747081757,
-0.11842340975999832,
-0.08808618783950806,
0.0029174911323934793,
0.04412386193871498,
-0.06167372688651085,
-0.10770384967327118,
-0.0009832805953919888,
-0.056486621499061584,
0.021686362102627754,
-0.13910041749477386,
-0.1174464300274849,
-0.09960034489631653,
-0.037058085203170776,
0.04339604079723358,
-0.0013970626750960946,
-0.138682022690773,
-0.00322951702401042,
-0.05110577121376991,
-0.06403130292892456,
-0.01566726341843605,
-0.04718071222305298,
0.051451023668050766,
0.06537266820669174,
0.05635971203446388,
0.008801366202533245,
0.08822977542877197,
-0.0887887179851532,
-0.09169728308916092,
-0.021387288346886635,
0.14436303079128265,
-0.03008417598903179,
-0.05765809863805771,
-0.07453662902116776,
-0.07149520516395569,
-0.04282683506608009,
0.11045652627944946,
0.06083878129720688,
0.10548847913742065,
-0.22212253510951996,
-0.12266626209020615,
0.1981259286403656,
-0.12002590298652649,
-0.02405884861946106,
0.15619099140167236,
0.003309838706627488,
0.12402292340993881,
0.11011898517608643,
0.26224055886268616,
0.12869787216186523,
-0.14798568189144135,
0.07620896399021149,
0.011375179514288902,
0.00622265599668026,
-0.1352182924747467,
0.0932295173406601,
-0.08450744301080704,
-0.0017424889374524355,
0.033059582114219666,
-0.0866892859339714,
0.0777125135064125,
-0.017256584018468857,
-0.03408889099955559,
0.010027497075498104,
-0.06496074795722961,
0.03126097097992897,
0.03546157851815224,
0.046186916530132294,
-0.03372154384851456,
-0.06054513901472092,
0.09106411784887314,
0.12635740637779236,
-0.14860737323760986,
0.041564907878637314,
-0.11631817370653152,
0.06316271424293518,
-0.06906723231077194,
-0.038646478205919266,
-0.14625997841358185,
0.20348533987998962,
-0.006121678277850151,
0.043101031333208084,
0.035281695425510406,
0.16145892441272736,
-0.00990165676921606,
0.019517771899700165,
0.006951541174203157,
-0.057583652436733246,
-0.04741249233484268,
-0.008337498642504215,
-0.0315149761736393,
-0.047993987798690796,
0.0036309396382421255,
-0.05782722309231758,
0.021118924021720886,
-0.12399446964263916,
-0.002297027502208948,
0.0036575086414813995,
0.010733946226537228,
0.0019882062915712595,
-0.005287500564008951,
0.038286905735731125,
0.12481781095266342,
-0.023020559921860695,
0.003919834271073341,
0.04067005589604378,
0.026125090196728706,
-0.010006114840507507,
0.10006147623062134,
-0.11056262254714966,
0.04717465117573738,
0.11260032653808594,
-0.09790602326393127,
0.0025876369327306747,
0.07895360887050629,
-0.006713098380714655,
0.0259271077811718,
-0.10322796553373337,
-0.03411058709025383,
0.3545953631401062,
-0.0401063933968544,
0.14925682544708252,
-0.11074017733335495,
0.016022393479943275,
0.014929632656276226,
-0.07706810534000397,
0.050605352967977524,
0.06929628551006317,
0.04030010104179382,
0.09079055488109589,
0.0581800602376461,
-0.04360540211200714,
-0.09672459214925766,
0.24894672632217407,
-0.01485554501414299,
-0.09925822168588638,
0.027209755033254623,
-0.025686688721179962,
-0.023910537362098694,
0.038551442325115204,
-0.20760399103164673,
-0.058086466044187546,
0.01815858855843544,
0.05700379237532616,
0.0991377905011177,
-0.14251922070980072,
0.016079392284154892,
0.03512651100754738,
-0.11239681392908096,
-0.1516529768705368,
0.031057873740792274,
-0.05833154544234276,
0.02283935621380806,
-0.1121271476149559,
-0.08157239854335785,
-0.01079422514885664,
-0.03535722568631172,
-0.16760964691638947,
0.15808984637260437,
-0.10040358453989029,
-0.27985161542892456,
-0.17052771151065826,
0.11690938472747803,
0.04451938346028328,
0.04964711517095566,
0.14125798642635345,
-0.13160790503025055,
0.000314966106088832,
-0.021214907988905907,
0.09486356377601624,
0.001012178254313767,
-0.04358172044157982,
0.009967347607016563,
0.03158103674650192,
0.031750429421663284,
-0.17669512331485748,
-0.002269109943881631,
-0.028071457520127296,
-0.05249444395303726,
-0.00345664587803185,
-0.03677131235599518,
0.02711152844130993,
0.20623892545700073,
0.023124631494283676,
0.011545923538506031,
-0.0255596823990345,
0.15790188312530518,
-0.09094756096601486,
-0.0482926219701767,
0.23829300701618195,
0.016085445880889893,
-0.024355286732316017,
0.02564726583659649,
0.021623563021421432,
-0.0811995342373848,
-0.007865875028073788,
-0.03201280161738396,
-0.06699741631746292,
-0.2704181671142578,
-0.10717539489269257,
-0.050034791231155396,
-0.06562396883964539,
-0.03842678666114807,
0.013907291926443577,
0.06515181064605713,
0.025165705010294914,
0.025545231997966766,
-0.05017835274338722,
0.013140658847987652,
0.0046155923046171665,
0.07504073530435562,
-0.01341578084975481,
0.06295778602361679,
-0.0449991375207901,
-0.05784275382757187,
0.02664164826273918,
0.04591599479317665,
0.1151382103562355,
0.047258198261260986,
0.11394332349300385,
0.08092400431632996,
0.059448208659887314,
0.0912742018699646,
0.07710816711187363,
-0.0328843779861927,
0.006638445891439915,
-0.013719078153371811,
-0.037851732224226,
-0.04062510281801224,
0.034228209406137466,
0.10897184163331985,
-0.02860936149954796,
-0.10309235006570816,
-0.05702411010861397,
0.011694364249706268,
0.13782213628292084,
0.06374707818031311,
-0.22139406204223633,
-0.06197551265358925,
-0.02256663329899311,
-0.038772061467170715,
0.0010278960689902306,
0.07953007519245148,
0.15924790501594543,
-0.11197911947965622,
-0.014502567239105701,
0.0475311316549778,
0.09575680643320084,
-0.02880781888961792,
0.018080100417137146,
-0.1039222851395607,
0.049724362790584564,
0.017077598720788956,
0.09827946126461029,
-0.251417338848114,
0.1871180236339569,
-0.0038468881975859404,
0.13651107251644135,
-0.03634992614388466,
0.01819881796836853,
-0.004546074196696281,
0.04327624663710594,
0.12357033044099808,
0.008158327080309391,
0.021242443472146988,
-0.09732063114643097,
-0.05537601187825203,
0.051314663141965866,
-0.016437573358416557,
-0.01799554005265236,
0.017471501603722572,
0.028317591175436974,
0.0012345421127974987,
0.0017795581370592117,
-0.08454247564077377,
-0.13918448984622955,
-0.04499524459242821,
-0.012697671540081501,
0.1482560634613037,
0.11447525769472122,
-0.016017727553844452,
-0.08987122029066086,
-0.04260113462805748,
0.07718010991811752,
-0.14001663029193878,
-0.07705853134393692,
-0.07563300430774689,
-0.0643591657280922,
0.07003553956747055,
-0.07785075902938843,
0.01851016841828823,
0.13152195513248444,
0.11504223942756653,
-0.020412974059581757,
-0.02646222710609436,
0.08347156643867493,
-0.13036152720451355,
-0.06633439660072327,
-0.03630894422531128,
0.14277836680412292,
0.0944322794675827,
0.04847758635878563,
0.062031932175159454,
-0.014872358180582523,
-0.006785917095839977,
-0.05485117435455322,
0.013114187866449356,
0.14210598170757294,
-0.09153027832508087,
0.020650528371334076,
-0.012571269646286964,
-0.1487192064523697,
-0.09676066040992737,
-0.05862392112612724,
0.17028652131557465,
0.06939443200826645,
-0.04253813624382019,
0.13955728709697723,
0.22276216745376587,
-0.1254129558801651,
-0.20690567791461945,
-0.00006811170896980911,
0.13165245950222015,
0.12349237501621246,
-0.007944058626890182,
-0.2212609350681305,
0.055830419063568115,
0.02241009660065174,
-0.006943514104932547,
-0.04541923105716705,
-0.3137394189834595,
-0.1289607137441635,
0.14973920583724976,
0.013238183222711086,
0.17461690306663513,
-0.016942352056503296,
-0.028861235827207565,
-0.022817088291049004,
-0.0395306833088398,
0.029767895117402077,
-0.10396786779165268,
0.09604042768478394,
0.039336975663900375,
0.04916752129793167,
0.037741757929325104,
-0.012747797183692455,
0.07448132336139679,
0.09430299699306488,
-0.02240127883851528,
0.02813948504626751,
0.04548485204577446,
0.02351602166891098,
0.026158491149544716,
0.10926654934883118,
-0.03266433626413345,
0.03214603662490845,
-0.16783730685710907,
-0.10689253360033035,
-0.06004801020026207,
0.09081537276506424,
0.03690500184893608,
-0.04712614044547081,
0.051949672400951385,
-0.06499400734901428,
-0.00006939646118553355,
0.002600681269541383,
0.01012361142784357,
-0.09583931416273117,
0.03253300115466118,
0.16812105476856232,
0.1635223776102066,
-0.05251435562968254,
-0.13718217611312866,
-0.008880173787474632,
-0.01567269116640091,
0.12621185183525085,
-0.0871795266866684,
0.03311369568109512,
0.06262961775064468,
0.03817424550652504,
0.10204306989908218,
0.03376280516386032,
-0.10127951949834824,
0.10373908281326294,
0.049652907997369766,
-0.013689505867660046,
-0.07433728873729706,
-0.06441725045442581,
-0.06975561380386353,
0.04850199818611145,
0.07256540656089783,
0.1247548833489418,
-0.0628814697265625,
-0.05760376900434494,
-0.05173353850841522,
0.0002885714638978243,
-0.10893657058477402,
0.211581289768219,
0.06585194170475006,
0.057012271136045456,
-0.09363774955272675,
0.044402461498975754,
0.005211584735661745,
-0.07298660278320312,
0.060393691062927246,
-0.009254016913473606,
-0.06549742072820663,
-0.07325612008571625,
0.0058188424445688725,
0.14058102667331696,
-0.0265232902020216,
-0.1555396169424057,
-0.13117198646068573,
-0.05823981761932373,
-0.01122029684484005,
0.07012761384248734,
0.06406962871551514,
-0.012835257686674595,
-0.07889110594987869,
-0.04177810251712799,
-0.08518390357494354,
0.018759293481707573,
0.05673450976610184,
-0.0135217709466815,
-0.08859612792730331,
0.20303316414356232,
0.08267412334680557,
0.03174044191837311,
-0.041081614792346954,
-0.0742909386754036,
-0.05807831883430481,
0.08060606569051743,
-0.03223650902509689,
0.021899119019508362,
-0.05204911157488823,
-0.007170974276959896,
0.006709778215736151,
-0.07725339382886887,
-0.010936398059129715,
0.11757057905197144,
-0.10259287804365158,
0.06541787832975388,
0.003657236695289612,
0.08592066913843155,
-0.04891655594110489,
0.016691947355866432,
-0.014572126790881157,
-0.0438317134976387,
0.056493744254112244,
0.09732253104448318,
-0.10217905789613724,
0.11421244591474533,
-0.21955867111682892,
-0.01656205579638481,
0.08618094027042389,
0.07474078983068466,
0.012470245361328125,
-0.073586106300354,
0.07352642714977264,
0.0983777865767479,
0.04367876425385475,
-0.05437770485877991,
0.08843334019184113,
-0.0728614330291748,
-0.000809582183137536,
-0.04661022499203682,
-0.0443900041282177,
-0.029118768870830536,
0.038046546280384064,
0.09824958443641663,
0.10263911634683609,
0.14547353982925415,
-0.10249479115009308,
0.12316206842660904,
-0.10135651379823685,
0.0034126692917197943,
-0.04035648703575134,
0.004318528808653355,
-0.11704415827989578,
-0.07940443605184555,
0.05921214073896408,
-0.045635294169187546,
0.12537534534931183,
0.06287579238414764,
0.1419481486082077,
-0.042647428810596466,
-0.14308273792266846,
0.058382656425237656,
-0.0008581093861721456,
0.23737184703350067,
0.023879367858171463,
0.0410788469016552,
-0.057664692401885986,
0.04582037031650543,
0.0703682079911232,
0.12524856626987457,
-0.023156896233558655,
0.14340168237686157,
0.007742757443338633,
0.1294732242822647,
0.07029690593481064,
-0.04440564289689064,
-0.09595127403736115,
-0.08559755235910416,
-0.02710098773241043,
0.02840375155210495,
-0.04670164734125137,
0.1188943088054657,
0.10918072611093521,
-0.07967475056648254,
0.08010223507881165,
0.051514554768800735,
-0.0650034248828888,
-0.1548340618610382,
-0.08393210172653198,
-0.05715935304760933,
-0.20057356357574463,
0.005644362885504961,
-0.11005289852619171,
0.012295008637011051,
0.007746435701847076,
0.025178633630275726,
-0.038472190499305725,
0.12539860606193542,
0.048601724207401276,
-0.1250898540019989,
0.06107655167579651,
-0.07294837385416031,
0.045794226229190826,
-0.10943770408630371,
0.03815237805247307,
0.1275138258934021,
0.04363156855106354,
0.05782851204276085,
0.004922159016132355,
-0.07414648681879044,
0.01429734192788601,
-0.06793871521949768,
-0.042569104582071304,
-0.017632968723773956,
-0.014136895537376404,
0.03631921485066414,
0.12029334157705307,
0.12463094294071198,
-0.09862489998340607,
0.03580546751618385,
0.12443316727876663,
-0.041922036558389664,
-0.11648868769407272,
-0.11602447181940079,
0.15966829657554626,
0.12342200428247452,
0.0706854835152626,
0.0070127504877746105,
-0.08590686321258545,
-0.05313269421458244,
0.24543270468711853,
0.16908308863639832,
0.07796170562505722,
0.017594758421182632,
-0.002537249121814966,
-0.0005779267521575093,
-0.0021560543682426214,
0.06501612812280655,
0.08177091926336288,
0.28649261593818665,
0.00506211631000042,
-0.019071023911237717,
-0.08430221676826477,
-0.03982151299715042,
-0.02515227720141411,
0.045094236731529236,
-0.08625693619251251,
-0.1114792749285698,
0.02517145499587059,
0.11460620909929276,
-0.055046409368515015,
-0.11891257762908936,
-0.05630602687597275,
-0.08672486990690231,
-0.0789380818605423,
-0.016561197116971016,
0.04466712474822998,
0.11599113792181015,
0.045261528342962265,
-0.03955942392349243,
0.002765270881354809,
0.17840659618377686,
0.008063551038503647,
-0.09237592667341232,
-0.0619824044406414,
-0.006352352909743786,
-0.20672042667865753,
0.032693248242139816,
-0.02346084453165531,
0.17470861971378326,
0.0250078197568655,
0.11643426865339279,
-0.0038617057725787163,
0.17411182820796967,
-0.0325433611869812,
-0.018655259162187576,
0.08565491437911987,
0.06223765388131142,
-0.0522795207798481,
0.08487042039632797,
-0.006703831721097231,
-0.15519815683364868,
0.06306304782629013,
-0.06906506419181824,
-0.011023660190403461,
-0.08658559620380402,
0.040447235107421875,
-0.07995874434709549,
0.06447772681713104,
0.0434727743268013,
-0.07440163940191269,
-0.06605245918035507,
-0.02745025046169758,
0.10259188711643219,
0.03100106120109558,
-0.036509666591882706,
-0.03423742204904556,
-0.2469210922718048,
-0.027068160474300385,
-0.056730467826128006,
0.00691754836589098,
-0.16989843547344208,
0.02298537641763687,
-0.021667854860424995,
-0.09870762377977371,
-0.02639886736869812,
0.0012773098424077034,
0.0671665295958519,
0.028754109516739845,
-0.01416686363518238,
0.02413279563188553,
0.019745346158742905,
0.13544988632202148,
-0.13997860252857208,
-0.1325104981660843
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Gujarati
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Gujarati using the [OpenSLR SLR78](http://openslr.org/78/) dataset. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Gujarati `sentence` and `path` fields:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# test_dataset_eval = # TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET.
# For a sample, see the Colab link in the Training section.
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-gu")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-gu")
resampler = torchaudio.transforms.Resample(48_000, 16_000) # The original data has a 48,000 Hz sampling rate. Change this according to your input.
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset_eval = test_dataset_eval.map(speech_file_to_array_fn)
inputs = processor(test_dataset_eval["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset_eval["sentence"][:2])
```
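Since the snippet above leaves the dataset loading as a TODO, one possible way to build `test_dataset_eval` from a local manifest is sketched below (the file name `gu_test.tsv` and its layout are assumptions, not part of OpenSLR itself):

```python
from datasets import load_dataset

# Hypothetical tab-separated manifest with "path" and "sentence" columns,
# where "path" points to the downloaded OpenSLR SLR78 audio files on disk.
test_dataset_eval = load_dataset("csv", data_files={"test": "gu_test.tsv"}, delimiter="\t")["test"]
print(test_dataset_eval.column_names)  # expected: ['path', 'sentence']
```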
## Evaluation
The model can be evaluated as follows on 10% of the Gujarati data on OpenSLR.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
# test_dataset = # TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For a sample, see the Colab link in the Training section.
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-gu")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-gu")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\–\…\'\_\’]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"),
attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 23.55 %
## Training
90% of the OpenSLR Gujarati Male+Female dataset was used for training, after removing a few examples that contained Roman characters.
The Colab notebook used for training can be found [here](https://colab.research.google.com/drive/1fRQlgl4EPR4qKGScgza3MpWgbL5BeWtn?usp=sharing).
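A minimal sketch of the Roman-character filtering step might look like this (it illustrates the idea, not the author's exact code; the example sentences are made up):

```python
import re

def contains_roman_chars(sentence):
    # Any Latin letter marks the example for removal.
    return re.search(r"[A-Za-z]", sentence) is not None

sentences = ["આ એક ઉદાહરણ છે", "આ example mixed છે"]  # toy data: the second entry would be dropped
filtered = [s for s in sentences if not contains_roman_chars(s)]
print(filtered)
# With a `datasets.Dataset`, the same idea would be:
# dataset = dataset.filter(lambda ex: not contains_roman_chars(ex["sentence"]))
```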
|
{"language": "gu", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["openslr"], "metrics": ["wer"], "model-index": [{"name": "XLSR Wav2Vec2 Large 53 Gujarati by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "OpenSLR gu", "type": "openslr"}, "metrics": [{"type": "wer", "value": 23.55, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-gu
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"gu",
"dataset:openslr",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"gu"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #gu #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Gujarati
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Gujarati using the OpenSLR SLR78 dataset. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Gujarati 'sentence' and 'path' fields:
## Evaluation
The model can be evaluated as follows on 10% of the Gujarati data on OpenSLR.
Test Result: 23.55 %
## Training
90% of the OpenSLR Gujarati Male+Female dataset was used for training, after removing a few examples that contained Roman characters.
The Colab notebook used for training can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Gujarati\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Gujarati using the OpenSLR SLR78 dataset. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Gujarati 'sentence' and 'path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 23.55 %",
"## Training\n\n90% of the OpenSLR Gujarati Male+Female dataset was used for training, after removing few examples that contained Roman characters.\nThe colab notebook used for training can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #gu #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Gujarati\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Gujarati using the OpenSLR SLR78 dataset. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Gujarati 'sentence' and 'path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 23.55 %",
"## Training\n\n90% of the OpenSLR Gujarati Male+Female dataset was used for training, after removing few examples that contained Roman characters.\nThe colab notebook used for training can be found here."
] |
[
78,
69,
41,
29,
44
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #gu #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Gujarati\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Gujarati using the OpenSLR SLR78 dataset. When using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Gujarati 'sentence' and 'path' fields:## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 23.55 %## Training\n\n90% of the OpenSLR Gujarati Male+Female dataset was used for training, after removing few examples that contained Roman characters.\nThe colab notebook used for training can be found here."
] |
[
-0.15924136340618134,
0.08624093979597092,
0.001232543378137052,
0.016768136993050575,
0.05502745881676674,
-0.02654704451560974,
0.09700138866901398,
0.11256138235330582,
-0.0002939591067843139,
0.0250511784106493,
0.08652672171592712,
-0.021084830164909363,
0.05346959829330444,
0.11917193979024887,
-0.013632760383188725,
-0.10013718158006668,
0.007259096018970013,
-0.023093661293387413,
0.01566888950765133,
0.1011759489774704,
0.1026538833975792,
-0.07310350239276886,
-0.018496928736567497,
0.07685660570859909,
-0.07946718484163284,
0.0355985127389431,
0.0071335481479763985,
-0.06904353946447372,
0.08124106377363205,
0.06657807528972626,
0.07156876474618912,
0.04301373288035393,
0.08479779213666916,
-0.20283348858356476,
0.02294992469251156,
0.0063189188949763775,
0.037946272641420364,
0.004529118537902832,
0.07466431707143784,
0.028447609394788742,
0.089224673807621,
0.014704965986311436,
-0.017215611413121223,
0.08278107643127441,
-0.028721364215016365,
-0.24613864719867706,
-0.03095833957195282,
-0.07779476791620255,
0.02253296598792076,
0.17160239815711975,
-0.045052073895931244,
0.15114346146583557,
-0.10523044317960739,
0.10216348618268967,
0.0954643115401268,
-0.1262003481388092,
-0.018312757834792137,
0.004753198474645615,
0.03574439138174057,
0.09081478416919708,
-0.12030751258134842,
-0.002944058272987604,
0.08476906269788742,
-0.007306975312530994,
-0.03421897441148758,
0.009268548339605331,
-0.14688672125339508,
-0.008989006280899048,
-0.0776621401309967,
-0.027568703517317772,
0.23265929520130157,
0.014824045822024345,
-0.08557003736495972,
-0.11601338535547256,
-0.0647617056965828,
-0.008362747728824615,
0.012718775309622288,
0.0277173463255167,
-0.040715329349040985,
0.04788067191839218,
0.022167298942804337,
-0.06545085459947586,
-0.12190989404916763,
-0.11357743293046951,
0.05291277915239334,
0.1176084578037262,
0.04176108539104462,
0.026559356600046158,
-0.05322250351309776,
0.023032164201140404,
-0.052190911024808884,
-0.1008920967578888,
-0.056630413979291916,
0.01819169707596302,
-0.08735363185405731,
0.026674725115299225,
-0.08298300206661224,
-0.15646280348300934,
0.06465578079223633,
-0.026531130075454712,
-0.04026006907224655,
0.04033300653100014,
0.005396798253059387,
0.0072863223031163216,
-0.014615548774600029,
0.08098766207695007,
-0.032758165150880814,
0.03326267749071121,
0.03562663123011589,
-0.02396349422633648,
0.00008601228182669729,
-0.00832296907901764,
0.01788627728819847,
-0.10061576217412949,
-0.0015964596532285213,
0.06610031425952911,
-0.08298150449991226,
-0.008087958209216595,
0.012093557976186275,
0.01763675920665264,
-0.005868668667972088,
-0.12406477332115173,
-0.03248406574130058,
0.0321236215531826,
0.018367033451795578,
0.09278594702482224,
0.03109343722462654,
0.0031469787936657667,
-0.013058283366262913,
-0.11218931525945663,
-0.01691289246082306,
0.03325067460536957,
-0.1022014394402504,
-0.09752411395311356,
-0.015019762329757214,
-0.08213626593351364,
-0.03494078293442726,
-0.09521322697401047,
-0.19473029673099518,
-0.04910500720143318,
-0.031403549015522,
-0.009824282489717007,
0.06451132148504257,
-0.1135740801692009,
-0.029860058799386024,
0.01596767269074917,
-0.023288261145353317,
0.10687649250030518,
-0.0338420644402504,
0.09743677079677582,
0.054202307015657425,
0.06498085707426071,
-0.003101927461102605,
0.06054374948143959,
-0.06077291816473007,
-0.013702389784157276,
0.1414097398519516,
0.11942921578884125,
-0.08023766428232193,
-0.03567080199718475,
-0.09830909222364426,
-0.05861666798591614,
-0.05327794700860977,
0.037448350340127945,
0.08772514015436172,
0.09035515040159225,
-0.24372906982898712,
-0.0012893248349428177,
0.2591724395751953,
-0.14803831279277802,
-0.007102688774466515,
0.1425512582063675,
-0.0027826563455164433,
0.10788369178771973,
0.061257798224687576,
0.155344620347023,
0.10917045921087265,
-0.14366281032562256,
0.03871436417102814,
0.04841584339737892,
0.049949321895837784,
-0.015413036569952965,
0.10456739366054535,
-0.07801135629415512,
0.031544458121061325,
0.030035432428121567,
-0.06155001372098923,
0.07681635767221451,
-0.05894806236028671,
-0.07207382470369339,
-0.06074385717511177,
-0.09833921492099762,
0.008945208974182606,
0.012889762409031391,
0.057713501155376434,
-0.0471160002052784,
-0.05649244785308838,
-0.018684793263673782,
0.1739928126335144,
-0.12254539877176285,
0.03977760672569275,
-0.12527525424957275,
0.08890388906002045,
-0.08983509987592697,
-0.03203441575169563,
-0.12279890477657318,
0.11369802802801132,
0.01913738250732422,
0.048121996223926544,
-0.033977191895246506,
0.18666139245033264,
0.049266718327999115,
0.011633949354290962,
-0.02490908093750477,
-0.008927746675908566,
0.029660770669579506,
-0.016382567584514618,
-0.041284091770648956,
-0.11265548318624496,
0.007988444529473782,
-0.05721769109368324,
0.08575425297021866,
-0.20092757046222687,
-0.004057594574987888,
0.035890813916921616,
0.027743805199861526,
0.004386678338050842,
-0.03989091515541077,
0.05689496174454689,
0.04250182956457138,
0.004513842985033989,
-0.03135271742939949,
0.023548278957605362,
0.02894444763660431,
-0.02713119424879551,
0.08968520164489746,
-0.030369797721505165,
-0.017977412790060043,
0.051590729504823685,
-0.01014877948909998,
-0.08828827738761902,
0.07141313701868057,
-0.030322203412652016,
-0.014996788464486599,
-0.11512689292430878,
-0.0550307035446167,
0.1863078773021698,
-0.006935599725693464,
0.11918623745441437,
-0.05813499167561531,
-0.014461965300142765,
0.017235135659575462,
-0.08567994832992554,
0.03873651847243309,
0.06926550716161728,
-0.0012457255506888032,
-0.05643122270703316,
0.06041315570473671,
-0.08334024995565414,
-0.15913325548171997,
0.2250104397535324,
-0.037488218396902084,
-0.10350474715232849,
0.042776305228471756,
0.08061708509922028,
-0.06960522383451462,
0.021212415769696236,
-0.1767635941505432,
-0.011147648096084595,
0.043999265879392624,
0.06219182908535004,
0.035023048520088196,
-0.12360271066427231,
-0.006805430632084608,
0.03956839069724083,
-0.07044733315706253,
-0.09819851070642471,
0.10910503566265106,
-0.08980052173137665,
0.06823237985372543,
-0.08820793032646179,
0.02660498581826687,
0.00008546561002731323,
-0.037851929664611816,
-0.18614798784255981,
0.12577711045742035,
-0.0776183009147644,
-0.07711253315210342,
-0.13341137766838074,
0.024161985144019127,
0.07544272392988205,
-0.006280891597270966,
0.08102107793092728,
-0.14827848970890045,
-0.06514661014080048,
-0.053675126284360886,
0.0063468581065535545,
-0.010722423903644085,
-0.009092428721487522,
0.0014004420954734087,
-0.005356389097869396,
0.063146211206913,
-0.1458645910024643,
0.03585813194513321,
0.04525890573859215,
-0.05575171858072281,
0.03232700377702713,
-0.01737436093389988,
0.03152432292699814,
0.03792651370167732,
-0.03741548955440521,
0.016750434413552284,
0.027015047147870064,
0.15344573557376862,
-0.05074445530772209,
-0.022071154788136482,
0.1708216816186905,
0.01468137837946415,
-0.04373275488615036,
0.03650384023785591,
-0.018145525828003883,
-0.0929640382528305,
0.008449920453131199,
-0.005988556891679764,
-0.06489113718271255,
-0.24918630719184875,
-0.051951926201581955,
-0.09718598425388336,
-0.09937193989753723,
-0.03646248206496239,
-0.011046509258449078,
0.034615810960531235,
0.041629038751125336,
-0.0352075956761837,
-0.034079838544130325,
-0.045065999031066895,
-0.008560574613511562,
0.12273844331502914,
-0.018322937190532684,
0.08401833474636078,
-0.10079533606767654,
0.017838213592767715,
0.03851636126637459,
-0.038771454244852066,
0.2320985049009323,
-0.007386969868093729,
0.15966884791851044,
0.1218423992395401,
0.119059719145298,
0.05667934566736221,
0.030214328318834305,
-0.03799303248524666,
-0.028000961989164352,
0.025944342836737633,
-0.06532226502895355,
-0.049608565866947174,
0.08140354603528976,
0.058073557913303375,
-0.03938675299286842,
0.00611426355317235,
-0.047320153564214706,
0.02427603118121624,
0.18408043682575226,
0.05916575714945793,
-0.20124034583568573,
-0.07825006544589996,
-0.015987399965524673,
-0.10131891071796417,
-0.0389733761548996,
-0.0002832069876603782,
0.2008277326822281,
-0.08584647625684738,
0.0053955973125994205,
0.03470822423696518,
0.07649320363998413,
0.011725395917892456,
-0.015066883526742458,
-0.07505997270345688,
0.007796278689056635,
-0.01297159492969513,
0.10465609282255173,
-0.2940201461315155,
0.2376244217157364,
0.013139464892446995,
0.11666146665811539,
-0.0028640853706747293,
-0.023847704753279686,
0.013544430024921894,
-0.07141619175672531,
0.10034743696451187,
-0.017636766657233238,
-0.009006411768496037,
-0.06864093244075775,
-0.05732543021440506,
0.07661004364490509,
-0.0017742131603881717,
0.10192108154296875,
0.08279260247945786,
-0.013448499143123627,
0.042389821261167526,
0.042916007339954376,
0.07664617151021957,
-0.15109308063983917,
-0.03448992595076561,
-0.008861756883561611,
0.1537657380104065,
0.014499399811029434,
-0.043607331812381744,
-0.10413403064012527,
-0.11974900960922241,
0.10878562927246094,
-0.11706464737653732,
-0.08627299964427948,
-0.09006090462207794,
0.07821105420589447,
0.06634264439344406,
-0.039551619440317154,
0.02589539624750614,
0.06466447561979294,
0.0975654199719429,
-0.012025292962789536,
0.04823561757802963,
-0.009416284039616585,
-0.050594527274370193,
-0.12319876253604889,
-0.03061188943684101,
0.10615084320306778,
0.09891211986541748,
0.030482081696391106,
0.018376585096120834,
0.007822980172932148,
-0.00030430193874053657,
-0.07875999063253403,
0.01698419824242592,
0.1757662296295166,
-0.11964604258537292,
-0.056918010115623474,
0.01890924759209156,
-0.10346979647874832,
-0.023416783660650253,
-0.14274081587791443,
0.18785426020622253,
0.14341826736927032,
-0.05312351509928703,
0.12761880457401276,
0.11684902012348175,
-0.08013685792684555,
-0.1480063498020172,
0.027945758774876595,
0.11307282745838165,
0.1329507678747177,
-0.040022317320108414,
-0.17698730528354645,
0.006369425915181637,
0.006805071607232094,
-0.02810920774936676,
-0.027459939941763878,
-0.17716360092163086,
-0.13992922008037567,
0.09938298165798187,
0.07877279818058014,
0.164972186088562,
-0.023014429956674576,
-0.043227147310972214,
-0.035086095333099365,
-0.04738155007362366,
-0.045131247490644455,
-0.05554194748401642,
0.09393701702356339,
-0.004092373885214329,
0.05621613562107086,
0.04100386053323746,
-0.056489843875169754,
0.11557058244943619,
-0.006821824237704277,
0.010597475804388523,
-0.025696516036987305,
0.15798474848270416,
0.06464044004678726,
0.07067303359508514,
0.15295450389385223,
-0.09968248009681702,
0.036119040101766586,
-0.09231496602296829,
-0.10217135399580002,
-0.05001116544008255,
0.04630526527762413,
0.017723826691508293,
-0.04953915253281593,
0.031774457544088364,
-0.033517416566610336,
0.044444482773542404,
-0.008637171238660812,
-0.11269531399011612,
-0.09003137052059174,
0.03918023779988289,
0.07785745710134506,
0.16769953072071075,
0.09519778192043304,
0.0226382277905941,
-0.027580438181757927,
-0.002537823049351573,
0.09466610848903656,
-0.13304610550403595,
-0.0037977416068315506,
0.05059991031885147,
0.09234335273504257,
0.10262226313352585,
-0.024512087926268578,
-0.06996957212686539,
0.09713279455900192,
0.028703873977065086,
0.014970976859331131,
-0.09212661534547806,
0.0008716267184354365,
0.03512439504265785,
-0.06647844612598419,
-0.058346930891275406,
0.09649699926376343,
-0.09840362519025803,
0.0056617711670696735,
-0.06062089279294014,
0.015709489583969116,
-0.11839687079191208,
0.24258878827095032,
0.042295150458812714,
0.05337369814515114,
-0.06415904313325882,
-0.024328192695975304,
-0.03700750321149826,
-0.02793494239449501,
0.0663318932056427,
0.007259340491145849,
-0.11846936494112015,
-0.05425979569554329,
0.02380182407796383,
0.058336291462183,
-0.003857456147670746,
-0.09713441878557205,
-0.05909893289208412,
-0.05645689368247986,
0.019314363598823547,
0.08619087189435959,
0.034635864198207855,
0.06384824961423874,
-0.062457375228405,
-0.051348328590393066,
-0.06878381967544556,
0.0640529990196228,
0.019107531756162643,
-0.04345255717635155,
-0.062473781406879425,
0.13801461458206177,
0.032704297453165054,
-0.02259465493261814,
-0.035133954137563705,
-0.03442532196640968,
0.011020089499652386,
0.07815175503492355,
-0.16175754368305206,
0.03427612781524658,
-0.026381460949778557,
-0.02914178930222988,
-0.015567613765597343,
-0.06278129667043686,
-0.005016989074647427,
0.08327914029359818,
-0.06025008484721184,
0.03404363989830017,
-0.015336666256189346,
0.08393359929323196,
-0.055118076503276825,
0.014322343282401562,
0.03142474964261055,
-0.0522124357521534,
0.04185705631971359,
0.09947935491800308,
-0.09611885249614716,
0.07116388529539108,
-0.23029978573322296,
0.03839237987995148,
0.02752835862338543,
0.06742994487285614,
-0.03126775100827217,
-0.1654656082391739,
0.04531209170818329,
0.0969216376543045,
0.05103111267089844,
0.019329963251948357,
0.1025165542960167,
-0.032254815101623535,
0.03472340106964111,
-0.0993553027510643,
0.024515429511666298,
-0.0690050721168518,
0.05592403933405876,
0.05939507856965065,
0.12237866222858429,
0.10799931734800339,
-0.08314452320337296,
0.06571120768785477,
-0.1368473321199417,
0.01895926706492901,
-0.024289611726999283,
0.02822386473417282,
-0.1191214919090271,
-0.0453345850110054,
0.08205185830593109,
-0.06494931131601334,
0.14655721187591553,
-0.0606696717441082,
-0.0023351560812443495,
-0.0005752044380642474,
-0.1394750028848648,
0.08617647737264633,
-0.006715152412652969,
0.3236235976219177,
0.03600660338997841,
0.06672462821006775,
-0.04398731142282486,
-0.008973927237093449,
-0.017110655084252357,
0.07437704503536224,
0.007762494497001171,
0.21429532766342163,
-0.048304907977581024,
0.0544276237487793,
-0.016254175454378128,
-0.15783452987670898,
-0.03138921037316322,
0.04767294600605965,
-0.10641724616289139,
0.022065158933401108,
-0.06535666435956955,
0.12134779989719391,
0.193451926112175,
-0.13445945084095,
0.057317811995744705,
-0.012963231652975082,
-0.07411868870258331,
-0.10792366415262222,
-0.10159168392419815,
-0.06557018309831619,
-0.12019628286361694,
0.07783845067024231,
-0.07747369259595871,
0.04718327894806862,
-0.01776769384741783,
0.03468111529946327,
-0.04483846575021744,
0.18491756916046143,
0.04025382548570633,
-0.09076030552387238,
0.0253509059548378,
-0.06535013765096664,
-0.02568306215107441,
-0.10432987660169601,
-0.010389599949121475,
0.14854499697685242,
-0.04307752102613449,
0.05582480505108833,
0.018767116591334343,
-0.03580446168780327,
-0.009630498476326466,
-0.07464032620191574,
-0.028472522273659706,
-0.008054622448980808,
0.019148364663124084,
0.07728651165962219,
0.19134759902954102,
0.10715784132480621,
-0.02672419138252735,
-0.004100162070244551,
0.0890672355890274,
-0.012823752127587795,
-0.11432831734418869,
-0.16047976911067963,
0.054159704595804214,
0.052853673696517944,
-0.00741744227707386,
0.03391169756650925,
-0.014673012308776379,
0.06404995918273926,
0.20518511533737183,
0.07837537676095963,
0.027268027886748314,
0.033828005194664,
-0.030916471034288406,
-0.016147490590810776,
-0.05941691994667053,
0.0705529972910881,
0.052253663539886475,
0.1551036685705185,
-0.0581108033657074,
-0.046383630484342575,
-0.09643064439296722,
-0.032141778618097305,
0.01757976971566677,
0.046976592391729355,
-0.06833554804325104,
-0.08036990463733673,
-0.01759963668882847,
0.10713116824626923,
-0.038770951330661774,
-0.11043936014175415,
-0.056244585663080215,
-0.0048875450156629086,
-0.07028894871473312,
0.005800252314656973,
-0.08272262662649155,
0.07356759160757065,
-0.03568197041749954,
-0.07053390890359879,
0.051886677742004395,
0.1525823026895523,
-0.00668367650359869,
-0.0206963699311018,
-0.04271332919597626,
0.05851907655596733,
-0.1490335911512375,
-0.027550052851438522,
-0.012897856533527374,
0.20931382477283478,
0.056563641875982285,
0.08973533660173416,
-0.024494493380188942,
0.19921110570430756,
0.025602085515856743,
-0.04434226453304291,
0.010485416278243065,
0.14691226184368134,
-0.03307431936264038,
0.14783765375614166,
0.04346299171447754,
-0.04859456419944763,
0.05506915599107742,
-0.0237235426902771,
-0.009254327975213528,
-0.13033437728881836,
0.09542534500360489,
-0.055219873785972595,
0.1229301393032074,
0.10320630669593811,
-0.095870740711689,
-0.06279391050338745,
-0.0584249421954155,
0.057106271386146545,
-0.013267507776618004,
-0.04324011877179146,
-0.05561019852757454,
-0.293860524892807,
0.017384279519319534,
-0.02338920719921589,
0.008388743735849857,
-0.2215878963470459,
-0.044323649257421494,
-0.033788107335567474,
-0.09065228700637817,
0.05812237784266472,
0.04873235896229744,
0.07519713044166565,
0.045235004276037216,
0.00823866669088602,
-0.098554827272892,
0.06227729469537735,
0.0985821858048439,
-0.10648065060377121,
-0.1111735925078392
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Hungarian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Hungarian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "hu", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-hu")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-hu")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Hungarian test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "hu", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-hu")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-hu")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\–\…]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)

# Run batched inference on the test set and decode the predictions.
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch

result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:.2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 46.75 %
## Training
The Common Voice `train` and `validation` datasets were used for training. The code can be found [here](https://github.com/gchhablani/wav2vec2-week/blob/main/fine-tune-xlsr-wav2vec2-on-hungarian-asr.ipynb). The notebook containing the code used for evaluation can be found [here](https://colab.research.google.com/drive/1esYvWS6IkTQFfRqi_b6lAJEycuecInHE?usp=sharing).
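For reference, the sketch below shows a minimal fine-tuning setup of the kind typically used for XLSR models. It is illustrative rather than the exact script behind this checkpoint: `train_dataset`, `eval_dataset`, `data_collator`, `compute_metrics`, and `processor` are assumed to have been prepared beforehand (as in the linked notebook), and the hyperparameters are common defaults, not necessarily the values used here.

```python
from transformers import Wav2Vec2ForCTC, Trainer, TrainingArguments

# Assumes `processor`, `data_collator`, `compute_metrics`,
# `train_dataset`, and `eval_dataset` were built beforehand
# (e.g. as in the linked training notebook).
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-xlsr-53",
    attention_dropout=0.1,
    hidden_dropout=0.1,
    feat_proj_dropout=0.0,
    mask_time_prob=0.05,
    layerdrop=0.1,
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
model.freeze_feature_extractor()  # keep the convolutional feature encoder frozen

training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xlsr-hu",
    group_by_length=True,
    per_device_train_batch_size=16,
    gradient_accumulation_steps=2,
    num_train_epochs=30,      # illustrative; adjust to the available compute
    fp16=True,
    learning_rate=3e-4,
    warmup_steps=500,
    evaluation_strategy="steps",
    save_steps=400,
    eval_steps=400,
    logging_steps=400,
    save_total_limit=2,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    compute_metrics=compute_metrics,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=processor.feature_extractor,
)
trainer.train()
```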
|
{"language": "hu", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Wav2Vec2 Large 53 Hungarian by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice hu", "type": "common_voice", "args": "hu"}, "metrics": [{"type": "wer", "value": 46.75, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-hu
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"hu",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"hu"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hu #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Hungarian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Hungarian test data of Common Voice.
Test Result: 46.75 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training. The code can be found here. The notebook containing the code used for evaluation can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Hungarian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 46.75 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here. The notebook containing the code used for evaluation can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hu #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Hungarian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 46.75 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here. The notebook containing the code used for evaluation can be found here."
] |
[
80,
66,
20,
30,
44
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #hu #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Hungarian\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Hungarian using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 46.75 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here. The notebook containing the code used for evaluation can be found here."
] |
[
-0.16231359541416168,
0.06481017917394638,
-0.0011002484243363142,
-0.017732644453644753,
0.12601040303707123,
-0.0535879023373127,
0.19906696677207947,
0.09090850502252579,
-0.014554324559867382,
-0.004921562969684601,
-0.016406448557972908,
0.02862565964460373,
0.0702095776796341,
0.07966109365224838,
0.034497857093811035,
-0.18044854700565338,
0.02059841714799404,
-0.01115722581744194,
0.0493563674390316,
0.08793597668409348,
0.10332313925027847,
-0.06348104774951935,
0.0036176186986267567,
0.08513432741165161,
-0.12081539630889893,
0.06760244071483612,
-0.005568876396864653,
-0.12234693020582199,
0.13663539290428162,
0.03937111049890518,
0.07376524060964584,
0.05171354487538338,
0.06071079149842262,
-0.1594495177268982,
0.01921183429658413,
0.03277352824807167,
0.013949960470199585,
0.007137460168451071,
0.06561770290136337,
-0.04631553962826729,
0.13539119064807892,
0.08403455466032028,
-0.03348596766591072,
0.09002449363470078,
-0.062424153089523315,
-0.26451170444488525,
-0.014238404110074043,
-0.014880509115755558,
0.10251235216856003,
0.13985390961170197,
-0.05929116904735565,
0.12118037045001984,
-0.09420115500688553,
0.08210904151201248,
0.10465454310178757,
-0.19838878512382507,
0.017216280102729797,
0.08885222673416138,
0.06128503382205963,
0.11945322155952454,
-0.044460177421569824,
0.029671650379896164,
0.05026679486036301,
0.005186345428228378,
0.021540367975831032,
-0.0544176921248436,
-0.2936275601387024,
-0.03535788133740425,
-0.13731828331947327,
-0.049776773899793625,
0.22139258682727814,
-0.028346659615635872,
-0.06932733952999115,
-0.11991535872220993,
0.003012519795447588,
0.0051017822697758675,
0.011540913954377174,
-0.015821054577827454,
-0.024724861606955528,
0.043735306710004807,
0.013975015841424465,
-0.02391468733549118,
-0.09590446949005127,
-0.11967513710260391,
0.049860239028930664,
0.0658661350607872,
0.007506962865591049,
0.009801479987800121,
-0.10850125551223755,
0.07521034777164459,
-0.03941766172647476,
-0.09434323757886887,
-0.009132303297519684,
0.047654539346694946,
-0.06946757435798645,
0.0014667520299553871,
-0.0631401315331459,
-0.11948270350694656,
0.044952042400836945,
-0.027919385582208633,
0.07074274122714996,
0.012285270728170872,
-0.04049943387508392,
0.06129666045308113,
0.02048288658261299,
0.1379317194223404,
-0.04626363515853882,
-0.07239188998937607,
0.012995898723602295,
-0.028482291847467422,
-0.017653029412031174,
-0.05061615630984306,
-0.10071150958538055,
-0.06556560844182968,
0.0016643364215269685,
0.09404313564300537,
-0.036904942244291306,
-0.0044802650809288025,
-0.020803019404411316,
-0.05882340669631958,
0.02899077534675598,
-0.10584240406751633,
-0.030908536165952682,
0.09950003772974014,
0.006320185028016567,
0.05813963711261749,
0.017878711223602295,
0.044212717562913895,
-0.11341241747140884,
-0.06451749056577682,
0.01815633662045002,
0.04338480159640312,
-0.04521694406867027,
-0.1293017566204071,
0.022149285301566124,
-0.03261442855000496,
0.005338659044355154,
-0.09795663505792618,
-0.06556925922632217,
-0.0801214799284935,
-0.025329528376460075,
0.03823117911815643,
0.041374482214450836,
-0.10510141402482986,
-0.03870599716901779,
-0.040811844170093536,
-0.039609335362911224,
0.024786628782749176,
-0.04886173456907272,
0.07121029496192932,
0.002923330757766962,
0.04555976390838623,
-0.008673575706779957,
0.06672180444002151,
-0.10642177611589432,
-0.06616732478141785,
0.002039786195382476,
0.12011009454727173,
-0.06369801610708237,
0.00818158034235239,
-0.06891831755638123,
-0.043127987533807755,
-0.0070140790194272995,
0.0431656613945961,
0.07305732369422913,
0.10592284053564072,
-0.2807902693748474,
-0.11339840292930603,
0.18108420073986053,
-0.16343238949775696,
-0.0608980655670166,
0.1672898679971695,
-0.004756661131978035,
0.1395355463027954,
0.10945580899715424,
0.2807854115962982,
0.20453979074954987,
-0.15014885365962982,
0.005486429203301668,
-0.025861453264951706,
0.000783591705840081,
-0.1052018329501152,
0.09937190264463425,
-0.05512712150812149,
-0.009153642691671848,
0.04227307811379433,
-0.08121520280838013,
0.07440277934074402,
-0.0007111084996722639,
-0.0581747367978096,
-0.0025172107852995396,
-0.06049220263957977,
0.037814460694789886,
0.05868129804730415,
0.01955392211675644,
-0.05100150778889656,
-0.02573637291789055,
0.04456877335906029,
0.13029834628105164,
-0.14870961010456085,
0.029970379546284676,
-0.09126513451337814,
0.12521471083164215,
-0.06709320098161697,
-0.009370866231620312,
-0.13159584999084473,
0.19651181995868683,
-0.004739173222333193,
0.0603053905069828,
0.026155943050980568,
0.13716673851013184,
0.00942105520516634,
0.04892845079302788,
0.011033531278371811,
-0.004170757718384266,
-0.04430563002824783,
-0.004712268710136414,
-0.02301470935344696,
-0.0751083567738533,
-0.03552598878741264,
-0.05112908408045769,
0.12628909945487976,
-0.15370318293571472,
0.010001776739954948,
0.06954941898584366,
0.02806328795850277,
0.0037985024973750114,
-0.040481843054294586,
0.043028078973293304,
0.10856036096811295,
-0.017122499644756317,
-0.01051104161888361,
0.05311977490782738,
0.04411500319838524,
-0.005723122041672468,
0.07920821011066437,
-0.151448592543602,
-0.027369769290089607,
0.1139645203948021,
-0.07044634968042374,
-0.0019326565088704228,
0.029275568202137947,
-0.018350858241319656,
0.018788723275065422,
-0.09807690978050232,
-0.018436972051858902,
0.29096153378486633,
-0.028666101396083832,
0.12620729207992554,
-0.08991137892007828,
0.0059464857913553715,
0.022715939208865166,
-0.05220286548137665,
0.061661574989557266,
0.07498749345541,
0.028280893340706825,
0.08663268387317657,
0.031376685947179794,
-0.014549238607287407,
-0.06148780509829521,
0.2673121988773346,
-0.036348890513181686,
-0.11611581593751907,
0.030951015651226044,
-0.001881562639027834,
0.010806656442582607,
0.0769515410065651,
-0.22248928248882294,
-0.03592190518975258,
0.02952677756547928,
0.04237476363778114,
0.06797605752944946,
-0.12175318598747253,
0.01911907084286213,
0.024413933977484703,
-0.12686216831207275,
-0.18938565254211426,
0.05435160547494888,
-0.042125944048166275,
0.03164125233888626,
-0.09031551331281662,
-0.02638970874249935,
-0.046212680637836456,
-0.05233437940478325,
-0.17666378617286682,
0.12445017695426941,
-0.07998927682638168,
-0.20903782546520233,
-0.21645329892635345,
0.0927223488688469,
0.022834889590740204,
0.03329752758145332,
0.11450515687465668,
-0.12404178828001022,
0.008140957914292812,
-0.023216795176267624,
0.13715320825576782,
0.019471900537610054,
-0.052667275071144104,
0.021941615268588066,
0.05940062925219536,
0.039535775780677795,
-0.14723946154117584,
0.021230008453130722,
-0.017645474523305893,
-0.056190598756074905,
-0.03752471134066582,
-0.02173842303454876,
0.026388810947537422,
0.17548207938671112,
0.018812010064721107,
0.0365060493350029,
-0.0216108039021492,
0.19277530908584595,
-0.1110280379652977,
-0.03499278798699379,
0.20138657093048096,
-0.04068712145090103,
-0.0362921878695488,
0.10296133905649185,
0.025511138141155243,
-0.05581675469875336,
-0.019191298633813858,
-0.03159509599208832,
-0.09568949788808823,
-0.22022446990013123,
-0.137300044298172,
-0.06867596507072449,
-0.03096592053771019,
-0.06130493804812431,
-0.003992271143943071,
0.040871281176805496,
0.034704968333244324,
-0.03221220150589943,
-0.06754541397094727,
0.04304823651909828,
0.014237118884921074,
0.08491789549589157,
-0.002506022807210684,
0.05658864229917526,
-0.08373096585273743,
-0.040342848747968674,
0.033119939267635345,
0.03414429351687431,
0.14856533706188202,
0.034845802932977676,
0.12439936399459839,
0.08790168166160583,
0.09687402099370956,
0.1017930805683136,
0.08476592600345612,
-0.03454681858420372,
-0.009605158120393753,
0.014992393553256989,
-0.0891609862446785,
-0.03299841657280922,
0.00948165450245142,
0.06261584162712097,
-0.027836918830871582,
-0.11579485982656479,
-0.094114750623703,
0.03014112263917923,
0.16338904201984406,
0.10046979784965515,
-0.19560128450393677,
-0.09324603527784348,
0.004281194880604744,
-0.012602532282471657,
0.0015607367968186736,
0.03378577157855034,
0.1934790015220642,
-0.1273304671049118,
-0.0038792830891907215,
0.03480745851993561,
0.09003160893917084,
-0.07017053663730621,
0.021265557035803795,
-0.10615779459476471,
-0.01776902563869953,
0.005716846790164709,
0.11514265090227127,
-0.1692514717578888,
0.19481207430362701,
0.01597077026963234,
0.1284249871969223,
-0.07117880135774612,
0.005341116338968277,
0.022346394136548042,
0.04838886484503746,
0.10072819143533707,
0.020573632791638374,
0.026147985830903053,
-0.1369410753250122,
-0.07796024531126022,
0.050729140639305115,
-0.014841138385236263,
-0.02759646438062191,
0.03576444089412689,
-0.018200691789388657,
0.012888060882687569,
-0.036075107753276825,
-0.07966701686382294,
-0.1135953962802887,
-0.05781998857855797,
0.02066914550960064,
0.18410220742225647,
0.10742677003145218,
-0.024804187938570976,
-0.08726770430803299,
-0.05429139733314514,
0.11891555786132812,
-0.13953612744808197,
-0.06485545635223389,
-0.07289763540029526,
-0.08383266627788544,
0.09322625398635864,
-0.07010149210691452,
-0.02062373235821724,
0.11028272658586502,
0.13183344900608063,
-0.033074699342250824,
-0.06899858266115189,
0.02497842349112034,
-0.11279931664466858,
-0.06930740177631378,
-0.012518090195953846,
0.17501069605350494,
0.09474734216928482,
0.05837952345609665,
0.06250467896461487,
0.0157674178481102,
0.012073934078216553,
-0.07483555376529694,
-0.013933495618402958,
0.13661278784275055,
-0.11050288379192352,
-0.004473311826586723,
-0.01584315486252308,
-0.17100106179714203,
-0.10363323986530304,
-0.06111523509025574,
0.18182812631130219,
0.10705520212650299,
-0.07924468070268631,
0.16024088859558105,
0.21096020936965942,
-0.09953795373439789,
-0.22354960441589355,
0.0023446271661669016,
0.09226539731025696,
0.12796279788017273,
0.022407688200473785,
-0.17231005430221558,
0.04145599156618118,
-0.01057148352265358,
-0.027956359088420868,
-0.026368964463472366,
-0.3387748897075653,
-0.11426562070846558,
0.13992974162101746,
0.018441705033183098,
0.1271805316209793,
-0.009236948564648628,
-0.04735253378748894,
-0.00960950180888176,
-0.07834118604660034,
-0.01019535306841135,
-0.18749845027923584,
0.09651856124401093,
0.04224666208028793,
0.03800268471240997,
0.05869586765766144,
-0.024217618629336357,
0.08909869939088821,
0.08235706388950348,
-0.031398460268974304,
-0.017912013456225395,
0.058879077434539795,
0.07457812875509262,
0.001456919126212597,
0.10380782186985016,
-0.053726475685834885,
0.042169615626335144,
-0.14413243532180786,
-0.06745140254497528,
-0.05714259296655655,
0.0853475034236908,
0.0038816281594336033,
-0.0382632277905941,
0.011103947646915913,
-0.038514748215675354,
-0.0020992388017475605,
0.0009778756648302078,
-0.04510164633393288,
-0.10140658169984818,
0.0627252534031868,
0.14133095741271973,
0.16261959075927734,
-0.036956459283828735,
-0.029203470796346664,
0.017648568376898766,
-0.017429722473025322,
0.11576932668685913,
-0.04156810790300369,
0.04495687037706375,
0.07808380573987961,
0.03565860167145729,
0.08344872295856476,
0.017775120213627815,
-0.10556791722774506,
0.08887328952550888,
0.06525794416666031,
-0.07274854928255081,
-0.05359258875250816,
-0.07257014513015747,
-0.05069471523165703,
0.004862604662775993,
0.06654758006334305,
0.13059961795806885,
-0.07378187030553818,
-0.02542075328528881,
-0.04245581850409508,
-0.016964923590421677,
-0.11058659851551056,
0.1774200201034546,
0.011809329502284527,
0.07451487332582474,
-0.11462293565273285,
0.025953879579901695,
0.025971345603466034,
-0.034116972237825394,
0.05017266422510147,
-0.005425689741969109,
-0.10680123418569565,
-0.06083320081233978,
-0.0026493652258068323,
0.126332089304924,
0.020870978012681007,
-0.14065134525299072,
-0.09525658190250397,
-0.07104906439781189,
-0.04201582446694374,
0.09237724542617798,
0.07404985278844833,
0.02216961234807968,
-0.10243597626686096,
-0.07447561621665955,
-0.11484821140766144,
0.04236799478530884,
0.037392571568489075,
-0.030026651918888092,
-0.08702684938907623,
0.21153797209262848,
0.0750749260187149,
0.0037117628380656242,
-0.04687248915433884,
-0.10558391362428665,
-0.04496445506811142,
0.06644461303949356,
-0.06983722746372223,
-0.038023851811885834,
-0.06622104346752167,
0.0075687021017074585,
-0.012755628675222397,
-0.08283597975969315,
-0.015395772643387318,
0.09209953248500824,
-0.08306628465652466,
0.054015401750802994,
-0.04771089181303978,
0.09289207309484482,
-0.09196757525205612,
0.02889685146510601,
0.04012483358383179,
-0.061541419476270676,
0.0738912895321846,
0.09078674018383026,
-0.08461057394742966,
0.11908159404993057,
-0.17306987941265106,
-0.05476223677396774,
0.041096724569797516,
0.05463968589901924,
0.026191776618361473,
-0.08116643875837326,
0.05301688611507416,
0.07516290992498398,
0.059308938682079315,
0.003440074622631073,
0.12873448431491852,
-0.03874163329601288,
-0.03714532032608986,
-0.036027975380420685,
-0.023933297023177147,
-0.004191644489765167,
0.074488565325737,
0.08423568308353424,
0.10914460569620132,
0.12466277927160263,
-0.09433313459157944,
0.11375763267278671,
-0.10686548054218292,
-0.001529667992144823,
-0.05845457315444946,
-0.009739106521010399,
-0.12162850797176361,
-0.08521777391433716,
0.04765700176358223,
-0.049054618924856186,
0.10688040405511856,
0.024158403277397156,
0.14799506962299347,
-0.02371288277208805,
-0.09423170983791351,
0.039292700588703156,
-0.019324876368045807,
0.2674083709716797,
0.0627162978053093,
0.03804480656981468,
-0.031109515577554703,
0.03164080157876015,
0.029054906219244003,
0.09923791885375977,
-0.0034352729562669992,
0.12126538157463074,
-0.03158421441912651,
0.07634441554546356,
0.08194189518690109,
-0.04534672573208809,
-0.09630248695611954,
-0.08792590349912643,
-0.05108720809221268,
0.030500907450914383,
-0.08523115515708923,
0.15469813346862793,
0.17089352011680603,
-0.10349931567907333,
0.0821629986166954,
0.05446287989616394,
-0.07332419604063034,
-0.13973619043827057,
-0.08216803520917892,
-0.042116679251194,
-0.17215265333652496,
0.012298423796892166,
-0.08679917454719543,
0.01208161748945713,
0.030552782118320465,
0.04048590362071991,
-0.03423215448856354,
0.1673949807882309,
0.016628779470920563,
-0.0871821939945221,
0.04816935956478119,
-0.09226401895284653,
0.04652933403849602,
-0.09757654368877411,
0.04991622269153595,
0.13731276988983154,
0.03536665067076683,
0.06781000643968582,
0.026699764654040337,
-0.057723596692085266,
-0.0032456049229949713,
-0.09422501176595688,
-0.05683528259396553,
-0.014342171140015125,
-0.024843107908964157,
0.05827346444129944,
0.14605046808719635,
0.14331655204296112,
-0.08497108519077301,
0.008158297277987003,
0.13978831470012665,
-0.0508061945438385,
-0.12468282133340836,
-0.15197214484214783,
0.1262279748916626,
0.06481213122606277,
0.02474098466336727,
-0.01620342582464218,
-0.06690648198127747,
-0.04612492397427559,
0.2954155206680298,
0.18850228190422058,
0.10249973088502884,
0.022583596408367157,
-0.02027391828596592,
-0.005995030514895916,
-0.014480255544185638,
0.09661580622196198,
0.055140651762485504,
0.26427674293518066,
-0.01050237100571394,
-0.012805507518351078,
-0.09420635551214218,
-0.07177077978849411,
-0.02776791900396347,
0.08503764867782593,
-0.11341476440429688,
-0.11666601151227951,
0.027030346915125847,
0.16231383383274078,
-0.0463114008307457,
-0.10140354186296463,
-0.09234718978404999,
-0.08566109091043472,
-0.09855673462152481,
-0.0025837994180619717,
0.037148647010326385,
0.09540966153144836,
0.010999233461916447,
-0.03829455003142357,
0.021318508312106133,
0.15937526524066925,
0.0041376687586307526,
-0.04158689081668854,
-0.09560316056013107,
0.01445873361080885,
-0.22323250770568848,
0.07552950084209442,
-0.04211817681789398,
0.11364470422267914,
0.04393312707543373,
0.11495672166347504,
-0.02230362594127655,
0.14752304553985596,
-0.007915575057268143,
-0.05166373401880264,
0.03940366953611374,
0.10702314227819443,
-0.06345529109239578,
0.10704261809587479,
0.011168054305016994,
-0.13608258962631226,
0.07869797199964523,
-0.12495262920856476,
0.03091701865196228,
-0.06596188992261887,
0.03752380982041359,
-0.044519633054733276,
0.08318977802991867,
0.08675172179937363,
-0.07788937538862228,
-0.036453016102313995,
-0.05506451800465584,
0.04822428151965141,
0.031479086726903915,
-0.07257601618766785,
-0.03133183345198631,
-0.2461712658405304,
-0.03760267049074173,
-0.07507020980119705,
-0.010590375401079655,
-0.20402060449123383,
0.014498715288937092,
-0.00023233696992974728,
-0.07958546280860901,
-0.015926053747534752,
0.03526479750871658,
0.08446552604436874,
0.009608892723917961,
-0.008267074823379517,
-0.012388868257403374,
0.03539043664932251,
0.1265275925397873,
-0.17067302763462067,
-0.13654328882694244
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Interlingua
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Interlingua using the [Common Voice](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "ia", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-ia")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-ia")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Interlingua test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "ia", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-ia")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-ia")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\']'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)

# Run batched inference on the test set and decode the predictions.
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch

result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:.2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 25.09 %
## Training
The Common Voice `train` and `validation` datasets were used for training for 4000 steps due to GPU timeout. The results are based on the 4000 steps checkpoint. There is a good chance that full training will lead to better results.
The colab notebook used can be found [here](https://colab.research.google.com/drive/1nbqvVwS8DTNrCzzh3vgrN55qxgoqbita?usp=sharing) and the evaluation can be found [here](https://colab.research.google.com/drive/18pCWBwNNUMUYV1FiqT_0EsTbCfwwe7ms?usp=sharing).
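If the interrupted run is later continued beyond 4000 steps, training can be resumed from the saved checkpoint. The snippet below is only a sketch: the checkpoint path is hypothetical, and `trainer` is assumed to be the `Trainer` instance configured as in the linked notebook.

```python
from transformers import Wav2Vec2ForCTC

# Hypothetical checkpoint directory — the real name depends on the training run.
checkpoint_dir = "./wav2vec2-large-xlsr-ia/checkpoint-4000"

# Load the checkpoint weights directly for evaluation or inference ...
model = Wav2Vec2ForCTC.from_pretrained(checkpoint_dir)

# ... or resume the interrupted fine-tuning run (assuming `trainer` is the
# Trainer instance set up as in the linked notebook):
# trainer.train(resume_from_checkpoint=checkpoint_dir)
```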
|
{"language": "ia", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "XLSR Wav2Vec2 Large 53 Interlingua by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ia", "type": "common_voice", "args": "ia"}, "metrics": [{"type": "wer", "value": 25.09, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-ia
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"ia",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ia"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ia #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Interlingua
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Interlingua test data of Common Voice.
Test Result: 25.09 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training for 4000 steps due to GPU timeout. The results are based on the 4000 steps checkpoint. There is a good chance that full training will lead to better results.
The colab notebook used can be found here and the evaluation can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Interlingua\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 25.09 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training for 4000 steps due to GPU timeout. The results are based on the 4000 steps checkpoint. There is a good chance that full training will lead to better results.\n\nThe colab notebook used can be found here and the evaluation can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ia #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Interlingua\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 25.09 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training for 4000 steps due to GPU timeout. The results are based on the 4000 steps checkpoint. There is a good chance that full training will lead to better results.\n\nThe colab notebook used can be found here and the evaluation can be found here."
] |
[
80,
65,
20,
28,
73
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ia #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Interlingua\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Interlingua using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 25.09 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training for 4000 steps due to GPU timeout. The results are based on the 4000 steps checkpoint. There is a good chance that full training will lead to better results.\n\nThe colab notebook used can be found here and the evaluation can be found here."
] |
[
-0.1465993970632553,
0.038170069456100464,
-0.0009598427568562329,
0.020533487200737,
0.06254571676254272,
0.028397688642144203,
0.1521599441766739,
0.12486373633146286,
-0.024282408878207207,
0.013463743962347507,
0.06801319122314453,
-0.003434802172705531,
0.1169358566403389,
0.1426832377910614,
0.04739341512322426,
-0.1882447898387909,
0.022990062832832336,
-0.002286724979057908,
0.06854264438152313,
0.12549671530723572,
0.06403426080942154,
-0.086252361536026,
0.034410785883665085,
0.11238544434309006,
-0.16777414083480835,
-0.045250993221998215,
0.06518805772066116,
-0.11538976430892944,
0.15083573758602142,
0.0782509446144104,
-0.008494418114423752,
0.02733132801949978,
0.10619243234395981,
-0.10844603180885315,
0.019375044852495193,
0.09722276777029037,
0.016188982874155045,
0.05644947662949562,
0.0696851834654808,
0.08566876500844955,
0.06962160021066666,
0.07584137469530106,
-0.009633623994886875,
0.0780419185757637,
-0.08675697445869446,
-0.22404538094997406,
-0.07004404067993164,
-0.04747492074966431,
0.05483114346861839,
0.10772991180419922,
-0.060172755271196365,
0.1473272442817688,
-0.07727478444576263,
0.061268873512744904,
0.11532604694366455,
-0.1704605519771576,
-0.016463397070765495,
0.09877651184797287,
0.04866969957947731,
0.05009492486715317,
-0.031287241727113724,
0.04430689290165901,
0.03773551806807518,
-0.014999116770923138,
0.06904514878988266,
-0.002562643028795719,
-0.03395259007811546,
0.013216553255915642,
-0.129037544131279,
-0.06647734344005585,
0.10179547965526581,
-0.010150210000574589,
-0.08993463963270187,
-0.22027118504047394,
-0.06539969146251678,
-0.06245654448866844,
0.01988230273127556,
-0.05539340153336525,
0.011132050305604935,
0.0186462365090847,
0.025685841217637062,
-0.01312010083347559,
-0.08503904193639755,
-0.16116051375865936,
-0.01806841418147087,
0.16329826414585114,
0.042954619973897934,
0.06480679661035538,
-0.055020540952682495,
0.14798276126384735,
-0.04753513261675835,
-0.06689377129077911,
-0.05810178816318512,
-0.013639654964208603,
-0.126572385430336,
0.05927039682865143,
-0.05737188830971718,
0.0070859286934137344,
-0.07498589903116226,
0.033932462334632874,
0.05670972168445587,
-0.0013535204343497753,
0.0740325078368187,
0.003173834178596735,
0.02550560049712658,
0.1435682326555252,
-0.10506758093833923,
-0.10099423676729202,
-0.0012090251548215747,
0.006354365032166243,
-0.04441765695810318,
-0.046718038618564606,
-0.02407325617969036,
-0.0972638875246048,
0.0029532192274928093,
-0.0010898106265813112,
-0.02870951034128666,
0.01757025346159935,
-0.039820555597543716,
-0.07247024029493332,
0.08983917534351349,
-0.08114305883646011,
-0.006302099674940109,
0.05902963876724243,
-0.07383213192224503,
0.08541132509708405,
0.08043617010116577,
-0.029424164444208145,
-0.10746317356824875,
-0.08980938047170639,
-0.006761925294995308,
-0.014634979888796806,
-0.09830345958471298,
-0.08985070884227753,
-0.011037972755730152,
-0.06735048443078995,
-0.04262326657772064,
-0.07483883202075958,
-0.14660847187042236,
-0.07063210755586624,
0.01712324097752571,
0.021448124200105667,
-0.0029908183496445417,
-0.04826999083161354,
0.05761076509952545,
0.0151829794049263,
-0.0590532049536705,
0.11657054722309113,
-0.032643064856529236,
0.07853671908378601,
0.03080780990421772,
0.08636429160833359,
0.08830209076404572,
0.07982737571001053,
-0.020562028512358665,
-0.06626533716917038,
0.049142640084028244,
0.08340472728013992,
0.02190195769071579,
-0.11547381430864334,
-0.04297858104109764,
-0.07088015228509903,
-0.03148368000984192,
0.07040883600711823,
0.0476800911128521,
0.11022482812404633,
-0.20983745157718658,
-0.06922803074121475,
0.20461323857307434,
-0.07758861780166626,
-0.001546074403449893,
0.16466782987117767,
-0.024432746693491936,
0.09347938001155853,
0.09694138914346695,
0.11131896078586578,
0.10571759939193726,
-0.22526763379573822,
0.021039703860878944,
-0.018436508253216743,
-0.0760839655995369,
-0.0804564356803894,
0.07992953807115555,
0.018331125378608704,
0.02598864585161209,
0.07796098291873932,
-0.03708934038877487,
0.05327572673559189,
-0.02780805714428425,
-0.043948158621788025,
-0.03420978784561157,
-0.039773955941200256,
0.0074552432633936405,
0.040833402425050735,
-0.007101899478584528,
0.0047473241575062275,
-0.10461397469043732,
-0.023651011288166046,
0.15150795876979828,
-0.07488589733839035,
0.038717787712812424,
-0.10560336709022522,
0.05938578024506569,
-0.04204597696661949,
0.01980513148009777,
-0.15152698755264282,
0.14829707145690918,
0.027247024700045586,
0.020446322858333588,
0.03488968312740326,
0.09691517800092697,
0.020531807094812393,
0.0022449407260864973,
-0.013201412744820118,
0.00960581749677658,
0.018810590729117393,
-0.07463447004556656,
-0.0441102534532547,
-0.05419038608670235,
0.006662401370704174,
-0.032836612313985825,
0.06474102288484573,
-0.14508658647537231,
0.011385583318769932,
0.06548517197370529,
-0.018908148631453514,
0.01929483748972416,
-0.06956452131271362,
0.0405709408223629,
0.01890362985432148,
-0.038023728877305984,
-0.013587094843387604,
0.003978028427809477,
-0.001664142357185483,
-0.016850421205163002,
0.1313387006521225,
-0.21160799264907837,
-0.024606024846434593,
0.1242728903889656,
0.07267923653125763,
-0.045078814029693604,
0.012867117300629616,
-0.03131144121289253,
-0.027712885290384293,
-0.07554610073566437,
0.01314141508191824,
0.28763288259506226,
-0.0008104548323899508,
0.11233879625797272,
-0.07616590708494186,
-0.005842340644448996,
0.051264334470033646,
0.018198709934949875,
0.024744179099798203,
0.016948126256465912,
0.00909329205751419,
0.10755358636379242,
0.007216245401650667,
-0.09864676743745804,
0.0259220190346241,
0.20677883923053741,
0.027510259300470352,
-0.12578101456165314,
-0.03911107778549194,
-0.06303796172142029,
-0.0052979676984250546,
0.15061382949352264,
-0.09750672429800034,
-0.05112401768565178,
0.06914526224136353,
0.0684070959687233,
0.018229715526103973,
-0.2209014892578125,
0.05529036000370979,
0.04327069967985153,
-0.09359953552484512,
-0.11691253632307053,
0.0379701666533947,
-0.05566594749689102,
0.06027575954794884,
-0.046488530933856964,
0.0056684184819459915,
-0.016901252791285515,
-0.05943680554628372,
-0.14523477852344513,
0.14879898726940155,
-0.06222939491271973,
-0.2584003508090973,
-0.15743054449558258,
-0.002042805776000023,
0.0018171181436628103,
0.037808820605278015,
0.03934934362769127,
-0.051180899143218994,
-0.02218501828610897,
-0.019712653011083603,
0.08198028802871704,
0.026491915807127953,
0.009584590792655945,
-0.04991323500871658,
0.0150906415656209,
0.007353445515036583,
-0.15106286108493805,
0.03743215277791023,
-0.06511779874563217,
-0.020647410303354263,
0.053127437829971313,
0.02300810068845749,
0.022624028846621513,
0.1400529146194458,
0.038958609104156494,
0.0146713238209486,
-0.038736194372177124,
0.1353093385696411,
-0.16982980072498322,
0.000575847050640732,
0.20938751101493835,
0.024119369685649872,
-0.016737526282668114,
0.014547339640557766,
0.0010009388206526637,
-0.10242477059364319,
0.008432096801698208,
-0.03422816842794418,
-0.09215866029262543,
-0.18761324882507324,
-0.057833995670080185,
-0.07845472544431686,
-0.013713791966438293,
0.022285951301455498,
-0.013968224637210369,
0.02471034601330757,
0.03741857409477234,
-0.013873033225536346,
-0.0048470040783286095,
0.05008842051029205,
0.014776776544749737,
0.03517533838748932,
-0.03791153430938721,
0.09052367508411407,
-0.0339275486767292,
0.0264913197606802,
0.11065217852592468,
0.06742249429225922,
0.1980573982000351,
-0.03243039548397064,
0.1054995059967041,
0.05043644830584526,
0.12171085178852081,
0.09675930440425873,
0.06317289918661118,
-0.014502775855362415,
-0.01729626953601837,
-0.02766544185578823,
-0.035355355590581894,
-0.067035011947155,
0.0550207681953907,
0.0972132533788681,
-0.021228034049272537,
-0.035644397139549255,
-0.03218585625290871,
-0.006898310501128435,
0.2381781041622162,
0.07220491021871567,
-0.2630225121974945,
-0.07908693701028824,
-0.04759794846177101,
-0.07196003198623657,
-0.07536907494068146,
0.05328979715704918,
0.09211397171020508,
-0.11824478954076767,
-0.004330760799348354,
-0.03622680529952049,
0.0675143301486969,
-0.06960020214319229,
-0.016868535429239273,
-0.0076634762808680534,
0.042335644364356995,
-0.03577123209834099,
0.046214669942855835,
-0.20175933837890625,
0.15981683135032654,
0.004396490287035704,
0.11314716190099716,
-0.09690944850444794,
0.02467242442071438,
0.04021639749407768,
-0.009458442218601704,
0.12390347570180893,
-0.022431185469031334,
-0.045425791293382645,
-0.11046453565359116,
-0.0771201103925705,
0.003699029330164194,
0.014436187222599983,
0.04815097153186798,
0.04170271381735802,
-0.0005734523292630911,
0.040405500680208206,
0.04068250581622124,
-0.08008085191249847,
-0.032707005739212036,
-0.06464871764183044,
0.06310641020536423,
0.08191294223070145,
0.0682678297162056,
-0.011606250889599323,
-0.09605731815099716,
-0.1445334553718567,
0.07767128199338913,
-0.14064030349254608,
-0.02763136476278305,
-0.12042451649904251,
-0.03620971366763115,
0.0929311215877533,
-0.027197478339076042,
0.040371157228946686,
0.09319553524255753,
0.15077243745326996,
-0.025746164843440056,
-0.06565065681934357,
0.051212627440690994,
-0.08397676050662994,
-0.12830901145935059,
-0.039614349603652954,
0.08906262367963791,
0.08892476558685303,
0.06000320240855217,
-0.009546203538775444,
0.002283375710248947,
-0.09339026361703873,
-0.03883780539035797,
0.06328348815441132,
0.13896670937538147,
-0.06137824058532715,
-0.029388727620244026,
0.07134488224983215,
-0.10095850378274918,
-0.04006711766123772,
-0.06201382726430893,
0.017930500209331512,
0.16645006835460663,
-0.07024779915809631,
0.18890613317489624,
0.15756331384181976,
-0.04794054850935936,
-0.2229507863521576,
0.04908250644803047,
0.07340184599161148,
0.1606592983007431,
-0.03438800573348999,
-0.1881982982158661,
-0.002169145504012704,
0.012857228517532349,
-0.04444215074181557,
0.057417985051870346,
-0.20992819964885712,
-0.1441822052001953,
0.1062396690249443,
0.027287917211651802,
0.1444936841726303,
-0.002785230055451393,
0.022899439558386803,
0.04051043093204498,
0.015626201406121254,
0.02625732310116291,
-0.007429797202348709,
0.13359200954437256,
-0.010374507866799831,
0.09657255560159683,
0.02671641856431961,
-0.04182352125644684,
0.07999877631664276,
0.017196152359247208,
0.052471861243247986,
0.0006909422809258103,
0.07092325389385223,
0.058174505829811096,
-0.0228774081915617,
0.09438639879226685,
0.05140582099556923,
0.04169513285160065,
-0.10430698841810226,
-0.1224510669708252,
-0.049999579787254333,
0.0083541888743639,
0.011223305948078632,
-0.06830839067697525,
-0.01573360711336136,
0.027783330529928207,
0.0025699043180793524,
-0.056697357445955276,
-0.1051216870546341,
-0.04436448961496353,
-0.11709088087081909,
0.10223336517810822,
0.10667087137699127,
-0.05350962281227112,
-0.0911809653043747,
-0.017638584598898888,
-0.0030245971865952015,
0.11876900494098663,
-0.09670881927013397,
-0.008229830302298069,
0.0588325671851635,
0.061630234122276306,
0.08877187967300415,
0.012942436151206493,
-0.09992482513189316,
0.049058761447668076,
0.021110758185386658,
-0.08213988691568375,
-0.09384183585643768,
-0.01813502423465252,
-0.13473713397979736,
-0.030801920220255852,
0.005979598965495825,
0.08984705805778503,
-0.05539151281118393,
-0.009283131919801235,
0.010724467225372791,
0.038850463926792145,
-0.03261513635516167,
0.2034713476896286,
-0.014045588672161102,
0.034283991903066635,
-0.11976467072963715,
0.07865136861801147,
-0.00006250190926948562,
-0.10679469257593155,
0.03186110034584999,
-0.0709109678864479,
-0.13775068521499634,
-0.009956897236406803,
-0.07708462327718735,
-0.05447787046432495,
0.02131156250834465,
-0.1016484722495079,
-0.09208136051893234,
-0.09979557245969772,
0.025506800040602684,
-0.04827947914600372,
0.06538046151399612,
0.03752727806568146,
-0.08888401836156845,
-0.010571948252618313,
-0.1132107600569725,
0.07178856432437897,
0.04804534465074539,
-0.03528890386223793,
-0.06417866051197052,
0.18924878537654877,
0.06786448508501053,
0.015451518818736076,
-0.061810240149497986,
-0.08634784817695618,
-0.019349081441760063,
0.07991356402635574,
-0.024907803162932396,
-0.015406531281769276,
-0.035677749663591385,
0.009566823951900005,
-0.009023739024996758,
-0.02340780943632126,
-0.0081245182082057,
0.06954535096883774,
-0.05302686616778374,
0.049069732427597046,
-0.009900268167257309,
0.04533106088638306,
-0.030874213203787804,
-0.001031456864438951,
0.02895846776664257,
-0.07695382833480835,
0.08692209422588348,
0.10716499388217926,
-0.061243895441293716,
0.02878979966044426,
-0.07840050011873245,
-0.0020105773583054543,
0.041798751801252365,
0.08220753818750381,
-0.012346982024610043,
-0.15233343839645386,
0.08525840193033218,
0.028299542143940926,
-0.021365685388445854,
-0.020325616002082825,
-0.005003644619137049,
-0.05897970497608185,
0.014908283948898315,
-0.05364586412906647,
0.04387510195374489,
-0.048122309148311615,
0.06506207585334778,
0.01013612188398838,
0.12316283583641052,
0.05770493671298027,
-0.03033355064690113,
0.03122260607779026,
-0.16279108822345734,
-0.005397975444793701,
-0.014642000198364258,
-0.048353809863328934,
-0.07963429391384125,
-0.0713476613163948,
0.10206545889377594,
0.023063896223902702,
0.19786033034324646,
0.03988251835107803,
-0.0064460234716534615,
-0.0211515873670578,
-0.09790872037410736,
0.008957474492490292,
-0.02550053410232067,
0.1765841692686081,
0.03605324774980545,
0.04714960232377052,
0.05418357998132706,
0.07578970491886139,
0.0399334579706192,
0.09887835383415222,
0.05053063854575157,
0.12554806470870972,
0.07151390612125397,
0.08955437690019608,
-0.02213822677731514,
-0.1091386005282402,
-0.1266760528087616,
0.012197853066027164,
-0.22025106847286224,
0.04506658762693405,
-0.052715566009283066,
0.113156758248806,
0.08404005318880081,
-0.10875939577817917,
0.09955088794231415,
0.015622996725142002,
-0.04270019009709358,
-0.11764192581176758,
-0.10162188857793808,
-0.02234380505979061,
-0.13131289184093475,
0.03707406297326088,
-0.07294982671737671,
0.05511435493826866,
0.043208520859479904,
0.06289441883563995,
0.0012518537696450949,
0.12759354710578918,
-0.05905524268746376,
-0.06680011749267578,
0.07671485841274261,
-0.02824215404689312,
-0.04525267705321312,
-0.06769397854804993,
-0.019635634496808052,
0.1029224693775177,
-0.07148394733667374,
0.11763555556535721,
-0.051865775138139725,
-0.02756989747285843,
0.04625400900840759,
-0.0006893552490510046,
-0.012060866691172123,
-0.04282859340310097,
-0.0325867123901844,
0.0562467947602272,
0.11655430495738983,
0.08255840092897415,
-0.022608695551753044,
-0.006285990588366985,
0.09007302671670914,
-0.03496208041906357,
-0.0398889034986496,
-0.16864578425884247,
0.12701177597045898,
0.019632184877991676,
-0.022109152749180794,
0.013230212964117527,
-0.054307326674461365,
-0.014511761255562305,
0.2826814651489258,
0.15109753608703613,
0.020301923155784607,
-0.039602577686309814,
0.018641402944922447,
-0.01641654223203659,
-0.02938619628548622,
0.1404549926519394,
0.014155278913676739,
0.26857107877731323,
0.015107404440641403,
-0.02159724198281765,
-0.008703848347067833,
-0.06659991294145584,
0.044124457985162735,
0.12314880639314651,
-0.05303868278861046,
-0.027229655534029007,
-0.0478065051138401,
0.02228146232664585,
0.030463358387351036,
-0.17942820489406586,
0.011394674889743328,
-0.052794743329286575,
-0.07177413254976273,
0.004002378322184086,
-0.0777895376086235,
0.029274461790919304,
0.050340116024017334,
-0.03539491444826126,
0.04877966269850731,
0.19837084412574768,
-0.015581646002829075,
-0.08912340551614761,
-0.033498745411634445,
0.05153905972838402,
-0.05522136762738228,
0.1369716078042984,
-0.04871576279401779,
0.1226927638053894,
0.02709958143532276,
0.04682685434818268,
-0.09298312664031982,
0.08183735609054565,
-0.04746721684932709,
-0.1280517429113388,
-0.007664740085601807,
0.12912236154079437,
-0.015104712918400764,
0.039573658257722855,
0.0028231581673026085,
-0.0066202194429934025,
0.04578035697340965,
-0.19311076402664185,
-0.0033312803134322166,
-0.09174492210149765,
0.0393512062728405,
0.01638970896601677,
0.12483282387256622,
0.10007577389478683,
-0.03549310192465782,
-0.03477328270673752,
-0.055628396570682526,
0.04260578751564026,
0.013203074224293232,
0.06829507648944855,
0.004689544904977083,
-0.1797996461391449,
0.014169218949973583,
-0.04793664067983627,
-0.05051983892917633,
-0.21154560148715973,
-0.09716902673244476,
0.021566253155469894,
-0.14890801906585693,
0.029943199828267097,
0.0400543250143528,
0.048300597816705704,
0.04651982709765434,
-0.012981594540178776,
-0.01638130657374859,
0.014450054615736008,
0.10156259685754776,
-0.13587243854999542,
-0.12690435349941254
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Italian
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Italian using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "it", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained('gchhablani/wav2vec2-large-xlsr-it')
model = Wav2Vec2ForCTC.from_pretrained('gchhablani/wav2vec2-large-xlsr-it')
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Italian test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
import unicodedata
import jiwer
def chunked_wer(targets, predictions, chunk_size=None):
    if chunk_size is None: return jiwer.wer(targets, predictions)
    start = 0
    end = chunk_size
    H, S, D, I = 0, 0, 0, 0
    while start < len(targets):
        chunk_metrics = jiwer.compute_measures(targets[start:end], predictions[start:end])
        H = H + chunk_metrics["hits"]
        S = S + chunk_metrics["substitutions"]
        D = D + chunk_metrics["deletions"]
        I = I + chunk_metrics["insertions"]
        start += chunk_size
        end += chunk_size
    return float(S + D + I) / float(H + S + D)
allowed_characters = [
" ",
"'",
'a',
'b',
'c',
'd',
'e',
'f',
'g',
'h',
'i',
'j',
'k',
'l',
'm',
'n',
'o',
'p',
'q',
'r',
's',
't',
'u',
'v',
'w',
'x',
'y',
'z',
'à',
'á',
'è',
'é',
'ì',
'í',
'ò',
'ó',
'ù',
'ú',
]
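# Map a single character to an allowed equivalent: keep characters in the allowed set,
# special-case a few symbols, and strip accents from the rest via NFKD normalization.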
def remove_accents(input_str):
if input_str in allowed_characters:
return input_str
if input_str == 'ø':
return 'o'
elif input_str=='ß' or input_str =='ß':
return 'b'
elif input_str=='ё':
return 'e'
elif input_str=='đ':
return 'd'
nfkd_form = unicodedata.normalize('NFKD', input_str)
only_ascii = nfkd_form.encode('ASCII', 'ignore').decode()
if only_ascii is None or only_ascii=='':
return input_str
else:
return only_ascii
def fix_accents(sentence):
new_sentence=''
for char in sentence:
new_sentence+=remove_accents(char)
return new_sentence
test_dataset = load_dataset("common_voice", "it", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained('gchhablani/wav2vec2-large-xlsr-it')
model = Wav2Vec2ForCTC.from_pretrained('gchhablani/wav2vec2-large-xlsr-it')
model.to("cuda")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
chars_to_remove= [",", "?", ".", "!", "-", ";", ":", '""', "%", '"', "�",'ʿ','“','”','(','=','`','_','+','«','<','>','~','…','«','»','–','\[','\]','°','̇','´','ʾ','„','̇','̇','̇','¡'] # All extra characters
chars_to_remove_regex = f'[{"".join(chars_to_remove)}]'
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_remove_regex, '', batch["sentence"]).lower().replace('‘',"'").replace('ʻ',"'").replace('ʼ',"'").replace('’',"'").replace('ʹ',"''").replace('̇','')
batch["sentence"] = fix_accents(batch["sentence"])
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * chunked_wer(predictions=result["pred_strings"], targets=result["sentence"],chunk_size=5000)))
```
**Test Result**: 11.49 %
## Training
The Common Voice `train` and `validation` datasets were used for training. The code can be found [here](https://github.com/gchhablani/wav2vec2-week/blob/main/fine-tune-xlsr-wav2vec2-on-italian-asr-with-transformers_final.ipynb).
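Below is a minimal sketch (not the training notebook itself) of how these two Common Voice splits could be loaded and combined with the `datasets` library before fine-tuning; only the split names come from this card, and the actual preprocessing and training arguments live in the linked notebook.
```python
from datasets import load_dataset, concatenate_datasets

# Load the Italian `train` and `validation` splits of Common Voice.
train_split = load_dataset("common_voice", "it", split="train")
valid_split = load_dataset("common_voice", "it", split="validation")

# Combine both splits into a single training set, as described above.
train_dataset = concatenate_datasets([train_split, valid_split])
print(len(train_dataset))
```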
|
{"language": "it", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Wav2Vec2 Large 53 Italian by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice it", "type": "common_voice", "args": "it"}, "metrics": [{"type": "wer", "value": 11.49, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-it
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"it",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"it"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #it #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Italian
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Italian using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Italian test data of Common Voice.
Test Result: 11.49 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training. The code can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Italian\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Italian using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 11.49 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #it #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Italian\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Italian using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 11.49 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
80,
64,
20,
29,
30
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #it #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Italian\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Italian using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 11.49 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
-0.1908435970544815,
0.10301430523395538,
-0.0014249648666009307,
0.010374925099313259,
0.11414660513401031,
-0.049995213747024536,
0.16511385142803192,
0.1118219867348671,
0.00015680478827562183,
-0.00042677888995967805,
-0.005378189962357283,
0.03690040856599808,
0.0665537640452385,
0.11929279565811157,
0.007908906787633896,
-0.17856022715568542,
0.0017505863215774298,
0.008647228591144085,
0.043857380747795105,
0.11394984275102615,
0.09846655279397964,
-0.07776272296905518,
0.0044374833814799786,
0.08031097799539566,
-0.15497498214244843,
0.07718037068843842,
0.02383151836693287,
-0.13030409812927246,
0.13382548093795776,
0.06583137065172195,
0.09539389610290527,
0.045374058187007904,
0.08265265077352524,
-0.14016006886959076,
0.0248203594237566,
0.05203220620751381,
0.011054590344429016,
0.01642111875116825,
0.08918769657611847,
-0.027531668543815613,
0.06465484946966171,
0.12359057366847992,
-0.05460919067263603,
0.08647200465202332,
-0.06302157044410706,
-0.21301062405109406,
-0.00039124605245888233,
-0.03509313613176346,
0.07544932514429092,
0.1628762185573578,
-0.04701690003275871,
0.12247931212186813,
-0.13816016912460327,
0.08017787337303162,
0.105999656021595,
-0.1696503907442093,
0.0026129214093089104,
0.0409541018307209,
0.035518769174814224,
0.09677532315254211,
-0.023480206727981567,
0.018245842307806015,
0.033565692603588104,
-0.0006356615340337157,
-0.031817562878131866,
-0.03626062721014023,
-0.2009977251291275,
-0.023938648402690887,
-0.13179074227809906,
-0.05232158303260803,
0.1878279596567154,
-0.025799229741096497,
-0.0736430287361145,
-0.11872435361146927,
-0.01151776872575283,
-0.0038847250398248434,
-0.0004545349220279604,
-0.04360869154334068,
-0.017486605793237686,
0.04905747249722481,
0.012002600356936455,
-0.01660553365945816,
-0.09306848049163818,
-0.1477890908718109,
0.02295701391994953,
0.11072982102632523,
0.016865340992808342,
0.011707006022334099,
-0.11468682438135147,
0.06423058360815048,
-0.0698886513710022,
-0.06160347908735275,
0.0007632059277966619,
0.029896289110183716,
-0.030515601858496666,
0.0314340703189373,
-0.08355750143527985,
-0.15620550513267517,
0.059515535831451416,
-0.04761617258191109,
0.06088673695921898,
-0.00870694313198328,
-0.04470657557249069,
0.07579237967729568,
0.011004194617271423,
0.09540194272994995,
-0.0779527947306633,
-0.04364568740129471,
0.026935160160064697,
-0.015598388388752937,
-0.037046290934085846,
-0.030438020825386047,
-0.11401837319135666,
-0.07386963069438934,
-0.042269133031368256,
0.08596169203519821,
-0.006448694504797459,
-0.0234017763286829,
-0.04885973781347275,
-0.07402078062295914,
0.03856673836708069,
-0.11381632834672928,
-0.013329449109733105,
0.09071165323257446,
-0.012776524759829044,
0.09852729737758636,
0.024274656549096107,
0.030185753479599953,
-0.12345481663942337,
-0.07131354510784149,
0.005974758416414261,
0.038903724402189255,
-0.05340682342648506,
-0.1145467609167099,
0.008555175736546516,
-0.051177896559238434,
-0.012065090239048004,
-0.10384635627269745,
-0.08504512161016464,
-0.08160001784563065,
-0.027832666411995888,
0.019838780164718628,
0.02460126392543316,
-0.11091123521327972,
-0.009321304038167,
-0.04787609353661537,
-0.042263854295015335,
0.020089851692318916,
-0.04484661668539047,
0.0756361111998558,
0.03447292745113373,
0.0638418048620224,
-0.015842711552977562,
0.0817321389913559,
-0.09804020822048187,
-0.08710335195064545,
0.013893059454858303,
0.13712383806705475,
-0.04863676428794861,
-0.06982669234275818,
-0.0853085070848465,
-0.06291981041431427,
-0.055975768715143204,
0.0712229460477829,
0.09654118865728378,
0.14202533662319183,
-0.26858851313591003,
-0.1157224103808403,
0.2212827503681183,
-0.1404951512813568,
-0.022961389273405075,
0.1802111119031906,
0.007471880875527859,
0.12094152718782425,
0.10810237377882004,
0.24080154299736023,
0.15927663445472717,
-0.16658812761306763,
0.05449632182717323,
0.0013358020223677158,
-0.015560263767838478,
-0.13332606852054596,
0.08171627670526505,
-0.06231401860713959,
0.02182704024016857,
0.04640165716409683,
-0.08675944805145264,
0.08177641779184341,
-0.026418358087539673,
-0.06887352466583252,
-0.015293555334210396,
-0.08728853613138199,
0.03795558959245682,
0.06731253862380981,
0.02400059998035431,
-0.021888772025704384,
-0.05418013781309128,
0.0008508806931786239,
0.11339298635721207,
-0.1494293063879013,
0.038140177726745605,
-0.10462003946304321,
0.11932924389839172,
-0.06915044039487839,
-0.02923738770186901,
-0.14864997565746307,
0.179956316947937,
-0.007419298868626356,
0.08141215890645981,
0.03156290948390961,
0.14177252352237701,
0.01916687749326229,
0.030652757734060287,
-0.006897446233779192,
-0.023047715425491333,
-0.018976552411913872,
-0.018092358484864235,
-0.025949005037546158,
-0.07451630383729935,
-0.03826447203755379,
-0.04949051886796951,
0.08265487849712372,
-0.1467445343732834,
-0.004589336458593607,
0.04122837260365486,
0.023300964385271072,
-0.00026168543263338506,
-0.028333451598882675,
0.03958955779671669,
0.10788946598768234,
-0.009587527252733707,
0.003090540412813425,
0.06183074414730072,
0.02298109605908394,
-0.008313392288982868,
0.09612301737070084,
-0.10108595341444016,
0.03773239254951477,
0.11402705311775208,
-0.08237019181251526,
0.0017378984484821558,
0.027087200433015823,
-0.016324670985341072,
0.01313888281583786,
-0.09745602309703827,
-0.0058114053681492805,
0.28118228912353516,
-0.0105172423645854,
0.12094448506832123,
-0.10698173940181732,
0.04397536814212799,
0.039825793355703354,
-0.046676795929670334,
0.041465360671281815,
0.06253823637962341,
0.05834322050213814,
0.08008444309234619,
0.039651114493608475,
-0.02540714479982853,
-0.09246911853551865,
0.24806290864944458,
-0.02920222096145153,
-0.11586210876703262,
0.04282953590154648,
-0.0018545106286183,
-0.00044233721564523876,
0.037585049867630005,
-0.19815027713775635,
-0.035065263509750366,
0.03176147863268852,
0.03599794954061508,
0.07628830522298813,
-0.13880924880504608,
-0.0023134273942559958,
0.00900487881153822,
-0.12769415974617004,
-0.17757299542427063,
0.03680875524878502,
-0.054627686738967896,
0.04974505305290222,
-0.09305360913276672,
-0.09254076331853867,
-0.006438399199396372,
-0.04522153362631798,
-0.15515023469924927,
0.12424486130475998,
-0.08203684538602829,
-0.24094127118587494,
-0.1758793443441391,
0.08965583145618439,
0.008243615739047527,
0.04243750870227814,
0.10032682120800018,
-0.12826530635356903,
0.010822189971804619,
-0.015729594975709915,
0.06735631078481674,
0.017514046281576157,
-0.044179175049066544,
-0.03783481940627098,
0.04980229213833809,
0.028730781748890877,
-0.15511010587215424,
0.020023275166749954,
-0.03657550364732742,
-0.06892142444849014,
-0.025234807282686234,
-0.051633477210998535,
0.045590031892061234,
0.20619571208953857,
0.027689605951309204,
0.04885083809494972,
-0.0069047678261995316,
0.16281086206436157,
-0.10257471352815628,
-0.03611774370074272,
0.23301587998867035,
0.009643997997045517,
-0.026337580755352974,
0.08038061857223511,
0.038285404443740845,
-0.07076374441385269,
-0.028430091217160225,
-0.02589629590511322,
-0.0813295766711235,
-0.2273482382297516,
-0.13320668041706085,
-0.06799294054508209,
-0.0489695705473423,
-0.04317527264356613,
-0.0008766725077293813,
0.04705138877034187,
0.02465731091797352,
0.0008391994633711874,
-0.10201579332351685,
0.03998372331261635,
-0.009978705085814,
0.06462947279214859,
-0.008191827684640884,
0.08637421578168869,
-0.05612659826874733,
-0.053225044161081314,
0.0496489480137825,
0.04879627376794815,
0.1300245076417923,
0.025581656023859978,
0.1314275860786438,
0.07138066738843918,
0.1186818853020668,
0.11289902776479721,
0.09400304406881332,
-0.018634270876646042,
-0.017105676233768463,
0.010171783156692982,
-0.0726238265633583,
-0.05758852884173393,
0.0157257579267025,
0.12187258154153824,
-0.051610738039016724,
-0.0814233049750328,
-0.06927303969860077,
0.009161426685750484,
0.17156866192817688,
0.08604519814252853,
-0.22557835280895233,
-0.06688971072435379,
-0.04269935190677643,
-0.02057749032974243,
-0.005060944240540266,
0.06061083823442459,
0.1802225261926651,
-0.1154528260231018,
-0.026322726160287857,
0.018845975399017334,
0.09581824392080307,
-0.0472237803041935,
0.019808819517493248,
-0.0887964740395546,
0.02131068892776966,
0.010161267593502998,
0.10582493245601654,
-0.2267649918794632,
0.20169174671173096,
0.010901925154030323,
0.12633134424686432,
-0.07427407801151276,
0.0036961394362151623,
0.011131757870316505,
0.07280091196298599,
0.1337985396385193,
0.008448892273008823,
0.047463271766901016,
-0.11044242978096008,
-0.07644132524728775,
0.04239790886640549,
0.007689462509006262,
0.018275544047355652,
0.016177726909518242,
0.006818078923970461,
-0.01297216396778822,
-0.00849528145045042,
-0.03557267412543297,
-0.1245877668261528,
-0.058718241751194,
0.01683882065117359,
0.19342465698719025,
0.09937304258346558,
-0.01322963647544384,
-0.10163071006536484,
-0.04203294217586517,
0.11338967084884644,
-0.10395336151123047,
-0.04427824541926384,
-0.08761722594499588,
-0.04613000899553299,
0.08316612243652344,
-0.05870618671178818,
0.009868292137980461,
0.09547311812639236,
0.1125609427690506,
-0.026162775233387947,
-0.04189193993806839,
0.07803776115179062,
-0.12093108147382736,
-0.06170427054166794,
-0.03908119350671768,
0.15204468369483948,
0.08913003653287888,
0.04995914176106453,
0.06878842413425446,
0.004364894703030586,
-0.008712117560207844,
-0.04575677588582039,
0.012667054310441017,
0.1129232719540596,
-0.10838665068149567,
0.0288514643907547,
-0.013223867863416672,
-0.15741890668869019,
-0.08446075767278671,
-0.03874516114592552,
0.18807661533355713,
0.053839948028326035,
-0.0671023353934288,
0.1453159898519516,
0.2037990391254425,
-0.09306524693965912,
-0.24830952286720276,
0.014337895438075066,
0.10871653258800507,
0.1168552041053772,
-0.020187631249427795,
-0.1979258507490158,
0.053327638655900955,
0.015438237227499485,
-0.02585575543344021,
-0.011287608183920383,
-0.32313060760498047,
-0.10739217698574066,
0.11501283198595047,
0.01906728744506836,
0.16989976167678833,
-0.00064013188239187,
-0.034196916967630386,
-0.014999853447079659,
-0.047127965837717056,
0.007057126611471176,
-0.1390201896429062,
0.09670882672071457,
0.03833115100860596,
0.03010890632867813,
0.049592405557632446,
-0.032328106462955475,
0.0543426014482975,
0.0864545926451683,
-0.02993946522474289,
0.012486722320318222,
0.07030848413705826,
0.04961429908871651,
0.041076675057411194,
0.0973382517695427,
-0.039169639348983765,
0.029785988852381706,
-0.1337297260761261,
-0.09018051624298096,
-0.08283822238445282,
0.09087652713060379,
0.023442281410098076,
-0.03440149873495102,
0.05430246889591217,
-0.0355888232588768,
-0.00311425793915987,
0.022280657663941383,
-0.048826802521944046,
-0.1214609369635582,
0.05300789326429367,
0.20498645305633545,
0.16814711689949036,
-0.03182118013501167,
-0.0846591666340828,
-0.006814653053879738,
-0.018123921006917953,
0.11331286281347275,
-0.047571465373039246,
0.04246990755200386,
0.0639297291636467,
0.0482337549328804,
0.0686783418059349,
0.036721594631671906,
-0.0997164249420166,
0.07202491909265518,
0.06739485263824463,
-0.04754005745053291,
-0.0692540779709816,
-0.0662955790758133,
-0.06444280594587326,
0.018669530749320984,
0.0707274004817009,
0.12226821482181549,
-0.06371691823005676,
-0.03836677968502045,
-0.04758263751864433,
0.0014224339975044131,
-0.12730459868907928,
0.21087642014026642,
0.05328129976987839,
0.048521798104047775,
-0.11543693393468857,
0.046299468725919724,
-0.000492774008307606,
-0.06531485915184021,
0.034242063760757446,
0.004939007572829723,
-0.08900240808725357,
-0.08376996964216232,
0.016790611669421196,
0.11001061648130417,
0.000543070025742054,
-0.1509028673171997,
-0.11562596261501312,
-0.05928802117705345,
-0.0021846911404281855,
0.069347083568573,
0.061910342425107956,
0.003548074048012495,
-0.10798683017492294,
-0.08477286994457245,
-0.1067008450627327,
0.04399479180574417,
0.04436882585287094,
-0.02934771403670311,
-0.06775552779436111,
0.20097920298576355,
0.06508486717939377,
0.000014241752069210634,
-0.047629453241825104,
-0.08768938481807709,
-0.04453384131193161,
0.08820100128650665,
-0.04543595016002655,
-0.022225109860301018,
-0.03434781730175018,
0.0030638817697763443,
-0.028560025617480278,
-0.06535939127206802,
-0.009796572849154472,
0.09218797832727432,
-0.10595055669546127,
0.05979107692837715,
-0.02103940024971962,
0.06046229973435402,
-0.08666044473648071,
0.045438092201948166,
0.002798603381961584,
-0.03947041556239128,
0.07337573170661926,
0.1062275767326355,
-0.10554758459329605,
0.10426043719053268,
-0.2065133899450302,
-0.03772912174463272,
0.03964196518063545,
0.06116081401705742,
0.0056230733171105385,
-0.06405694037675858,
0.07727910578250885,
0.10876702517271042,
0.04697573557496071,
-0.019218390807509422,
0.1093738004565239,
-0.04945459961891174,
0.008975437842309475,
-0.029000835493206978,
-0.025810901075601578,
-0.019931891933083534,
0.07686485350131989,
0.10582394897937775,
0.111417256295681,
0.11825107783079147,
-0.0905865803360939,
0.11221951246261597,
-0.1044091209769249,
0.002324817469343543,
-0.040079742670059204,
-0.021670889109373093,
-0.13117073476314545,
-0.07850200682878494,
0.06508927792310715,
-0.028580624610185623,
0.10725152492523193,
0.06372164934873581,
0.1194356307387352,
-0.04190471023321152,
-0.11527164280414581,
0.04532112926244736,
-0.00623207725584507,
0.2649080753326416,
0.05531515181064606,
0.03859454393386841,
-0.027972867712378502,
0.03397513926029205,
0.018375173211097717,
0.18547917902469635,
-0.015327326953411102,
0.1519388109445572,
0.016667552292346954,
0.0962967574596405,
0.099856898188591,
-0.048770707100629807,
-0.06290125846862793,
-0.07505650073289871,
-0.07106563448905945,
0.04334288090467453,
-0.08676939457654953,
0.12589643895626068,
0.1045212596654892,
-0.08505760133266449,
0.07589873671531677,
0.03854977712035179,
-0.0757995992898941,
-0.15183964371681213,
-0.12582311034202576,
-0.04684047773480415,
-0.1849779635667801,
0.014246850274503231,
-0.09884951263666153,
0.04252168536186218,
0.011376874521374702,
0.02855757810175419,
-0.05262996256351471,
0.15571708977222443,
0.014416740275919437,
-0.09707795083522797,
0.08600626140832901,
-0.0828855112195015,
0.043690476566553116,
-0.09390414506196976,
0.036676447838544846,
0.12166435271501541,
0.018576916307210922,
0.06945282965898514,
0.019281374290585518,
-0.0631401538848877,
0.009388347156345844,
-0.0756877213716507,
-0.05769098922610283,
-0.020142853260040283,
-0.026660416275262833,
0.05344371497631073,
0.15451382100582123,
0.10898727923631668,
-0.07777820527553558,
0.021617941558361053,
0.11837710440158844,
-0.0509149469435215,
-0.11975331604480743,
-0.13738369941711426,
0.15763390064239502,
0.06490462273359299,
0.05402887612581253,
-0.02830788493156433,
-0.06849127262830734,
-0.023084469139575958,
0.27680301666259766,
0.1912720650434494,
0.044901154935359955,
0.017665022984147072,
0.01853598654270172,
-0.002644364256411791,
-0.013315906748175621,
0.08355001360177994,
0.0708380937576294,
0.25381800532341003,
-0.023567907512187958,
0.004332014825195074,
-0.07591931521892548,
-0.07010769098997116,
-0.029567962512373924,
0.07077980041503906,
-0.08661263436079025,
-0.12058684974908829,
-0.005844658240675926,
0.14313842356204987,
-0.05628451332449913,
-0.06868640333414078,
-0.0548880435526371,
-0.09652840346097946,
-0.09655800461769104,
-0.026442797854542732,
0.0030755270272493362,
0.11320016533136368,
0.043624188750982285,
-0.05675826594233513,
0.022856049239635468,
0.16224655508995056,
-0.0011860219528898597,
-0.0746091976761818,
-0.10595368593931198,
-0.005579210352152586,
-0.19540122151374817,
0.03823770582675934,
-0.024311719462275505,
0.14819203317165375,
0.037786856293678284,
0.11317955702543259,
-0.02214505895972252,
0.14759007096290588,
-0.016144447028636932,
-0.04183141142129898,
0.03498166799545288,
0.08416726440191269,
-0.05973144248127937,
0.09366491436958313,
-0.010940229520201683,
-0.1560245007276535,
0.0856403186917305,
-0.10746916383504868,
-0.0009104267810471356,
-0.09310837090015411,
0.04385523870587349,
-0.05469338968396187,
0.08992397785186768,
0.0731281116604805,
-0.06390378624200821,
-0.05198633298277855,
-0.05173880606889725,
0.08713577687740326,
0.029077235609292984,
-0.03934509679675102,
-0.041679613292217255,
-0.25116464495658875,
-0.033950325101614,
-0.05673284828662872,
0.004511576611548662,
-0.1901203691959381,
0.016841191798448563,
-0.010185359045863152,
-0.08654651790857315,
-0.019777441397309303,
0.017261039465665817,
0.1015462875366211,
0.011691603809595108,
-0.0008532065548934042,
-0.008015512488782406,
0.047983963042497635,
0.11556485295295715,
-0.14229816198349,
-0.12244214862585068
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Marathi
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Marathi using a part of the [InterSpeech 2021 Marathi](https://navana-tech.github.io/IS21SS-indicASRchallenge/data.html) dataset. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi `sentence` and `path` fields:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# test_dataset = #TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For a sample, see the Colab link in the Training section.
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-2")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-2")
resampler = torchaudio.transforms.Resample(8_000, 16_000) # The original data has an 8,000 Hz sampling rate; change this to match your input.
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the test set of the Marathi data on InterSpeech-2021.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
# test_dataset = #TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For a sample, see the Colab link in the Training section.
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-2")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-2")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\'\�]'
resampler = torchaudio.transforms.Resample(8_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"),
attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 19.98 % (555 examples from test set were used for evaluation)
**Test Result on 10% of OpenSLR74 data**: 64.64 %
## Training
5000 examples of the InterSpeech Marathi dataset were used for training.
The colab notebook used for training can be found [here](https://colab.research.google.com/drive/1sIwGOLJPQqhKm_wVZDkzRuoJqAEgArFr?usp=sharing).
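For reference, here is a minimal sketch of how a local manifest of the InterSpeech 2021 Marathi data could be loaded into a `datasets.Dataset` with the `path` and `sentence` fields used in the snippets above. The file names and column layout are hypothetical; the actual data preparation is in the linked notebook.
```python
from datasets import load_dataset

# Hypothetical tab-separated manifests with `path` and `sentence` columns.
data_files = {"train": "marathi_is21_train.tsv", "test": "marathi_is21_test.tsv"}
dataset = load_dataset("csv", data_files=data_files, delimiter="\t")

train_dataset = dataset["train"]   # ~5000 examples were used for training
test_dataset = dataset["test"]     # 555 examples were used for evaluation
print(train_dataset.column_names)  # expected: ['path', 'sentence']
```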
|
{"language": "mr", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["interspeech_2021_asr"], "metrics": ["wer"], "model-index": [{"name": "XLSR Wav2Vec2 Large 53 Marathi 2 by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "InterSpeech 2021 ASR mr", "type": "interspeech_2021_asr"}, "metrics": [{"type": "wer", "value": 14.53, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-mr-2
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"mr",
"dataset:interspeech_2021_asr",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mr"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-interspeech_2021_asr #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Marathi
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using a part of the InterSpeech 2021 Marathi dataset. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:
## Evaluation
The model can be evaluated as follows on the test set of the Marathi data on InterSpeech-2021.
Test Result: 19.98 % (555 examples from test set were used for evaluation)
Test Result on 10% of OpenSLR74 data: 64.64 %
## Training
5000 examples of the InterSpeech Marathi dataset were used for training.
The colab notebook used for training can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using a part of the InterSpeech 2021 Marathi dataset. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on the test set of the Marathi data on InterSpeech-2021.\n\n\n\nTest Result: 19.98 % (555 examples from test set were used for evaluation)\n\nTest Result on 10% of OpenSLR74 data: 64.64 %",
"## Training\n\n5000 examples of the InterSpeech Marathi dataset were used for training. \nThe colab notebook used for training can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-interspeech_2021_asr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using a part of the InterSpeech 2021 Marathi dataset. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on the test set of the Marathi data on InterSpeech-2021.\n\n\n\nTest Result: 19.98 % (555 examples from test set were used for evaluation)\n\nTest Result on 10% of OpenSLR74 data: 64.64 %",
"## Training\n\n5000 examples of the InterSpeech Marathi dataset were used for training. \nThe colab notebook used for training can be found here."
] |
[
83,
72,
41,
61,
31
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-interspeech_2021_asr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using a part of the InterSpeech 2021 Marathi dataset. When using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:## Evaluation\n\nThe model can be evaluated as follows on the test set of the Marathi data on InterSpeech-2021.\n\n\n\nTest Result: 19.98 % (555 examples from test set were used for evaluation)\n\nTest Result on 10% of OpenSLR74 data: 64.64 %## Training\n\n5000 examples of the InterSpeech Marathi dataset were used for training. \nThe colab notebook used for training can be found here."
] |
[
-0.1624709814786911,
-0.0018343804404139519,
-0.0004367961664684117,
0.04166071116924286,
0.03658197820186615,
-0.06940620392560959,
0.1456833779811859,
0.1413646936416626,
-0.02770981565117836,
0.04792998358607292,
0.03679966926574707,
0.04129447788000107,
0.09137055277824402,
0.1338266134262085,
0.03245735168457031,
-0.13462328910827637,
-0.004899344407021999,
-0.013185129500925541,
-0.0657331794500351,
0.11556454747915268,
0.14472399652004242,
-0.07276853173971176,
0.028795603662729263,
0.08768903464078903,
-0.13858819007873535,
0.03572431206703186,
-0.033579785376787186,
-0.06159597635269165,
0.06734887510538101,
0.017315221950411797,
0.0864349752664566,
0.005576664116233587,
0.059678856283426285,
-0.1598309725522995,
0.020675327628850937,
0.0158065352588892,
0.02038343995809555,
0.005850270390510559,
0.10968232154846191,
-0.0027831753250211477,
-0.004467476159334183,
-0.012399646453559399,
-0.006741332355886698,
0.03787923976778984,
-0.0758955255150795,
-0.16917838156223297,
-0.08326907455921173,
-0.030392872169613838,
0.03215509653091431,
0.1855907440185547,
-0.059440404176712036,
0.1642870306968689,
-0.0316786915063858,
0.12133856117725372,
0.1262250393629074,
-0.17951226234436035,
-0.0028125436510890722,
0.032635465264320374,
0.05972440540790558,
0.09900019317865372,
-0.08766356110572815,
-0.032290372997522354,
0.05807223543524742,
-0.022447582334280014,
0.015226761810481548,
-0.020145444199442863,
-0.10005699843168259,
0.036554936319589615,
-0.11075996607542038,
0.0030151980463415384,
0.17227689921855927,
-0.04230877012014389,
-0.07415776699781418,
-0.13475461304187775,
-0.01785595342516899,
-0.005269842687994242,
0.02368089370429516,
-0.025039728730916977,
0.017654944211244583,
0.03924417495727539,
0.026910820975899696,
0.01300747413188219,
-0.11970145255327225,
-0.07170522958040237,
0.06384360790252686,
0.06973226368427277,
0.028754647821187973,
0.030723795294761658,
-0.08344791829586029,
0.0038245636969804764,
-0.03053823858499527,
-0.1258213073015213,
-0.07703214138746262,
-0.013425756245851517,
-0.07527666538953781,
0.0004398767196107656,
-0.07143920660018921,
-0.14498592913150787,
0.01952182501554489,
0.05768359452486038,
-0.025232596322894096,
0.054453302174806595,
-0.010142733342945576,
0.014249827712774277,
0.06768395006656647,
0.14284540712833405,
-0.01946166716516018,
-0.018554657697677612,
0.006795367691665888,
0.005156929139047861,
0.04902074113488197,
0.008878836408257484,
0.026997510343790054,
-0.009007488377392292,
-0.014642450027167797,
0.05424803867936134,
-0.039413075894117355,
-0.011669082567095757,
-0.03768853470683098,
-0.0034926095977425575,
0.07752979546785355,
-0.1383187174797058,
-0.03679569065570831,
0.003918620757758617,
-0.037856996059417725,
0.07184374332427979,
-0.0012245740508660674,
0.011432411149144173,
-0.061959825456142426,
-0.13820818066596985,
-0.014348185621201992,
0.059820860624313354,
-0.09822221845388412,
-0.09628382325172424,
-0.00186274410225451,
0.023031460121273994,
-0.023456117138266563,
-0.1186608150601387,
-0.22273953258991241,
-0.07842409610748291,
-0.021930932998657227,
-0.05636954307556152,
0.0981764867901802,
-0.05757942795753479,
0.02623598650097847,
-0.01698228158056736,
-0.0359349250793457,
0.038132842630147934,
-0.048243239521980286,
0.06374488770961761,
0.02070959471166134,
0.07396999001502991,
0.07595188170671463,
0.08201321214437485,
-0.1182962954044342,
-0.03200816363096237,
-0.0032328488305211067,
0.1339968591928482,
-0.09527188539505005,
-0.03454557806253433,
-0.09607566893100739,
-0.03447362780570984,
-0.05169914290308952,
0.027843475341796875,
0.10869255661964417,
0.12629887461662292,
-0.23311369121074677,
-0.05566621571779251,
0.1808800846338272,
-0.19721278548240662,
-0.10433319211006165,
0.1470838189125061,
0.004702181555330753,
0.13802491128444672,
0.05243309959769249,
0.22585922479629517,
0.040638137608766556,
-0.17038479447364807,
-0.08837252110242844,
0.01661347784101963,
0.004737661685794592,
-0.00020390667486935854,
0.11693426966667175,
-0.021861951798200607,
0.06683124601840973,
0.0037401497829705477,
-0.047182921320199966,
0.054115768522024155,
-0.03570026159286499,
-0.04561867192387581,
-0.02937854267656803,
-0.04827718064188957,
-0.01902960054576397,
0.0428035594522953,
0.022367922589182854,
-0.048151347786188126,
-0.04706062003970146,
-0.02768881246447563,
0.12821875512599945,
-0.09282860159873962,
0.05849282070994377,
-0.08788318932056427,
0.14611977338790894,
-0.029929157346487045,
-0.0540631003677845,
-0.19918058812618256,
0.09391500800848007,
0.04300325736403465,
-0.006000859197229147,
0.005763014312833548,
0.06017424538731575,
0.03497805446386337,
0.025968099012970924,
0.01161274779587984,
-0.020359693095088005,
0.03962392359972,
-0.026880696415901184,
-0.08829427510499954,
-0.12565605342388153,
0.014513571746647358,
-0.08287876844406128,
0.1795106828212738,
-0.16770926117897034,
0.01516135223209858,
0.02421499416232109,
0.0521332323551178,
0.001637473120354116,
-0.07750663161277771,
0.07574160397052765,
0.04407825693488121,
-0.0068880184553563595,
-0.046761404722929,
0.020811310037970543,
0.06999015808105469,
0.033846158534288406,
0.08636432886123657,
-0.07121627032756805,
-0.11724977940320969,
0.07492320984601974,
-0.08307913690805435,
-0.065434031188488,
-0.07607340067625046,
-0.016613783314824104,
-0.03264632076025009,
-0.09902113676071167,
-0.0058082942850887775,
0.22634610533714294,
-0.02071996219456196,
0.1533736139535904,
-0.08389171957969666,
-0.04248865321278572,
-0.04400181397795677,
-0.06441979855298996,
-0.003168690949678421,
0.06402934342622757,
-0.062144968658685684,
-0.07356090098619461,
0.10250455141067505,
0.0011237236903980374,
-0.029417240992188454,
0.1772572100162506,
-0.03191071376204491,
-0.11277015507221222,
0.008573547005653381,
0.06090665981173515,
-0.03746672719717026,
0.0714351236820221,
-0.16461032629013062,
-0.027304913848638535,
0.049931399524211884,
0.0016318683046847582,
0.03473927080631256,
-0.05651245266199112,
0.03429345786571503,
0.027061166241765022,
-0.04077354818582535,
-0.08518063277006149,
0.0749744400382042,
-0.047873951494693756,
0.041185855865478516,
-0.0663679838180542,
0.07962426543235779,
-0.046896349638700485,
-0.046834882348775864,
-0.18034668266773224,
0.168991819024086,
-0.13750432431697845,
-0.24229611456394196,
-0.13802063465118408,
0.0012902551097795367,
0.07895511388778687,
-0.023784814402461052,
0.07872796803712845,
-0.17223447561264038,
-0.07280384749174118,
-0.0748264268040657,
0.051360681653022766,
0.023622700944542885,
-0.047994647175073624,
0.007626143284142017,
-0.019350839778780937,
0.00913029070943594,
-0.18068625032901764,
0.03767292946577072,
0.017655083909630775,
0.053534019738435745,
0.05085272341966629,
-0.020782500505447388,
-0.002237253123894334,
0.048801373690366745,
-0.014569534920156002,
0.015494298189878464,
-0.009707926772534847,
0.18719996511936188,
-0.09639324992895126,
0.04127874597907066,
0.16093400120735168,
0.004968961235135794,
-0.0036411697510629892,
0.034642163664102554,
-0.04294702410697937,
-0.061161674559116364,
0.004238949157297611,
-0.043207164853811264,
-0.016142472624778748,
-0.29080548882484436,
-0.0890243873000145,
-0.04825517535209656,
-0.030555259436368942,
-0.009556926786899567,
-0.01239748764783144,
0.039998941123485565,
0.05632416531443596,
-0.030380139127373695,
-0.03208030387759209,
0.005907885264605284,
0.05342275649309158,
0.08923232555389404,
-0.025325389578938484,
0.1018543615937233,
-0.0652749240398407,
0.07488331198692322,
0.04997057467699051,
-0.034704700112342834,
0.17677609622478485,
-0.02703084424138069,
0.1642584204673767,
0.07864993065595627,
0.1525135040283203,
0.056512631475925446,
0.05585089698433876,
-0.06211196631193161,
-0.021788345649838448,
0.04641426354646683,
-0.07293815910816193,
-0.07859593629837036,
0.06344692409038544,
0.026754748076200485,
-0.021897749975323677,
-0.006666194181889296,
-0.011110294610261917,
0.018533717840909958,
0.11512614041566849,
0.0912865698337555,
-0.20136642456054688,
-0.09239310771226883,
-0.026680583134293556,
-0.049452442675828934,
-0.08498355001211166,
0.008720478042960167,
0.1237698346376419,
-0.07851103693246841,
0.017355095595121384,
-0.026719383895397186,
0.06164969876408577,
0.010695894248783588,
-0.028856562450528145,
-0.025552604347467422,
-0.018203938379883766,
-0.027859261259436607,
0.07198743522167206,
-0.17089244723320007,
0.1862354576587677,
0.04143017902970314,
0.06690502911806107,
-0.03468763083219528,
0.026417190209031105,
0.018007563427090645,
-0.012475816532969475,
0.09580133855342865,
0.014831028878688812,
-0.03036913461983204,
-0.10525475442409515,
-0.07355232536792755,
0.004052671138197184,
0.06769128888845444,
0.0965857058763504,
0.08491731435060501,
-0.021054524928331375,
0.040406644344329834,
-0.012862893752753735,
-0.07560639083385468,
-0.13564272224903107,
-0.08082744479179382,
0.03887985646724701,
0.016439832746982574,
0.08020307868719101,
-0.04457363486289978,
-0.10039553791284561,
-0.17172691226005554,
0.16406813263893127,
-0.14714542031288147,
-0.10701457411050797,
-0.08904723823070526,
0.15012764930725098,
0.05548150837421417,
-0.05824460834264755,
0.03177236020565033,
0.016494741663336754,
0.10106980800628662,
0.006466158665716648,
-0.013107096776366234,
-0.03124547004699707,
-0.08539637923240662,
-0.07789038866758347,
-0.02926214598119259,
0.08340729773044586,
0.10116355121135712,
0.02268877439200878,
0.05378122255206108,
0.06094270572066307,
-0.022752687335014343,
-0.0354071706533432,
-0.001775194425135851,
0.05281222239136696,
0.023850200697779655,
0.06608173996210098,
-0.04626918211579323,
-0.11480315774679184,
-0.06934225559234619,
-0.1311485320329666,
0.15589693188667297,
0.12253604084253311,
-0.02212142013013363,
0.14289192855358124,
0.19241230189800262,
-0.11987028270959854,
-0.17317712306976318,
0.05604192614555359,
0.07282224297523499,
0.11034157872200012,
0.012565035372972488,
-0.19207295775413513,
0.06694739311933517,
0.07900195568799973,
-0.0054582892917096615,
-0.06891082227230072,
-0.1472683548927307,
-0.1205567717552185,
0.06279438734054565,
0.06153807416558266,
0.0247124582529068,
-0.10536058992147446,
-0.049216002225875854,
0.03360472247004509,
0.021170971915125847,
-0.10636186599731445,
-0.039595961570739746,
0.06105814874172211,
-0.023104701191186905,
-0.023597201332449913,
0.038449667394161224,
-0.04071220010519028,
0.09075390547513962,
0.011542181484401226,
0.013229300267994404,
-0.0563664436340332,
0.12860475480556488,
0.1275758296251297,
0.02331925369799137,
0.1783532202243805,
-0.042444486171007156,
0.06700317561626434,
-0.16200678050518036,
-0.04053601250052452,
-0.03628961741924286,
0.021681107580661774,
-0.007687770761549473,
-0.04833231121301651,
-0.0693616047501564,
0.05913706123828888,
0.030712351202964783,
-0.0320042222738266,
-0.03212808072566986,
-0.03539268672466278,
0.014185006730258465,
0.13030283153057098,
0.12311413139104843,
0.04961889237165451,
-0.022803038358688354,
-0.013166210614144802,
0.0022488911636173725,
0.049489643424749374,
-0.11692588776350021,
-0.018572282046079636,
0.05082865431904793,
0.07355296611785889,
0.07424329221248627,
-0.01645851694047451,
-0.10008129477500916,
0.06176034361124039,
0.046267520636320114,
0.007091438863426447,
-0.026802709326148033,
-0.007725378032773733,
0.030434925109148026,
-0.03435204550623894,
-0.013051792047917843,
0.11892197281122208,
-0.05804534628987312,
-0.011227556504309177,
-0.04299608990550041,
0.039586517959833145,
-0.04199521988630295,
0.1894313544034958,
0.027028843760490417,
0.04993736743927002,
-0.08951827883720398,
0.052690550684928894,
0.027752090245485306,
-0.07181138545274734,
0.04906028136610985,
0.047140028327703476,
-0.09429957717657089,
-0.05884367227554321,
0.04354695603251457,
0.07363588362932205,
0.0448247455060482,
-0.04928415268659592,
-0.08024433255195618,
-0.04297541081905365,
0.016224291175603867,
0.15912872552871704,
0.057087577879428864,
0.000021240619389573112,
-0.03767566382884979,
-0.03232308477163315,
-0.11459802091121674,
0.11226680129766464,
0.03621792048215866,
-0.0098591772839427,
-0.029518021270632744,
0.09562144428491592,
0.02526364102959633,
0.009437989443540573,
-0.03207295015454292,
-0.08997436612844467,
-0.03490652143955231,
0.029667437076568604,
-0.0773739218711853,
0.018932057544589043,
-0.08417122066020966,
-0.04300984367728233,
-0.04371466487646103,
-0.030797243118286133,
0.02440137043595314,
0.06869658082723618,
-0.018451208248734474,
0.027031537145376205,
-0.03298506140708923,
0.09404973685741425,
-0.14020393788814545,
-0.01191173680126667,
0.03170742839574814,
-0.07828278839588165,
0.058212053030729294,
0.10882949829101562,
-0.049427129328250885,
0.04652198776602745,
-0.07910139113664627,
-0.00007624222052982077,
0.011469266377389431,
0.09373804181814194,
-0.002471317769959569,
-0.20569318532943726,
0.018827781081199646,
0.03217138350009918,
0.0004669215704780072,
0.052098896354436874,
0.19462433457374573,
0.0017360388301312923,
0.06560086458921432,
-0.1156066358089447,
0.015976347029209137,
-0.08035285770893097,
0.11450260132551193,
0.040664173662662506,
0.08220849931240082,
0.1443488895893097,
-0.0857362225651741,
0.048024099320173264,
-0.13547608256340027,
0.005886617116630077,
-0.000648347195237875,
-0.012450838461518288,
-0.032892562448978424,
-0.07631149142980576,
0.07293318212032318,
-0.025464266538619995,
0.16202957928180695,
0.008732329122722149,
-0.004476952366530895,
0.05382449924945831,
-0.11969873309135437,
0.07671527564525604,
-0.005514401476830244,
0.1593851000070572,
0.04461738467216492,
0.03446905314922333,
-0.02297799475491047,
0.026111185550689697,
0.042577121406793594,
0.08562110364437103,
0.02077476866543293,
0.28394201397895813,
-0.04315508157014847,
0.07902567088603973,
-0.02625388279557228,
-0.1519012451171875,
-0.05985506996512413,
-0.07490622997283936,
-0.14213435351848602,
0.01301606185734272,
-0.005529071670025587,
0.18606621026992798,
0.15347564220428467,
-0.15666165947914124,
0.059823568910360336,
-0.03198041021823883,
-0.10033363848924637,
-0.05167125165462494,
-0.043183472007513046,
-0.0383005253970623,
-0.09774498641490936,
0.023895107209682465,
-0.10398739576339722,
0.07085605710744858,
0.06636574119329453,
0.08493690192699432,
0.008346003480255604,
0.15410959720611572,
-0.035756614059209824,
-0.07381218671798706,
0.03675360977649689,
-0.0518454909324646,
0.017209989950060844,
-0.0665762647986412,
0.0007090848521329463,
0.07351616024971008,
-0.033020175993442535,
0.040555112063884735,
0.04182133078575134,
0.0469655767083168,
0.013354870490729809,
-0.04535670578479767,
-0.06331118196249008,
-0.001312972279265523,
0.027249472215771675,
0.04294928163290024,
0.1555510014295578,
0.09233036637306213,
0.006006198935210705,
0.004508778918534517,
0.07495786994695663,
-0.002957593882456422,
-0.08179760724306107,
-0.15453889966011047,
-0.006933803204447031,
0.03040359541773796,
-0.026890084147453308,
0.039708059281110764,
-0.05571815371513367,
0.016453664749860764,
0.24379314482212067,
0.08716987073421478,
0.04332904517650604,
-0.0014704500790685415,
0.00792825035750866,
-0.016145404428243637,
-0.07139058411121368,
0.09323633462190628,
0.07417742162942886,
0.14690858125686646,
-0.07684945315122604,
-0.04757033661007881,
-0.111111581325531,
-0.09403146058320999,
-0.06899479031562805,
0.05164198949933052,
-0.0716906264424324,
-0.08597897738218307,
-0.014232669025659561,
0.12158665060997009,
0.024206409230828285,
-0.1412951499223709,
0.0005666365032084286,
-0.06573978066444397,
-0.11612362414598465,
-0.01865413784980774,
-0.04059022292494774,
0.03292560577392578,
0.009796669706702232,
0.020395055413246155,
0.04939545318484306,
0.1582852005958557,
0.029551822692155838,
-0.020021382719278336,
-0.08025915175676346,
0.08559879660606384,
-0.18816113471984863,
0.012947076000273228,
-0.0022963692899793386,
0.24098484218120575,
0.07211942970752716,
0.091423898935318,
-0.06479231268167496,
0.06904123723506927,
0.0383826345205307,
-0.04384928569197655,
0.006071758456528187,
0.12138456851243973,
0.0008506689919158816,
0.1317044496536255,
0.052212703973054886,
0.0018449811032041907,
0.0789875015616417,
-0.0181607473641634,
-0.017663054168224335,
-0.10945679247379303,
0.04590383544564247,
-0.033078018575906754,
0.13455307483673096,
0.11534343659877777,
-0.04818280413746834,
-0.04264123737812042,
-0.03384888917207718,
0.031049618497490883,
0.04207790642976761,
-0.00463944161310792,
-0.02611435018479824,
-0.2855587601661682,
0.017505085095763206,
-0.07573609054088593,
0.027955960482358932,
-0.22271601855754852,
-0.06970133632421494,
0.02215900458395481,
-0.0799780860543251,
0.01405070535838604,
0.10287691652774811,
0.04913623631000519,
0.015184607356786728,
0.010634914971888065,
-0.02004736289381981,
0.04080065339803696,
0.10291452705860138,
-0.11752866208553314,
-0.09959937632083893
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Marathi
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Marathi using the [OpenSLR SLR64](http://openslr.org/64/) dataset and [InterSpeech 2021](https://navana-tech.github.io/IS21SS-indicASRchallenge/data.html) Marathi datasets. Note that the OpenSLR data contains only female voices. Please keep this in mind before using the model for your task. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi `text` and `audio_path` fields:
```python
import torch
import torchaudio
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# test_data = #TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For a sample, see the Colab link in the Training section.
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-3")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-3")
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["audio_path"])
batch["speech"] = librosa.resample(speech_array[0].numpy(), sampling_rate, 16_000) # sampling_rate can vary
return batch
test_data= test_data.map(speech_file_to_array_fn)
inputs = processor(test_data["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_data["text"][:2])
```
## Evaluation
The model can be evaluated as follows on 10% of the Marathi data on OpenSLR.
```python
import torch
import torchaudio
import librosa
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
# test_data = #TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For a sample, see the Colab link in the Training section.
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-3")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr-3")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\–\…]'
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["text"] = re.sub(chars_to_ignore_regex, '', batch["text"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["audio_path"])
batch["speech"] = librosa.resample(speech_array[0].numpy(), sampling_rate, 16_000)
return batch
test_data= test_data.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_data.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["text"])))
```
**Test Result**: 19.05 % (157+157 examples)
**Test Result on OpenSLR test**: 14.15 % (157 examples)
**Test Results on InterSpeech test**: 27.14 % (157 examples)
## Training
1412 examples of the OpenSLR Marathi dataset and 1412 examples of InterSpeech 2021 Marathi ASR dataset were used for training. For testing, 157 examples from each were used.
The colab notebook used for training and evaluation can be found [here](https://colab.research.google.com/drive/15fUhb4bUFFGJyNLr-_alvPxVX4w0YXRu?usp=sharing).
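As a rough illustration of the split described above, the two sources could be combined as follows. The manifest file names and column names (`audio_path`, `text`) are hypothetical; the exact preparation is in the linked notebook.
```python
from datasets import load_dataset, concatenate_datasets

# Hypothetical tab-separated manifests with `audio_path` and `text` columns.
openslr = load_dataset("csv", data_files="openslr_slr64_mr.tsv", delimiter="\t")["train"]
interspeech = load_dataset("csv", data_files="interspeech_2021_mr.tsv", delimiter="\t")["train"]

openslr = openslr.shuffle(seed=42)
interspeech = interspeech.shuffle(seed=42)

# 1412 training examples and 157 test examples from each source, as described above.
train_data = concatenate_datasets([openslr.select(range(1412)),
                                   interspeech.select(range(1412))])
test_data = concatenate_datasets([openslr.select(range(1412, 1569)),
                                  interspeech.select(range(1412, 1569))])
print(len(train_data), len(test_data))  # 2824 314
```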
|
{"language": "mr", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["openslr", "interspeech_2021_asr"], "metrics": ["wer"], "model-index": [{"name": "XLSR Wav2Vec2 Large 53 Marathi by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "OpenSLR mr, InterSpeech 2021 ASR mr", "type": "openslr, interspeech_2021_asr"}, "metrics": [{"type": "wer", "value": 19.05, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-mr-3
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"mr",
"dataset:openslr",
"dataset:interspeech_2021_asr",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mr"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-openslr #dataset-interspeech_2021_asr #license-apache-2.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Marathi
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset and InterSpeech 2021 Marathi datasets. Note that the OpenSLR data contains only female voices. Please keep this in mind before using the model for your task. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'text' and 'audio_path' fields:
## Evaluation
The model can be evaluated as follows on 10% of the Marathi data on OpenSLR.
Test Result: 19.05 % (157+157 examples)
Test Result on OpenSLR test: 14.15 % (157 examples)
Test Results on InterSpeech test: 27.14 % (157 examples)
## Training
1412 examples of the OpenSLR Marathi dataset and 1412 examples of InterSpeech 2021 Marathi ASR dataset were used for training. For testing, 157 examples from each were used.
The colab notebook used for training and evaluation can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset and InterSpeech 2021 Marathi datasets. Note that this data OpenSLR contains only female voices. Please keep this in mind before using the model for your task. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'text' and 'audio_path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 19.05 % (157+157 examples)\n \nTest Result on OpenSLR test: 14.15 % (157 examples)\n\nTest Results on InterSpeech test: 27.14 % (157 examples)",
"## Training\n\n1412 examples of the OpenSLR Marathi dataset and 1412 examples of InterSpeech 2021 Marathi ASR dataset were used for training. For testing, 157 examples from each were used.\n\nThe colab notebook used for training and evaluation can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-openslr #dataset-interspeech_2021_asr #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset and InterSpeech 2021 Marathi datasets. Note that this data OpenSLR contains only female voices. Please keep this in mind before using the model for your task. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'text' and 'audio_path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 19.05 % (157+157 examples)\n \nTest Result on OpenSLR test: 14.15 % (157 examples)\n\nTest Results on InterSpeech test: 27.14 % (157 examples)",
"## Training\n\n1412 examples of the OpenSLR Marathi dataset and 1412 examples of InterSpeech 2021 Marathi ASR dataset were used for training. For testing, 157 examples from each were used.\n\nThe colab notebook used for training and evaluation can be found here."
] |
[
86,
106,
43,
69,
60
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-openslr #dataset-interspeech_2021_asr #license-apache-2.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset and InterSpeech 2021 Marathi datasets. Note that this data OpenSLR contains only female voices. Please keep this in mind before using the model for your task. When using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'text' and 'audio_path' fields:## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 19.05 % (157+157 examples)\n \nTest Result on OpenSLR test: 14.15 % (157 examples)\n\nTest Results on InterSpeech test: 27.14 % (157 examples)## Training\n\n1412 examples of the OpenSLR Marathi dataset and 1412 examples of InterSpeech 2021 Marathi ASR dataset were used for training. For testing, 157 examples from each were used.\n\nThe colab notebook used for training and evaluation can be found here."
] |
[
-0.1628420054912567,
0.05496589094400406,
-0.0034707558806985617,
0.049603965133428574,
0.033781684935092926,
-0.04522339999675751,
0.10832417756319046,
0.11840904504060745,
0.0007357836584560573,
0.07873206585645676,
0.042994603514671326,
0.02489572763442993,
0.05856522545218468,
0.11255303025245667,
-0.03281340375542641,
-0.09497786313295364,
-0.009071850217878819,
-0.06268967688083649,
-0.09189172834157944,
0.082835853099823,
0.15687718987464905,
-0.035536281764507294,
0.043722886592149734,
0.0862714946269989,
-0.04418887570500374,
0.039160292595624924,
-0.003669087076559663,
-0.04038313031196594,
0.048310231417417526,
0.03922683373093605,
0.06067712977528572,
-0.00031204140395857394,
0.02025560662150383,
-0.18043172359466553,
0.01690629869699478,
0.016698438674211502,
0.027941638603806496,
-0.0012047981144860387,
0.1511373221874237,
-0.018884947523474693,
0.016046589240431786,
-0.05158150941133499,
0.01317108329385519,
0.05341588333249092,
-0.04242303594946861,
-0.17866505682468414,
-0.12863951921463013,
-0.05730455741286278,
0.004396144300699234,
0.1654185652732849,
-0.05078307166695595,
0.13406050205230713,
-0.023714395239949226,
0.12008631974458694,
0.10712562501430511,
-0.08746614307165146,
-0.017305413261055946,
-0.011681546457111835,
0.04638180881738663,
0.08597691357135773,
-0.06159323453903198,
-0.04404569789767265,
0.061588846147060394,
-0.012406136840581894,
-0.044608112424612045,
-0.021812545135617256,
-0.05011700838804245,
0.031177757307887077,
-0.11047685146331787,
-0.01355674397200346,
0.19517245888710022,
-0.034411851316690445,
-0.09950073063373566,
-0.18334099650382996,
-0.055602625012397766,
0.06676352769136429,
0.05908389389514923,
0.007832101546227932,
0.0034767386969178915,
0.0440061129629612,
0.05997738614678383,
-0.007785087451338768,
-0.10187065601348877,
-0.047156065702438354,
0.03662608191370964,
0.04269467666745186,
0.031201910227537155,
0.024596331641077995,
-0.05138382315635681,
-0.0250465776771307,
-0.09492413699626923,
-0.14954428374767303,
-0.08025804162025452,
-0.03896448388695717,
-0.07942512631416321,
-0.011468769051134586,
-0.0723474845290184,
-0.276008665561676,
0.08469873666763306,
0.08681255578994751,
-0.05671229213476181,
0.06905976682901382,
-0.09722672402858734,
-0.014491291716694832,
0.060235220938920975,
0.13687609136104584,
-0.02133302204310894,
0.054738711565732956,
0.017823467031121254,
-0.011970993131399155,
0.06554093211889267,
0.0034266184084117413,
0.07854414731264114,
-0.0357377789914608,
-0.024959085509181023,
0.06403201073408127,
0.0015491218073293567,
-0.016650820150971413,
-0.03311857208609581,
0.02417399361729622,
0.09617940336465836,
-0.16634102165699005,
-0.0008489892934449017,
0.006811388302594423,
-0.00401041004806757,
0.026155291125178337,
0.014873170293867588,
0.014854803681373596,
-0.029937243089079857,
-0.0833040326833725,
-0.030891040340065956,
0.020711390301585197,
-0.08587586879730225,
-0.06319884955883026,
0.034331437200307846,
0.05062253400683403,
-0.031008558347821236,
-0.08348668366670609,
-0.20974266529083252,
-0.07283101230859756,
-0.009866447187960148,
-0.0685749426484108,
0.05876769497990608,
0.01666201278567314,
-0.005593597888946533,
0.017416922375559807,
-0.009783612564206123,
0.07016301900148392,
-0.02405036799609661,
0.06573719531297684,
0.06308448314666748,
0.036620136350393295,
0.12180396169424057,
0.060997724533081055,
-0.10309306532144547,
0.038696978241205215,
-0.06798171252012253,
0.1781609207391739,
-0.1478111445903778,
-0.07671153545379639,
-0.10464166849851608,
-0.006515856366604567,
-0.0394565612077713,
0.016208497807383537,
0.10606316477060318,
0.1276710033416748,
-0.18639232218265533,
-0.01017430517822504,
0.18752406537532806,
-0.19557492434978485,
-0.0964026227593422,
0.15347062051296234,
0.005803671665489674,
0.051240600645542145,
0.011758189648389816,
0.18069760501384735,
0.10197135806083679,
-0.17282384634017944,
-0.06309708207845688,
0.021470440551638603,
0.02398764342069626,
0.11635711789131165,
0.13428060710430145,
-0.0713273212313652,
0.06895621865987778,
-0.02748880162835121,
-0.04462876915931702,
0.023267827928066254,
-0.017056254670023918,
-0.036684565246105194,
-0.06073728948831558,
-0.05980917811393738,
-0.01428605429828167,
0.025828339159488678,
-0.02916828915476799,
-0.04183189198374748,
-0.08654490113258362,
-0.0793163925409317,
0.11286661028862,
-0.07360372692346573,
0.06304285675287247,
-0.10709031671285629,
0.14876259863376617,
-0.07014945149421692,
-0.06645085662603378,
-0.1954987496137619,
0.022212054580450058,
0.048580363392829895,
-0.09494232386350632,
-0.004976003896445036,
0.007143047638237476,
0.03139297291636467,
0.017963817343115807,
-0.03344810754060745,
-0.008308540098369122,
0.06212138757109642,
-0.03693009912967682,
-0.06328001618385315,
-0.1634155660867691,
0.011971302330493927,
-0.06685925275087357,
0.15644699335098267,
-0.17359673976898193,
0.001278178533539176,
0.10232563316822052,
0.06282375007867813,
0.012902557849884033,
-0.08753165602684021,
0.08941025286912918,
0.00851088110357523,
0.013858145102858543,
-0.05122421309351921,
-0.04519316181540489,
0.04096481204032898,
0.000021002604626119137,
0.11040768027305603,
-0.06892397254705429,
-0.09135767817497253,
0.012444966472685337,
0.007279383018612862,
-0.08699927479028702,
-0.010235952213406563,
-0.0245108250528574,
-0.032717667520046234,
-0.15231716632843018,
-0.04238985851407051,
0.13563136756420135,
-0.0034648997243493795,
0.12453149259090424,
-0.09176237136125565,
-0.02541421167552471,
-0.010003594681620598,
-0.08637981861829758,
-0.026315972208976746,
0.0938120186328888,
-0.04849240183830261,
-0.07817745953798294,
0.07853764295578003,
-0.08450499176979065,
-0.06447190791368484,
0.1299782246351242,
-0.059874989092350006,
-0.1116853728890419,
0.0053501310758292675,
0.08175728470087051,
-0.017464665696024895,
0.04620887711644173,
-0.13323752582073212,
0.058057814836502075,
0.06296773999929428,
0.016784464940428734,
0.038764018565416336,
-0.03544825315475464,
0.018919959664344788,
0.045823510736227036,
-0.0170760415494442,
-0.08929811418056488,
0.1263824701309204,
-0.015362752601504326,
0.042352937161922455,
-0.06035329028964043,
0.1166517436504364,
-0.021631993353366852,
-0.0642073005437851,
-0.21188774704933167,
0.14567892253398895,
-0.1226322129368782,
-0.15278835594654083,
-0.17420905828475952,
0.013370895758271217,
0.03544485941529274,
-0.03219529613852501,
0.07470272481441498,
-0.12801870703697205,
-0.11105083674192429,
-0.09600678086280823,
0.0543757863342762,
0.027909427881240845,
-0.05221681669354439,
0.018017766997218132,
-0.034433674067258835,
0.038385018706321716,
-0.17198596894741058,
0.032873913645744324,
0.0667712464928627,
0.045905809849500656,
0.02634197287261486,
0.011871365830302238,
0.04798270761966705,
0.029461495578289032,
-0.012161198072135448,
-0.0038956981152296066,
0.028874993324279785,
0.1957859992980957,
-0.11497609317302704,
0.04677872732281685,
0.19416257739067078,
-0.059792786836624146,
0.031598933041095734,
0.042102571576833725,
-0.04709862172603607,
-0.03917315974831581,
0.011861289851367474,
0.004740825388580561,
-0.014670447446405888,
-0.34903883934020996,
-0.06740321964025497,
-0.07704591006040573,
-0.0007429021643474698,
-0.05272173881530762,
-0.01899135671555996,
0.047804705798625946,
0.05555422231554985,
-0.037641845643520355,
-0.05654577165842056,
0.01132290344685316,
0.03976866602897644,
0.17849530279636383,
0.0012755927164107561,
0.07767131924629211,
-0.0939963087439537,
0.07975668460130692,
0.07070865482091904,
-0.08212482929229736,
0.1453223079442978,
-0.06395959109067917,
0.129757359623909,
0.11217471957206726,
0.10800302773714066,
0.02078644558787346,
0.028344668447971344,
-0.08311338722705841,
0.006733612157404423,
0.05772887170314789,
-0.0901695117354393,
-0.0723467692732811,
0.07965531945228577,
-0.0016772912349551916,
-0.05183665081858635,
0.016043540090322495,
0.03108014352619648,
-0.003334963694214821,
0.08182230591773987,
0.07138071954250336,
-0.11871258914470673,
-0.06880944222211838,
-0.010213013738393784,
-0.11876262724399567,
-0.0985535979270935,
-0.011660216376185417,
0.13512122631072998,
-0.09048497676849365,
0.09208256751298904,
-0.03128475695848465,
0.059415802359580994,
0.06730629503726959,
-0.05389408394694328,
-0.01829843409359455,
0.007654333487153053,
-0.038618531078100204,
0.08346788585186005,
-0.217014878988266,
0.12995170056819916,
0.0357065349817276,
0.042783696204423904,
-0.04548230022192001,
0.06727645546197891,
0.028901034966111183,
-0.05398176982998848,
0.07664558291435242,
0.009494802914559841,
-0.06595005095005035,
-0.0003260950034018606,
-0.029781008139252663,
0.009685094468295574,
0.07579916715621948,
0.09302956610918045,
0.09943103790283203,
-0.034002456814050674,
0.030894257128238678,
-0.0026818825863301754,
0.056035351008176804,
-0.12964829802513123,
-0.10550443083047867,
0.05464042350649834,
-0.029800767078995705,
0.047050125896930695,
-0.05780470371246338,
-0.06763076037168503,
-0.15709900856018066,
0.08763863146305084,
-0.1422184705734253,
-0.11171183735132217,
-0.10575530678033829,
0.08158876746892929,
0.13473671674728394,
-0.06254420429468155,
0.055310893803834915,
-0.007707392796874046,
0.1354030966758728,
-0.019061099737882614,
0.0208041463047266,
-0.04784993827342987,
-0.054050203412771225,
-0.17917561531066895,
-0.009778589941561222,
0.08616473525762558,
0.07895949482917786,
0.029238708317279816,
0.025561193004250526,
0.09743843227624893,
-0.010096495039761066,
-0.02581154741346836,
0.026756657287478447,
0.0525674894452095,
0.009759427979588509,
0.09168913960456848,
0.014459514990448952,
-0.11483392119407654,
-0.04585029557347298,
-0.1804359406232834,
0.10551384836435318,
0.11555743962526321,
0.005131994374096394,
0.12478827685117722,
0.08391863107681274,
-0.12443961948156357,
-0.15000422298908234,
0.06043984740972519,
0.09308134764432907,
0.06637689471244812,
0.05423852428793907,
-0.17196007072925568,
0.09181396663188934,
0.043140560388565063,
-0.018991654738783836,
-0.04411603882908821,
-0.11073528230190277,
-0.13661514222621918,
0.03351125121116638,
0.018399691209197044,
-0.09616164863109589,
-0.1226217970252037,
-0.05869266390800476,
-0.00583610637113452,
0.000354903721017763,
-0.11410635709762573,
-0.0023220067378133535,
0.029331538826227188,
-0.004001022782176733,
0.03188997879624367,
0.06976312398910522,
-0.05670392885804176,
0.10662030428647995,
0.0029301894828677177,
0.018107308074831963,
-0.05292069539427757,
0.15827994048595428,
0.10015267878770828,
0.031449563801288605,
0.18758241832256317,
-0.02935843914747238,
0.06750012189149857,
-0.16232039034366608,
-0.04967598617076874,
-0.020232142880558968,
0.0191318541765213,
-0.01719284988939762,
-0.0315825380384922,
-0.06601779907941818,
0.04299868270754814,
0.06755219399929047,
-0.04357871040701866,
-0.014075580053031445,
-0.06263115257024765,
0.019508477300405502,
0.12070981413125992,
0.19989359378814697,
0.13540324568748474,
-0.012784555554389954,
-0.011780984699726105,
0.005600036587566137,
0.020612772554159164,
-0.16034989058971405,
0.04298694059252739,
0.026178894564509392,
0.0852426066994667,
0.06585649400949478,
-0.04721221327781677,
-0.1156296655535698,
0.049545980989933014,
0.04165190830826759,
0.00842334982007742,
-0.05051993951201439,
0.0031778085976839066,
0.07579541951417923,
-0.0990438312292099,
-0.07639575749635696,
0.12182590365409851,
-0.013136366382241249,
-0.003240833757445216,
-0.04089635983109474,
0.09107048809528351,
-0.03353254869580269,
0.17496562004089355,
0.06665287166833878,
0.05771774798631668,
-0.06115340813994408,
0.04221648350358009,
0.013203654438257217,
-0.11840720474720001,
0.07204759120941162,
0.06806065887212753,
-0.042389094829559326,
-0.0467742383480072,
0.03609937056899071,
0.08691735565662384,
0.016372451558709145,
-0.051865965127944946,
-0.05098944529891014,
-0.05788961425423622,
0.03173625096678734,
0.2408369928598404,
0.035275693982839584,
0.04105337709188461,
-0.0022866635117679834,
-0.027246590703725815,
-0.12098269164562225,
0.1506129950284958,
0.06511782854795456,
0.01295692753046751,
-0.06534601747989655,
0.03976909816265106,
-0.019783014431595802,
-0.014833984896540642,
-0.01966395601630211,
-0.02600056864321232,
-0.014116388745605946,
0.01847100257873535,
-0.16304410994052887,
0.05272616073489189,
-0.04673168808221817,
-0.022903302684426308,
-0.03925726190209389,
-0.03884848579764366,
0.013586586341261864,
0.05922478064894676,
-0.0005453944904729724,
-0.0017300705658271909,
-0.0337417908012867,
0.09645930677652359,
-0.15371623635292053,
-0.0024986944627016783,
0.07633598148822784,
-0.09546366333961487,
0.05694066360592842,
0.06570432335138321,
-0.07994885742664337,
-0.007903552614152431,
-0.08915247768163681,
0.02786126546561718,
-0.015077164396643639,
0.08262158185243607,
-0.059093113988637924,
-0.24259960651397705,
0.01930590532720089,
0.0336221419274807,
-0.020708084106445312,
0.06466823071241379,
0.1743685007095337,
-0.022441841661930084,
0.13644666969776154,
-0.07547494769096375,
-0.02189396135509014,
-0.08924663066864014,
0.1062081828713417,
0.062096722424030304,
0.061814602464437485,
0.11083714663982391,
-0.06632333248853683,
0.08508811146020889,
-0.14318770170211792,
0.00013872218551114202,
0.016801152378320694,
0.027616506442427635,
-0.011065905913710594,
-0.02919766679406166,
0.08183562755584717,
-0.044090405106544495,
0.2016621232032776,
0.027603331953287125,
-0.03468763455748558,
0.07297143340110779,
-0.06256608664989471,
0.12290218472480774,
0.01900123804807663,
0.09320171177387238,
-0.01743089035153389,
0.0443369522690773,
-0.01712711527943611,
-0.024032598361372948,
-0.006104836706072092,
0.04501049220561981,
0.008562623523175716,
0.24886615574359894,
-0.017825139686465263,
0.012180937454104424,
-0.022851597517728806,
-0.16811321675777435,
-0.058320313692092896,
-0.08560813218355179,
-0.12346646934747696,
0.00521913543343544,
-0.02054961770772934,
0.1094672828912735,
0.1123301237821579,
-0.1650833636522293,
0.11723854392766953,
-0.047786734998226166,
-0.08334077894687653,
-0.09617409110069275,
-0.08764050155878067,
-0.0493435300886631,
-0.07510514557361603,
0.043478820472955704,
-0.1036902368068695,
0.10051343590021133,
0.0591275729238987,
0.02129269391298294,
-0.0033605818171054125,
0.1650187224149704,
-0.10230227559804916,
-0.07002109289169312,
0.016994738951325417,
-0.030257921665906906,
0.02726970613002777,
0.06251829862594604,
0.019203200936317444,
0.08948779106140137,
-0.01967785134911537,
0.037951093167066574,
0.06955014914274216,
0.03317389264702797,
-0.0026082636322826147,
-0.07236972451210022,
-0.06219777464866638,
0.02061094529926777,
0.014833703637123108,
0.04264869540929794,
0.16743594408035278,
0.046189747750759125,
0.0403447300195694,
-0.0030229659751057625,
0.10771357268095016,
0.00556431757286191,
-0.06912726908922195,
-0.17274002730846405,
-0.03012450784444809,
-0.015457850880920887,
-0.0184797253459692,
0.04764552786946297,
-0.08744385093450546,
0.06370077282190323,
0.1849908083677292,
0.025013336911797523,
0.04538717120885849,
0.005386462435126305,
0.03300414979457855,
-0.00758732482790947,
-0.06557875126600266,
0.056183524429798126,
0.0523265041410923,
0.22221684455871582,
-0.07489891350269318,
-0.005492970813065767,
-0.08884882181882858,
-0.06793128699064255,
-0.049685124307870865,
0.10155168175697327,
-0.0699663832783699,
-0.07171428203582764,
-0.00915499683469534,
0.13700082898139954,
-0.017868947237730026,
-0.17898915708065033,
-0.002256576670333743,
-0.028560860082507133,
-0.10202477872371674,
-0.015729470178484917,
-0.028919775038957596,
0.007706968579441309,
-0.0057981270365417,
0.033100489526987076,
0.018997859209775925,
0.1386619210243225,
0.04404190182685852,
0.015241175889968872,
-0.01364590972661972,
0.07173660397529602,
-0.19606633484363556,
0.013813614845275879,
0.017541423439979553,
0.2531645596027374,
0.09726182371377945,
0.050079260021448135,
-0.056042540818452835,
0.04267016425728798,
0.09005782753229141,
-0.05886735022068024,
0.017442874610424042,
0.16839691996574402,
0.018450157716870308,
0.1702992469072342,
0.09401597827672958,
0.01465828437358141,
0.053838081657886505,
0.02545955218374729,
-0.03188084065914154,
-0.11982928216457367,
0.07012699544429779,
-0.09073576331138611,
0.1151093915104866,
0.11513085663318634,
-0.0577930323779583,
-0.00485056871548295,
-0.04454680532217026,
0.025151561945676804,
0.012032325379550457,
0.02697952277958393,
0.003623504191637039,
-0.2903955578804016,
0.03666588291525841,
-0.029625041410326958,
0.040730152279138565,
-0.24044831097126007,
-0.06496378779411316,
0.03460002318024635,
-0.06765314191579819,
0.03209448978304863,
0.08033914864063263,
0.051116202026605606,
-0.020092841237783432,
0.016433415934443474,
-0.06593851745128632,
0.05240499600768089,
0.09980513900518417,
-0.08715532720088959,
-0.03671806678175926
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Marathi
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Marathi using the [OpenSLR SLR64](http://openslr.org/64/) dataset. Note that this data contains only female voices. Please keep this in mind before using the model for your task, although it works very well for male voices too. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi `sentence` and `path` fields:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
# test_dataset = #TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For a sample, see the Colab link in the Training section.
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
resampler = torchaudio.transforms.Resample(48_000, 16_000) # The original data has a 48,000 Hz sampling rate. Change this to match the sampling rate of your input.
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
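The usage and evaluation snippets above leave loading `test_dataset` as a TODO. Purely as an illustration (an assumption about one possible loading path, not the exact code from the Colab notebook), a comparable held-out set could be built from the OpenSLR SLR64 data on the Hugging Face Hub, taking a 10% split to mirror the 90/10 split described in the Training section; the `openslr`/`SLR64` configuration is assumed to expose the `path` and `sentence` columns used above.
```python
from datasets import load_dataset

# Assumption: the Hub "openslr" dataset with config "SLR64" holds the Marathi data,
# ships a single "train" split, and exposes "path" and "sentence" columns.
openslr_mr = load_dataset("openslr", "SLR64", split="train")

# 90/10 split mirroring the Training section below; the seed is arbitrary.
splits = openslr_mr.train_test_split(test_size=0.1, seed=42)
test_dataset = splits["test"]
```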
## Evaluation
The model can be evaluated as follows on 10% of the Marathi data on OpenSLR.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
# test_dataset = #TODO: WRITE YOUR CODE TO LOAD THE TEST DATASET. For a sample, see the Colab link in the Training section.
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-mr")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\–\…]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference and collect predictions for WER computation.
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"),
attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 14.53 %
## Training
90% of the OpenSLR Marathi dataset was used for training.
The colab notebook used for training can be found [here](https://colab.research.google.com/drive/1_BbLyLqDUsXG3RpSULfLRjC6UY3RjwME?usp=sharing).
|
{"language": "mr", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["openslr"], "metrics": ["wer"], "model-index": [{"name": "XLSR Wav2Vec2 Large 53 Marathi by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "OpenSLR mr", "type": "openslr"}, "metrics": [{"type": "wer", "value": 14.53, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-mr
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"mr",
"dataset:openslr",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"mr"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Marathi
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset. Note that this data contains only female voices. Please keep this in mind before using the model for your task, although it works very well for male voices too. When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:
## Evaluation
The model can be evaluated as follows on 10% of the Marathi data on OpenSLR.
Test Result: 14.53 %
## Training
90% of the OpenSLR Marathi dataset was used for training.
The colab notebook used for training can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset. Note that this data contains only female voices. Please keep this in mind before using the model for your task, although it works very well for male voice too. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 14.53 %",
"## Training\n\n90% of the OpenSLR Marathi dataset was used for training.\nThe colab notebook used for training can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset. Note that this data contains only female voices. Please keep this in mind before using the model for your task, although it works very well for male voice too. When using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:",
"## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 14.53 %",
"## Training\n\n90% of the OpenSLR Marathi dataset was used for training.\nThe colab notebook used for training can be found here."
] |
[
78,
103,
41,
29,
28
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mr #dataset-openslr #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Marathi\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Marathi using the OpenSLR SLR64 dataset. Note that this data contains only female voices. Please keep this in mind before using the model for your task, although it works very well for male voice too. When using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows, assuming you have a dataset with Marathi 'sentence' and 'path' fields:## Evaluation\n\nThe model can be evaluated as follows on 10% of the Marathi data on OpenSLR.\n\n\n\nTest Result: 14.53 %## Training\n\n90% of the OpenSLR Marathi dataset was used for training.\nThe colab notebook used for training can be found here."
] |
[
-0.14064158499240875,
0.03931112587451935,
-0.0007290017092600465,
0.0197933167219162,
0.08570568263530731,
-0.06901244819164276,
0.15479505062103271,
0.08988252282142639,
-0.06108381971716881,
0.019858304411172867,
0.016141418367624283,
0.02785462886095047,
0.060221150517463684,
0.14245609939098358,
0.016560954973101616,
-0.17385181784629822,
0.0005505825974978507,
0.0006122778868302703,
-0.04510495439171791,
0.10929132252931595,
0.13644275069236755,
-0.06951094418764114,
-0.040059931576251984,
0.10651721060276031,
-0.16184002161026,
0.00995442271232605,
0.013927045278251171,
-0.058384593576192856,
0.07271581143140793,
0.04501673951745033,
0.08155014365911484,
0.05127610266208649,
0.0653582215309143,
-0.16708268225193024,
0.034522645175457,
0.012964270077645779,
0.021271610632538795,
0.03883718326687813,
0.08193054795265198,
-0.02726052701473236,
0.047886673361063004,
0.0214861948043108,
0.009682521224021912,
0.058891136199235916,
-0.03426331654191017,
-0.23616841435432434,
-0.048637595027685165,
0.007801526226103306,
0.10049641877412796,
0.14821188151836395,
-0.03734273463487625,
0.13445767760276794,
-0.07508188486099243,
0.13330398499965668,
0.09583234786987305,
-0.1357906609773636,
-0.020461855456233025,
0.10102888196706772,
0.03546089679002762,
0.10153257101774216,
-0.12346186488866806,
0.0073846979066729546,
0.03161173686385155,
-0.04060864448547363,
0.09516022354364395,
-0.013353271409869194,
-0.12777858972549438,
-0.01924266666173935,
-0.0991227850317955,
0.004435480106621981,
0.20949959754943848,
-0.02077866531908512,
-0.05109458044171333,
-0.13387610018253326,
-0.05521601065993309,
-0.008365387097001076,
0.006615170277655125,
-0.02057538740336895,
-0.02375420741736889,
0.042976707220077515,
0.03873283788561821,
-0.06335967779159546,
-0.10247650742530823,
-0.09006725996732712,
0.061861418187618256,
0.13457877933979034,
0.007982457056641579,
-0.0035408069379627705,
-0.058345407247543335,
0.051361408084630966,
-0.09051572531461716,
-0.11956238746643066,
-0.02138931304216385,
0.02985561080276966,
-0.10959549993276596,
0.007463360205292702,
-0.05634503439068794,
-0.20300807058811188,
0.0025835803244262934,
-0.031076297163963318,
0.047808632254600525,
0.027475403621792793,
-0.03687335550785065,
0.04847954958677292,
0.029047895222902298,
0.14405152201652527,
-0.022925036028027534,
0.013237180188298225,
0.0757865235209465,
-0.021766113117337227,
0.002026689238846302,
-0.02693316712975502,
-0.028273362666368484,
-0.050114624202251434,
0.007029591593891382,
0.05623263865709305,
-0.07406964153051376,
0.03628556802868843,
-0.007539226673543453,
-0.013389972038567066,
0.04826672002673149,
-0.12914910912513733,
-0.00215916964225471,
0.04238247498869896,
-0.00032837814069353044,
0.08201439678668976,
0.03961033374071121,
0.013381658121943474,
-0.06136768311262131,
-0.08040747046470642,
-0.03883513808250427,
0.05170917883515358,
-0.11807220429182053,
-0.11660107970237732,
-0.013011910021305084,
-0.0025127262342721224,
-0.017858698964118958,
-0.11284411698579788,
-0.1853858381509781,
-0.0558895617723465,
-0.03089527040719986,
-0.011316481977701187,
0.01756022498011589,
-0.08022986352443695,
-0.03093721903860569,
0.003833106253296137,
-0.04570207744836807,
0.05864030122756958,
-0.026056233793497086,
0.06855888664722443,
0.004657916259020567,
0.05119568109512329,
0.06017851084470749,
0.07787822186946869,
-0.03703118860721588,
-0.0185746718198061,
-0.044417474418878555,
0.1449156254529953,
-0.09475450962781906,
-0.02963566966354847,
-0.10588948428630829,
-0.04774139076471329,
0.004022593144327402,
0.028108863160014153,
0.07417857646942139,
0.07561986893415451,
-0.1607113778591156,
-0.018956560641527176,
0.21251270174980164,
-0.15282753109931946,
-0.01240114402025938,
0.1358424723148346,
-0.02374039590358734,
0.1433073729276657,
0.05168761685490608,
0.20043522119522095,
0.1942151039838791,
-0.11327620595693588,
0.058162588626146317,
0.020688390359282494,
0.011056971736252308,
-0.02422661893069744,
0.10239533334970474,
-0.01450587809085846,
-0.012166041880846024,
0.0057498314417898655,
-0.10484342277050018,
0.07012876868247986,
-0.035209570080041885,
-0.06758229434490204,
-0.03521686792373657,
-0.042189884930849075,
0.003037448972463608,
0.007524191867560148,
0.04603559151291847,
-0.013781314715743065,
-0.061118580400943756,
0.036586128175258636,
0.14316393435001373,
-0.145111545920372,
0.03958442062139511,
-0.0842428207397461,
0.0854613333940506,
-0.09244730323553085,
-0.03077930212020874,
-0.17041447758674622,
0.07620885223150253,
0.007693602703511715,
0.001976753817871213,
0.010391701012849808,
0.16356851160526276,
0.005641289055347443,
0.04618844389915466,
-0.043511491268873215,
-0.0140099311247468,
0.026853008195757866,
0.0011378562776371837,
-0.04514442756772041,
-0.12686914205551147,
0.01828032359480858,
-0.07335519045591354,
0.11449125409126282,
-0.20082633197307587,
0.03860081732273102,
0.05236388370394707,
0.01964806765317917,
0.012801785953342915,
-0.03765003755688667,
0.08225126564502716,
0.05948597565293312,
0.010006112977862358,
-0.017092762514948845,
0.037188004702329636,
0.058846551924943924,
-0.03061491623520851,
0.09071435779333115,
-0.10151854157447815,
0.003090887563303113,
0.04479461908340454,
-0.026354955509305,
-0.09310227632522583,
-0.050266917794942856,
-0.01050008274614811,
-0.031215794384479523,
-0.1021478995680809,
-0.03808683902025223,
0.18715651333332062,
-0.021298274397850037,
0.1572948396205902,
-0.07381582260131836,
-0.015267422422766685,
-0.0005082158604636788,
-0.08833255618810654,
0.07639145851135254,
0.06341410428285599,
-0.023694444447755814,
-0.023396054282784462,
0.07556179910898209,
-0.11783009022474289,
-0.11891883611679077,
0.15696820616722107,
-0.004126746207475662,
-0.06199783459305763,
0.015610561706125736,
0.03635890409350395,
-0.04688463732600212,
0.020667560398578644,
-0.15363658964633942,
-0.010336472652852535,
0.057428400963544846,
0.05542499199509621,
0.0270683690905571,
-0.12581388652324677,
0.015147943049669266,
0.039669644087553024,
-0.05197504907846451,
-0.10274302959442139,
0.1019023135304451,
-0.06336168199777603,
0.055759552866220474,
-0.07878617942333221,
0.05423343926668167,
-0.009287583641707897,
-0.04328303784132004,
-0.18772728741168976,
0.16282489895820618,
-0.1160338893532753,
-0.1707916259765625,
-0.1696339100599289,
-0.01966998726129532,
0.05344916507601738,
-0.00726697500795126,
0.08209837973117828,
-0.09139897674322128,
-0.07162374258041382,
-0.0224610585719347,
0.13394641876220703,
0.014110381714999676,
-0.055780649185180664,
-0.012664437294006348,
-0.026158271357417107,
0.026508046314120293,
-0.15942278504371643,
0.035295382142066956,
0.07033400982618332,
-0.03127765655517578,
0.046044714748859406,
-0.014518370851874352,
0.013407209888100624,
0.11473073065280914,
-0.049937453120946884,
-0.00042592297540977597,
-0.012196178548038006,
0.11935141682624817,
-0.0931345522403717,
0.027158506214618683,
0.16492752730846405,
-0.005129145458340645,
-0.0015550532843917608,
0.0841565728187561,
-0.0015012342482805252,
-0.0356748141348362,
-0.004350002855062485,
-0.049752313643693924,
-0.06705530732870102,
-0.2696763575077057,
-0.08482100814580917,
-0.08140445500612259,
-0.037450291216373444,
-0.055624984204769135,
-0.007273655850440264,
0.12360883504152298,
0.06669218838214874,
-0.018100949004292488,
-0.0567636676132679,
0.013887990266084671,
0.0454208180308342,
0.13483569025993347,
-0.019294243305921555,
0.11325933784246445,
-0.05860328674316406,
0.052198901772499084,
0.017733709886670113,
0.01576838828623295,
0.16843567788600922,
-0.022399941459298134,
0.09724202752113342,
0.11839378625154495,
0.09500839561223984,
0.10179927945137024,
0.06417429447174072,
-0.06004174053668976,
-0.03047511912882328,
0.030894333496689796,
-0.06870144605636597,
-0.06732607632875443,
0.05844426900148392,
0.0394732803106308,
-0.012546621263027191,
-0.042669303715229034,
-0.04871286451816559,
0.008147756569087505,
0.10518361628055573,
0.038729965686798096,
-0.17439045011997223,
-0.09809515625238419,
-0.026259295642375946,
-0.06810414046049118,
-0.05663234367966652,
0.019518334418535233,
0.14982087910175323,
-0.10801831632852554,
-0.03391415253281593,
-0.0016846504295244813,
0.09344353526830673,
0.006853795610368252,
-0.033377163112163544,
-0.08593928813934326,
0.04149473085999489,
-0.013586667366325855,
0.08921626210212708,
-0.1697923094034195,
0.22316843271255493,
0.01710653491318226,
0.08677329123020172,
0.010233374312520027,
-0.016437368467450142,
0.02446550317108631,
0.061253901571035385,
0.0868116244673729,
0.011412126943469048,
0.014750619418919086,
-0.07288751751184464,
-0.08624937385320663,
0.014973064884543419,
-0.0040341634303331375,
0.029402893036603928,
0.06779415160417557,
0.001968416618183255,
0.028490029275417328,
0.02078210934996605,
-0.003634792985394597,
-0.09680597484111786,
-0.05434688553214073,
0.011204379610717297,
0.08529181033372879,
0.09620283544063568,
-0.0455433689057827,
-0.10339391231536865,
-0.14761607348918915,
0.1559603214263916,
-0.06061062589287758,
-0.10469536483287811,
-0.1047835424542427,
0.08554121851921082,
0.01391503494232893,
-0.06654135882854462,
0.037714213132858276,
0.01665056310594082,
0.131090447306633,
0.0018150205723941326,
0.017207449302077293,
-0.0044568441808223724,
-0.05652925744652748,
-0.07733751088380814,
-0.019191492348909378,
0.13438497483730316,
0.09396380931138992,
0.040519554167985916,
0.05512232333421707,
0.0240031611174345,
-0.012293180450797081,
-0.06791633367538452,
-0.012817159295082092,
0.1432933211326599,
-0.04659197852015495,
0.03349192067980766,
-0.027395084500312805,
-0.1495385766029358,
-0.04796266928315163,
-0.13513316214084625,
0.18776246905326843,
0.12768858671188354,
-0.0712505578994751,
0.15667743980884552,
0.16998088359832764,
-0.09486740827560425,
-0.20884451270103455,
0.058418747037649155,
0.11785101890563965,
0.1262294501066208,
-0.002474468434229493,
-0.19218547642230988,
0.060322873294353485,
-0.023984558880329132,
-0.017919249832630157,
-0.0967428907752037,
-0.21676278114318848,
-0.1205645352602005,
0.10246129333972931,
0.08587958663702011,
0.10185721516609192,
-0.07239719480276108,
-0.04599307104945183,
0.006674277596175671,
-0.10513486713171005,
-0.045502688735723495,
-0.09799763560295105,
0.10424050688743591,
-0.005136734340339899,
0.0839671716094017,
0.045634202659130096,
-0.05306541174650192,
0.10509475320577621,
0.030441056936979294,
0.010371743701398373,
-0.03759878873825073,
0.13164915144443512,
0.09556356072425842,
0.028207039460539818,
0.15671758353710175,
-0.0060348608531057835,
0.05622675642371178,
-0.11208439618349075,
-0.0918770506978035,
-0.04436364024877548,
0.03213292732834816,
0.014852694235742092,
-0.07312982529401779,
-0.03758891299366951,
0.005930112209171057,
0.06236269697546959,
-0.033471155911684036,
-0.07223492115736008,
-0.08738446235656738,
0.051047515124082565,
0.10581759363412857,
0.16928379237651825,
0.09521710127592087,
-0.07055943459272385,
-0.04061085730791092,
-0.020285144448280334,
0.10490396618843079,
-0.17590081691741943,
0.013292239047586918,
0.04903585463762283,
0.06569866091012955,
0.11717217415571213,
-0.0076831914484500885,
-0.04663359746336937,
0.07339683175086975,
0.0380888395011425,
-0.0077758352272212505,
-0.11682998389005661,
-0.01538640446960926,
0.07301308214664459,
-0.010097253136336803,
-0.01188619900494814,
0.1268772929906845,
-0.10974220931529999,
0.000777368142735213,
-0.018089499324560165,
0.004525701981037855,
-0.07386302202939987,
0.1825089454650879,
0.051315419375896454,
0.08290731906890869,
-0.06350875645875931,
-0.023208945989608765,
0.00048042749403975904,
-0.048921458423137665,
0.022744398564100266,
0.019580194726586342,
-0.1261497288942337,
-0.057471368461847305,
0.024968700483441353,
0.06048950180411339,
0.005854046903550625,
-0.0693918764591217,
-0.028628205880522728,
-0.08861729502677917,
-0.011599984019994736,
0.15217165648937225,
0.03800290822982788,
0.020011303946375847,
-0.1082267090678215,
-0.016571497544646263,
-0.10531098395586014,
0.07119257003068924,
0.018884697929024696,
-0.031493086367845535,
-0.06301118433475494,
0.19150125980377197,
0.0376426987349987,
0.02154405787587166,
-0.044329479336738586,
-0.08448633551597595,
-0.02751033566892147,
0.04784305393695831,
-0.10129769891500473,
0.013813862577080727,
-0.03085305728018284,
-0.02853512205183506,
-0.04630309343338013,
-0.04700334370136261,
0.018564313650131226,
0.09066951274871826,
-0.04198436066508293,
0.031663928180933,
-0.03065529093146324,
0.09011325985193253,
-0.05816709250211716,
-0.017979765310883522,
0.04729865491390228,
-0.05137791112065315,
0.06626570969820023,
0.07304469496011734,
-0.07945545762777328,
0.05790386348962784,
-0.19262705743312836,
0.028143638744950294,
0.02492007613182068,
0.052856698632240295,
-0.019343463703989983,
-0.22305075824260712,
0.04231269285082817,
0.046812910586595535,
0.02613871917128563,
0.017768830060958862,
0.12347310781478882,
-0.036314528435468674,
0.03639848902821541,
-0.12699437141418457,
0.021708836778998375,
-0.09831864386796951,
0.08169663697481155,
0.016421347856521606,
0.127243772149086,
0.11045610904693604,
-0.06706184148788452,
0.08427875488996506,
-0.11725228279829025,
0.007196689024567604,
-0.006336198654025793,
-0.006941587198525667,
-0.08041093498468399,
-0.06324592232704163,
0.055399760603904724,
-0.07687216997146606,
0.12160318344831467,
0.022730806842446327,
-0.06257346272468567,
0.007405139971524477,
-0.09485108405351639,
0.07873740792274475,
-0.01314109843224287,
0.2129739224910736,
0.040377888828516006,
0.05830542743206024,
0.018673233687877655,
0.03285285085439682,
-0.004369363188743591,
0.08101548999547958,
0.01900077611207962,
0.1543205976486206,
-0.0588872954249382,
0.04836086928844452,
-0.010809195227921009,
-0.10923904180526733,
-0.1013224646449089,
-0.04970497265458107,
-0.0805826261639595,
0.006942177191376686,
-0.0829504057765007,
0.15031930804252625,
0.1983523815870285,
-0.10463007539510727,
0.11238030344247818,
0.019217561930418015,
-0.110256128013134,
-0.09136620163917542,
-0.09765901416540146,
-0.052835073322057724,
-0.11737670004367828,
0.028801588341593742,
-0.09943655878305435,
0.03235660120844841,
-0.0165605116635561,
0.044160764664411545,
-0.01670791581273079,
0.19756199419498444,
-0.05289708822965622,
-0.07939761132001877,
0.017891570925712585,
-0.05929291993379593,
0.0350581631064415,
-0.10546433925628662,
0.034782230854034424,
0.10737019777297974,
0.012726517394185066,
0.06012284383177757,
0.03880895674228668,
0.02327868342399597,
-0.00901163648813963,
-0.10160697251558304,
-0.041279859840869904,
-0.007302223239094019,
0.015517983585596085,
0.06386684626340866,
0.21400992572307587,
0.0823458805680275,
-0.05457261577248573,
-0.013604368083178997,
0.1530306190252304,
0.0038666578475385904,
-0.11172221601009369,
-0.15355578064918518,
0.07592461258172989,
0.05164676159620285,
-0.01636141911149025,
0.03605077415704727,
-0.022679805755615234,
0.03304876387119293,
0.24581271409988403,
0.12899595499038696,
0.04307032749056816,
0.03887658566236496,
-0.003570968983694911,
-0.013984267599880695,
-0.04407254979014397,
0.11823678016662598,
0.07420664280653,
0.16589030623435974,
-0.047189950942993164,
-0.0004885696689598262,
-0.1299104243516922,
-0.0522528775036335,
-0.025251569226384163,
0.0043947733938694,
-0.06264571845531464,
-0.07703694701194763,
-0.034229062497615814,
0.12424442172050476,
-0.0026494686026126146,
-0.11533303558826447,
-0.08316162973642349,
-0.027204463258385658,
-0.10045880079269409,
0.011331881396472454,
-0.0656537264585495,
0.04367977753281593,
-0.03258093073964119,
-0.025280460715293884,
0.04547177255153656,
0.1944216936826706,
0.011969446204602718,
0.026130152866244316,
0.009212425909936428,
0.08987432718276978,
-0.1497543901205063,
-0.011005844920873642,
-0.01992250233888626,
0.19305948913097382,
0.05889563634991646,
0.08248448371887207,
-0.016692934557795525,
0.14265654981136322,
0.023638851940631866,
-0.07284754514694214,
0.02276904508471489,
0.17221581935882568,
-0.029632044956088066,
0.07226534187793732,
0.013913373462855816,
-0.038747403770685196,
0.04612075537443161,
-0.09460298717021942,
-0.01892692968249321,
-0.07257344573736191,
0.0803014412522316,
-0.040466926991939545,
0.11104337126016617,
0.10489650815725327,
-0.06969912350177765,
-0.07518139481544495,
-0.04738785699009895,
0.015909112989902496,
-0.002049430273473263,
-0.035776495933532715,
-0.02635975368320942,
-0.25982949137687683,
0.004182257689535618,
-0.051962319761514664,
0.02427426539361477,
-0.2506650984287262,
-0.08347712457180023,
-0.015585272572934628,
-0.0728861391544342,
0.01108365599066019,
0.08044490963220596,
0.05082840844988823,
0.02985924854874611,
-0.006014698650687933,
-0.17949476838111877,
0.024651918560266495,
0.09891539067029953,
-0.11923348158597946,
-0.11412704735994339
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Odia
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Odia using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "or", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-or")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-or")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Odia test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "or", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-or")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-or")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�\–\…\'\_\’\।\|]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference and collect predictions for WER computation.
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 52.64 %
## Training
The Common Voice `train` and `validation` datasets were used for training. The colab notebook used can be found [here](https://colab.research.google.com/drive/1s8DrwgB5y4Z7xXIrPXo1rQA5_1OZ8WD5?usp=sharing).
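As a hedged sketch of that setup (an assumption about the notebook's data preparation, not a copy of it), the combined training set can be assembled by concatenating the two Common Voice Odia splits:
```python
from datasets import load_dataset, concatenate_datasets

# Load the two Common Voice Odia splits named above.
cv_train = load_dataset("common_voice", "or", split="train")
cv_valid = load_dataset("common_voice", "or", split="validation")

# Merge them into a single training dataset.
train_dataset = concatenate_datasets([cv_train, cv_valid])
print(len(train_dataset), "training examples")
```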
|
{"language": "or", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "XLSR Wav2Vec2 Large 53 Odia by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice or", "type": "common_voice", "args": "or"}, "metrics": [{"type": "wer", "value": 52.64, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-or
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"or",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"or"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #or #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Odia
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Odia test data of Common Voice.
Test Result: 52.64 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training. The colab notebook used can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Odia\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 52.64 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training.The colab notebook used can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #or #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Odia\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 52.64 %",
"## Training\nThe Common Voice 'train' and 'validation' datasets were used for training.The colab notebook used can be found here."
] |
[
80,
63,
20,
29,
33
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #or #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Odia\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Odia using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\nThe model can be used directly (without a language model) as follows:## Evaluation\nThe model can be evaluated as follows on the Odia test data of Common Voice.\n\nTest Result: 52.64 %## Training\nThe Common Voice 'train' and 'validation' datasets were used for training.The colab notebook used can be found here."
] |
[
-0.13317465782165527,
0.03680947795510292,
-0.0017117433017119765,
-0.0014778324402868748,
0.06938573718070984,
-0.04021764174103737,
0.2317367047071457,
0.09008678048849106,
-0.03441401198506355,
-0.026782754808664322,
-0.015197386965155602,
0.04216489940881729,
0.04584186151623726,
0.10698867589235306,
0.011208031326532364,
-0.19052016735076904,
-0.011638559401035309,
0.04520566016435623,
0.021774791181087494,
0.09972590953111649,
0.07802987843751907,
-0.06626535207033157,
-0.03480271250009537,
0.1172415167093277,
-0.1821422576904297,
0.025879815220832825,
0.028325935825705528,
-0.11747990548610687,
0.126645028591156,
0.09770825505256653,
0.10942095518112183,
0.04620776325464249,
0.08202862739562988,
-0.1720295548439026,
0.03802816569805145,
0.018221424892544746,
0.013544336892664433,
0.04315099120140076,
0.0367312990128994,
-0.016664426773786545,
0.02967805042862892,
0.1118374913930893,
-0.012206701561808586,
0.07081817835569382,
-0.07065337896347046,
-0.25879958271980286,
-0.003945064730942249,
0.008617239072918892,
0.08798477053642273,
0.12320178747177124,
-0.05188406631350517,
0.13520735502243042,
-0.10180234909057617,
0.092795230448246,
0.14676962792873383,
-0.20900945365428925,
-0.008652455173432827,
0.1204029992222786,
0.06891658902168274,
0.054208461195230484,
-0.05805268511176109,
0.0186303798109293,
0.024690214544534683,
0.006861121393740177,
0.060917336493730545,
-0.044248200953006744,
-0.22244271636009216,
-0.027653897181153297,
-0.11756114661693573,
-0.016584746539592743,
0.1926255226135254,
-0.026094797998666763,
-0.06094285845756531,
-0.09784576296806335,
-0.0164503026753664,
-0.009401185438036919,
-0.016290966421365738,
-0.041251782327890396,
0.0012710769660770893,
0.03258100152015686,
-0.0076123736798763275,
-0.053148798644542694,
-0.13087685406208038,
-0.12452632188796997,
-0.03273779898881912,
0.08855026960372925,
0.002622969215735793,
0.01117086224257946,
-0.09820972383022308,
0.07888881862163544,
-0.07231149077415466,
-0.07032861560583115,
0.006257484667003155,
0.04234857112169266,
-0.07763363420963287,
0.01257143821567297,
-0.09493761509656906,
-0.12899909913539886,
0.021294042468070984,
-0.004962693899869919,
0.11669288575649261,
0.004270108416676521,
-0.027347400784492493,
0.06797608733177185,
0.04287080839276314,
0.11589161306619644,
0.00548181589692831,
-0.03310646861791611,
0.07600179314613342,
0.016120795160531998,
-0.03866034373641014,
-0.03416884317994118,
-0.096128910779953,
-0.04969312995672226,
-0.0016075659077614546,
0.03042050078511238,
-0.021643679589033127,
0.013518830761313438,
-0.05598866939544678,
-0.045979879796504974,
0.02238963358104229,
-0.10478372126817703,
-0.012949294410645962,
0.08722906559705734,
0.009321050718426704,
0.06142912060022354,
0.09758246690034866,
0.04772959649562836,
-0.06747337430715561,
-0.08230416476726532,
0.00227342383004725,
0.0583721362054348,
-0.06783197820186615,
-0.09313498437404633,
-0.002645219210535288,
0.023362064734101295,
-0.003409819444641471,
-0.11353237926959991,
-0.10657739639282227,
-0.07455503195524216,
0.007244336884468794,
0.03856996074318886,
-0.026084991171956062,
-0.08461993932723999,
0.006208624225109816,
-0.0359223298728466,
-0.0689622312784195,
0.012442982755601406,
-0.04489296302199364,
0.07520980387926102,
0.0489831417798996,
0.05714971572160721,
-0.018200913444161415,
0.07697416096925735,
-0.0504969097673893,
-0.0600811205804348,
0.007959243841469288,
0.10711735486984253,
-0.04596531391143799,
-0.01737520843744278,
-0.06468646228313446,
-0.05819494277238846,
-0.02316362038254738,
0.06410505622625351,
0.04713224247097969,
0.05169787257909775,
-0.2212669551372528,
-0.09887619316577911,
0.176650732755661,
-0.11751651018857956,
-0.02294648252427578,
0.18960827589035034,
-0.023012222722172737,
0.1368357092142105,
0.11006630212068558,
0.23330572247505188,
0.1688307672739029,
-0.19341124594211578,
0.04781351983547211,
-0.02215784229338169,
0.0278595220297575,
-0.08966801315546036,
0.05490460619330406,
0.019367631524801254,
-0.013057214207947254,
0.03714852035045624,
-0.01896604150533676,
0.08113949000835419,
-0.012017766013741493,
-0.07049178332090378,
-0.009774581529200077,
-0.06618733704090118,
0.012308946810662746,
0.042484261095523834,
0.03500118851661682,
-0.002124947030097246,
-0.04497889056801796,
0.11134164780378342,
0.10270272195339203,
-0.15351729094982147,
0.03186120092868805,
-0.10383693873882294,
0.07439793646335602,
-0.09017004072666168,
-0.02452111802995205,
-0.1593976616859436,
0.1242741271853447,
-0.006588388234376907,
0.05084314942359924,
0.08094877749681473,
0.2268100380897522,
-0.008981738239526749,
0.03049902804195881,
-0.014583169482648373,
0.02340012975037098,
-0.002003925386816263,
-0.01718512363731861,
-0.04424821957945824,
-0.06425679475069046,
-0.007166248746216297,
-0.06287841498851776,
0.1023496612906456,
-0.1624245047569275,
0.02766006998717785,
0.013537297956645489,
-0.013061495497822762,
0.015577796846628189,
-0.015787336975336075,
0.06994779407978058,
0.09908297657966614,
0.002971296664327383,
0.025417936965823174,
0.0653216615319252,
0.056526489555835724,
-0.03127927705645561,
0.07745274901390076,
-0.18716461956501007,
0.017805563285946846,
0.11198052763938904,
-0.02779054082930088,
-0.021538712084293365,
-0.0623970590531826,
-0.02676713466644287,
-0.01096794381737709,
-0.08339281380176544,
-0.0045852563343942165,
0.3271259665489197,
-0.034738484770059586,
0.1154346615076065,
-0.08638034015893936,
-0.011268101632595062,
-0.003725066315382719,
-0.08295175433158875,
0.0803561806678772,
0.047044072300195694,
-0.012660105712711811,
0.042115092277526855,
0.03494507446885109,
-0.04265892133116722,
-0.12271172553300858,
0.20032407343387604,
-0.010384839028120041,
-0.07448265701532364,
-0.002126385923475027,
-0.06242095306515694,
0.011194130405783653,
0.07659101486206055,
-0.12222253531217575,
-0.027213487774133682,
0.04701736569404602,
0.032532986253499985,
0.04534537345170975,
-0.16744917631149292,
0.03749211132526398,
0.035860635340213776,
-0.1072433665394783,
-0.12878359854221344,
0.04046932980418205,
-0.041426870971918106,
0.019437076523900032,
-0.09465083479881287,
-0.008028242737054825,
-0.01132122240960598,
-0.036366384476423264,
-0.13668440282344818,
0.14214234054088593,
-0.09264068305492401,
-0.22593916952610016,
-0.15610402822494507,
-0.009548518806695938,
0.05362139642238617,
0.03989178314805031,
0.10653406381607056,
-0.047995470464229584,
-0.016834931448101997,
-0.036120422184467316,
0.13726161420345306,
0.08112668991088867,
-0.044001419097185135,
-0.019000960513949394,
0.02339581958949566,
0.05650269612669945,
-0.16526557505130768,
0.01558526512235403,
-0.01884193904697895,
-0.0595351979136467,
0.019546661525964737,
-0.021990390494465828,
-0.01803985796868801,
0.1589658558368683,
0.030965127050876617,
-0.0028638686053454876,
-0.019975587725639343,
0.17894688248634338,
-0.08517569303512573,
-0.034411754459142685,
0.17730174958705902,
-0.034467898309230804,
-0.021729150786995888,
0.08938965946435928,
0.03601786121726036,
-0.07345101237297058,
-0.003861468518152833,
-0.02350233681499958,
-0.08356910198926926,
-0.2227584272623062,
-0.10346491634845734,
-0.06594932824373245,
-0.02520231157541275,
-0.027767974883317947,
-0.0045103877782821655,
0.06326594203710556,
0.043011460453271866,
0.020366089418530464,
-0.011714617721736431,
0.017320457845926285,
-0.0017234266269952059,
0.18466459214687347,
-0.005476790480315685,
0.09426434338092804,
-0.04806949943304062,
-0.019870884716510773,
0.02038697525858879,
0.09222080558538437,
0.15571993589401245,
0.024051541462540627,
0.11451958119869232,
0.08135402947664261,
0.09018490463495255,
0.13433495163917542,
0.0805225595831871,
-0.042037226259708405,
-0.022870466113090515,
0.011884021572768688,
-0.05683785304427147,
-0.05941588059067726,
0.02821744792163372,
0.10269477218389511,
-0.04213032126426697,
-0.04438285529613495,
-0.10903692245483398,
0.0029025673866271973,
0.16554081439971924,
0.05770436301827431,
-0.23039771616458893,
-0.09309861063957214,
-0.025788523256778717,
-0.057749271392822266,
-0.006973680574446917,
0.05158166214823723,
0.14424879848957062,
-0.10958711057901382,
-0.028870539739727974,
0.009411709383130074,
0.09354819357395172,
-0.07293695956468582,
0.006827457342296839,
-0.08711332827806473,
0.02790331095457077,
0.003461550222709775,
0.07291195541620255,
-0.17374053597450256,
0.18487097322940826,
0.013930846005678177,
0.09737994521856308,
-0.0196677315980196,
-0.00617598183453083,
0.022559821605682373,
0.06240823492407799,
0.06702149659395218,
-0.0010205494472756982,
0.022166309878230095,
-0.10983318090438843,
-0.1238740012049675,
0.058709148317575455,
-0.05547265708446503,
0.013155653141438961,
0.04206286370754242,
-0.0017398128984495997,
0.005068919155746698,
0.002094259485602379,
-0.08022474497556686,
-0.1235598474740982,
-0.07275624573230743,
0.011020321398973465,
0.16588179767131805,
0.1103992760181427,
-0.02719194069504738,
-0.10930487513542175,
-0.08244886249303818,
0.13085393607616425,
-0.03385177254676819,
-0.035184457898139954,
-0.07245101034641266,
0.0022403066977858543,
0.07203183323144913,
-0.04898801073431969,
0.01380134467035532,
0.08866680413484573,
0.11495862901210785,
-0.024123061448335648,
-0.06571368128061295,
0.04932665824890137,
-0.10337680578231812,
-0.06543411314487457,
-0.039842765778303146,
0.13124410808086395,
0.10229246318340302,
0.0854920893907547,
0.07579292356967926,
0.0040578292682766914,
-0.03706891089677811,
-0.030247846618294716,
-0.01932845264673233,
0.13972103595733643,
-0.037464022636413574,
-0.021663304418325424,
-0.030661435797810555,
-0.18242990970611572,
-0.06409507244825363,
-0.039100442081689835,
0.17432108521461487,
0.10339976847171783,
-0.05307785049080849,
0.17007172107696533,
0.21146659553050995,
-0.09620631486177444,
-0.21245872974395752,
-0.010861855000257492,
0.12823301553726196,
0.13592912256717682,
0.028224673122167587,
-0.18767546117305756,
0.08991724997758865,
0.019995467737317085,
-0.03687131777405739,
0.006996199022978544,
-0.2643094062805176,
-0.1355038583278656,
0.16986024379730225,
0.03558929264545441,
0.14357322454452515,
-0.04688478633761406,
-0.01823795959353447,
-0.010934853926301003,
-0.10348573327064514,
0.08922459185123444,
-0.11595381051301956,
0.1394735425710678,
0.010413339361548424,
0.06675302982330322,
0.04276119917631149,
-0.02760579064488411,
0.07440592348575592,
0.05438763275742531,
-0.014084133319556713,
-0.020784158259630203,
0.07033350318670273,
0.04231691360473633,
0.0015117444563657045,
0.08181946724653244,
-0.051798440515995026,
0.03303757682442665,
-0.08798053860664368,
-0.10535692423582077,
-0.04984831064939499,
0.06761738657951355,
0.015116124413907528,
-0.04749798774719238,
-0.001994685735553503,
-0.024054154753684998,
0.0013225700240582228,
-0.013226095587015152,
-0.046195611357688904,
-0.12849818170070648,
0.07377775013446808,
0.1586623638868332,
0.16318458318710327,
-0.009661509655416012,
-0.1134212538599968,
-0.005448722746223211,
-0.03000504896044731,
0.11746276915073395,
-0.09727008640766144,
0.025025147944688797,
0.07792104780673981,
0.019520271569490433,
0.11957906186580658,
0.02717117965221405,
-0.08471480756998062,
0.08645283430814743,
0.06003039702773094,
-0.06980600953102112,
-0.048179611563682556,
-0.02209394983947277,
0.01847410574555397,
-0.018448207527399063,
0.017736639827489853,
0.1347840130329132,
-0.07754883915185928,
-0.04210525378584862,
0.012999016791582108,
-0.016319895163178444,
-0.10777805000543594,
0.183675616979599,
0.010418021120131016,
0.0676070973277092,
-0.08654937148094177,
-0.014495446346700191,
0.02298114448785782,
-0.057158276438713074,
0.010645885951817036,
-0.018986204639077187,
-0.07957814633846283,
-0.07652624696493149,
0.03883295878767967,
0.08813323825597763,
0.0233139730989933,
-0.09323009848594666,
-0.07796968519687653,
-0.08526004105806351,
-0.026784244924783707,
0.1142420545220375,
0.07097204774618149,
0.014854278415441513,
-0.10962788760662079,
-0.04054342955350876,
-0.07405766099691391,
0.06993736326694489,
0.03376878425478935,
-0.0034383912570774555,
-0.060954540967941284,
0.1803126037120819,
0.04218808934092522,
0.02732808142900467,
-0.0533260814845562,
-0.08255865424871445,
-0.0620308592915535,
0.05710161477327347,
-0.0579926036298275,
0.004449624568223953,
-0.044524360448122025,
0.019420968368649483,
0.0030111372470855713,
-0.05728514865040779,
0.011629446409642696,
0.08111440390348434,
-0.1010952889919281,
0.06708979606628418,
-0.019707519561052322,
0.06702591478824615,
-0.05029267817735672,
-0.002969166496768594,
0.047165755182504654,
-0.04551827162504196,
0.08492078632116318,
0.10710399597883224,
-0.1001516804099083,
0.0842248946428299,
-0.21200639009475708,
-0.02727236971259117,
0.07071342319250107,
0.08184120059013367,
0.000494000967592001,
-0.1321638822555542,
0.0440242700278759,
0.07046538591384888,
0.07562513649463654,
-0.011711488477885723,
0.1306118369102478,
-0.0702977180480957,
-0.0442705936729908,
-0.07563170045614243,
-0.009764638729393482,
-0.04752200469374657,
-0.008738678880035877,
0.041003867983818054,
0.15313228964805603,
0.12565991282463074,
-0.09027724713087082,
0.08935219049453735,
-0.1110333651304245,
0.005432024132460356,
-0.02859828621149063,
-0.05397907271981239,
-0.10141563415527344,
-0.1028202474117279,
0.051209498196840286,
-0.009131550788879395,
0.11987373977899551,
0.016555067151784897,
0.03320162370800972,
-0.04995765537023544,
-0.1401321291923523,
0.06220172718167305,
-0.02470593899488449,
0.23762346804141998,
0.04417342692613602,
0.029275404289364815,
0.005498351063579321,
0.03143298253417015,
0.025357145816087723,
0.08204136788845062,
0.012404385954141617,
0.07171199470758438,
-0.01826276071369648,
0.07496299594640732,
0.0784110352396965,
-0.03372740000486374,
-0.06929613649845123,
-0.04741021618247032,
-0.10870223492383957,
0.04877690598368645,
-0.09160786122083664,
0.1458636373281479,
0.11784759163856506,
-0.06021663174033165,
0.09791717678308487,
0.021736780181527138,
-0.08123582601547241,
-0.14271000027656555,
-0.10393635183572769,
-0.05317122861742973,
-0.16467656195163727,
0.007444226182997227,
-0.10138211399316788,
0.009573745541274548,
0.032934244722127914,
0.034990713000297546,
-0.00238714343868196,
0.17111775279045105,
0.00018115746206603944,
-0.061565469950437546,
0.02812856249511242,
-0.08238869160413742,
-0.014453917741775513,
-0.14727364480495453,
0.04047466069459915,
0.1284853219985962,
0.044205836951732635,
0.05756078660488129,
0.01886874996125698,
-0.031156392768025398,
0.026805637404322624,
-0.07377307116985321,
-0.056757859885692596,
-0.039174940437078476,
-0.01293351873755455,
0.07850639522075653,
0.11903849244117737,
0.11583148688077927,
-0.06466051936149597,
0.017555825412273407,
0.11858516186475754,
-0.03745017945766449,
-0.16012048721313477,
-0.1452489048242569,
0.10014880448579788,
0.008905830793082714,
-0.01637471280992031,
-0.009082862176001072,
-0.022587472572922707,
-0.00770148728042841,
0.2703559696674347,
0.17858323454856873,
0.05688967555761337,
0.040401700884103775,
0.01839330606162548,
-0.011951669119298458,
-0.03996589779853821,
0.0782713070511818,
0.09496812522411346,
0.25223207473754883,
-0.018583834171295166,
-0.07290609180927277,
-0.05895063281059265,
-0.06531523913145065,
-0.019920840859413147,
-0.02569642849266529,
-0.08437356352806091,
-0.07107146084308624,
0.011560587212443352,
0.10935619473457336,
-0.03094971366226673,
-0.11401227116584778,
-0.06724964827299118,
-0.05695819482207298,
-0.08782890439033508,
-0.013004028238356113,
0.04363883286714554,
0.06516334414482117,
0.008569460362195969,
-0.07835206389427185,
0.026494339108467102,
0.1776750534772873,
-0.03907941281795502,
-0.06048951297998428,
-0.018277490511536598,
0.051183804869651794,
-0.13076834380626678,
0.034632302820682526,
-0.0199675764888525,
0.16382265090942383,
0.017026051878929138,
0.08851559460163116,
-0.016059458255767822,
0.1513298898935318,
-0.022623851895332336,
-0.11824192106723785,
0.027767838910222054,
0.11667497456073761,
-0.035960759967565536,
0.06754328310489655,
0.006537273060530424,
-0.09570597857236862,
0.07674366980791092,
-0.1552046239376068,
0.00645696185529232,
-0.05753406137228012,
0.03361370414495468,
-0.04776248708367348,
0.08831249922513962,
0.0789676234126091,
-0.07089369744062424,
-0.07487257570028305,
-0.02894672192633152,
0.05152256041765213,
0.043682340532541275,
-0.0652250126004219,
-0.0549142062664032,
-0.21211721003055573,
-0.01575864851474762,
-0.073066346347332,
-0.022634048014879227,
-0.20223723351955414,
-0.054310571402311325,
-0.006537031382322311,
-0.09615359455347061,
-0.02251358889043331,
0.04100886359810829,
0.09222835302352905,
0.018481310456991196,
-0.01885497011244297,
-0.0775030329823494,
0.02983230911195278,
0.10025162994861603,
-0.1789017617702484,
-0.15327532589435577
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Portuguese
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Portuguese using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "pt", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-pt")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-pt")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Portuguese test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "pt", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-pt")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-pt")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\'\�]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 17.22 %
## Training
The Common Voice `train` and `validation` datasets were used for training. The script used for training can be found [here](https://github.com/jqueguiner/wav2vec2-sprint/blob/main/run_common_voice.py).
The parameters passed were:
```bash
#!/usr/bin/env bash
python run_common_voice.py \
--model_name_or_path="facebook/wav2vec2-large-xlsr-53" \
--dataset_config_name="pt" \
--output_dir=/workspace/output_models/pt/wav2vec2-large-xlsr-pt \
--cache_dir=/workspace/data \
--overwrite_output_dir \
--num_train_epochs="30" \
--per_device_train_batch_size="32" \
--per_device_eval_batch_size="32" \
--evaluation_strategy="steps" \
--learning_rate="3e-4" \
--warmup_steps="500" \
--fp16 \
--freeze_feature_extractor \
--save_steps="500" \
--eval_steps="500" \
--save_total_limit="1" \
--logging_steps="500" \
--group_by_length \
--feat_proj_dropout="0.0" \
--layerdrop="0.1" \
--gradient_checkpointing \
--do_train --do_eval \
```
Notebook containing the evaluation can be found [here](https://colab.research.google.com/drive/14e-zNK_5pm8EMY9EbeZerpHx7WsGycqG?usp=sharing).
|
{"language": "pt", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Wav2Vec2 Large 53 Portugese by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice pt", "type": "common_voice", "args": "pt"}, "metrics": [{"type": "wer", "value": 17.22, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-pt
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"pt",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"pt"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #pt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Portuguese
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Portuguese test data of Common Voice.
Test Result: 17.22 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here.
The parameters passed were:
Notebook containing the evaluation can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 17.22 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here.\n The parameters passed were:\n\n\n\nNotebook containing the evaluation can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #pt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 17.22 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here.\n The parameters passed were:\n\n\n\nNotebook containing the evaluation can be found here."
] |
[
80,
67,
20,
29,
49
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #pt #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 17.22 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The script used for training can be found here.\n The parameters passed were:\n\n\n\nNotebook containing the evaluation can be found here."
] |
[
-0.14910899102687836,
0.023432724177837372,
-0.0020139378029853106,
0.00607210211455822,
0.13362321257591248,
-0.055871929973363876,
0.17262285947799683,
0.11623161286115646,
-0.06297843158245087,
-0.02665073610842228,
0.01745113730430603,
-0.004029807634651661,
0.06604428589344025,
0.09497375786304474,
0.0690816342830658,
-0.20868481695652008,
-0.00396781275048852,
-0.01670280657708645,
-0.030788108706474304,
0.11407015472650528,
0.09007594734430313,
-0.06552644819021225,
0.0018078156281262636,
0.08645837008953094,
-0.12783977389335632,
0.06505248695611954,
0.018827561289072037,
-0.13781319558620453,
0.13857346773147583,
0.08477877825498581,
0.09923702478408813,
0.042573511600494385,
0.0901484563946724,
-0.19870325922966003,
0.028552992269396782,
0.06564274430274963,
0.019789565354585648,
0.02522445283830166,
0.0707164853811264,
-0.036268122494220734,
0.12785367667675018,
0.11012188345193863,
-0.006829208694398403,
0.10038141161203384,
-0.09368870407342911,
-0.24502193927764893,
-0.010805578902363777,
-0.03204717859625816,
0.07458540052175522,
0.16712568700313568,
-0.047204598784446716,
0.08057387173175812,
-0.12089332193136215,
0.06559120863676071,
0.1288018673658371,
-0.12690338492393494,
-0.005425977520644665,
0.04502130299806595,
0.06606914103031158,
0.06264400482177734,
-0.04443848505616188,
0.00829764548689127,
0.026036985218524933,
0.0019313161028549075,
0.047078266739845276,
-0.02236299403011799,
-0.20466701686382294,
-0.02511587180197239,
-0.11986132711172104,
-0.06443145126104355,
0.2159949094057083,
-0.027297213673591614,
-0.07747070491313934,
-0.12519152462482452,
-0.0009608985274098814,
0.023902785032987595,
0.01915736496448517,
-0.03252308815717697,
-0.012606806121766567,
0.04866409674286842,
-0.03166189417243004,
-0.042780302464962006,
-0.11145179718732834,
-0.1473560482263565,
0.05477758124470711,
0.046204179525375366,
0.010067510418593884,
0.011997230350971222,
-0.14622920751571655,
0.11010084301233292,
-0.03365607559680939,
-0.07273714989423752,
-0.0010230483021587133,
0.016207009553909302,
-0.028653819113969803,
0.021606456488370895,
-0.04337509348988533,
-0.1411842405796051,
0.013428471982479095,
-0.016517752781510353,
0.0971846953034401,
-0.007598078344017267,
-0.036435555666685104,
0.07197196781635284,
0.016100026667118073,
0.10720254480838776,
-0.06453656405210495,
0.0006045917980372906,
0.03121846169233322,
0.0053123654797673225,
-0.058267075568437576,
-0.02688778005540371,
-0.0848584771156311,
-0.02927403338253498,
-0.018720019608736038,
0.09572991728782654,
-0.0023598959669470787,
-0.010041954927146435,
-0.048224423080682755,
-0.054616667330265045,
0.05312712490558624,
-0.098729707300663,
-0.03906432166695595,
0.08752284944057465,
-0.043097805231809616,
0.08818377554416656,
0.11309917271137238,
0.035525381565093994,
-0.10954349488019943,
-0.047603365033864975,
0.019136033952236176,
0.058532778173685074,
-0.06405213475227356,
-0.11910944432020187,
0.019810227677226067,
0.012378801591694355,
-0.024683883413672447,
-0.09978927671909332,
-0.11598774045705795,
-0.08447807282209396,
-0.01566222496330738,
0.03628729656338692,
0.037883978337049484,
-0.09966570138931274,
0.0005597982672043145,
-0.04355010390281677,
-0.03515671193599701,
0.06779511272907257,
-0.038429923355579376,
0.07123539596796036,
0.025486063212156296,
0.04838959872722626,
0.05828623101115227,
0.07851989567279816,
-0.0793875902891159,
-0.0760735422372818,
-0.012982827611267567,
0.13354824483394623,
-0.01637330837547779,
-0.030188245698809624,
-0.1034754291176796,
-0.07941338419914246,
-0.07895322889089584,
0.06295241415500641,
0.07338899374008179,
0.14057692885398865,
-0.26298052072525024,
-0.1108049750328064,
0.26045748591423035,
-0.12781015038490295,
-0.04245532304048538,
0.15293265879154205,
-0.01667877472937107,
0.16940218210220337,
0.11371643841266632,
0.16706512868404388,
0.14448367059230804,
-0.20754945278167725,
0.07945689558982849,
-0.014017238281667233,
-0.018215201795101166,
-0.08011288940906525,
0.1142648383975029,
-0.08719878643751144,
0.023257233202457428,
0.03158233314752579,
-0.10762131959199905,
0.11140070855617523,
-0.024127839133143425,
-0.058102142065763474,
0.00002444788333377801,
-0.04663269966840744,
0.02971864491701126,
0.04531730338931084,
0.018683521077036858,
-0.0058907149359583855,
-0.07797124981880188,
0.0667002871632576,
0.10305286943912506,
-0.14465674757957458,
0.046112723648548126,
-0.08518946170806885,
0.04869191721081734,
-0.03866560384631157,
-0.011012526229023933,
-0.14137984812259674,
0.13683919608592987,
-0.031711407005786896,
0.09078489989042282,
0.023235872387886047,
0.12031222134828568,
0.011935373768210411,
0.047373685985803604,
-0.012562626972794533,
-0.005356913898140192,
-0.0453418530523777,
-0.033425938338041306,
-0.010265585035085678,
-0.08601612597703934,
-0.02712717466056347,
-0.05461093410849571,
0.09615113586187363,
-0.1662723273038864,
0.03764072805643082,
0.033774085342884064,
-0.019066885113716125,
0.009843157604336739,
-0.0382113978266716,
0.0746738538146019,
0.09368585795164108,
-0.008802355267107487,
-0.02753344736993313,
0.07019651681184769,
0.032884515821933746,
-0.023593539372086525,
0.07727118581533432,
-0.16157476603984833,
-0.006312664598226547,
0.09536580741405487,
-0.04986253008246422,
-0.007783969398587942,
-0.0027169426903128624,
-0.0239780955016613,
-0.004183163400739431,
-0.1310633271932602,
-0.0018923446768894792,
0.3039341866970062,
0.002954612486064434,
0.14398698508739471,
-0.11123406141996384,
0.0026811815332621336,
0.023667581379413605,
-0.06528013944625854,
0.05994002893567085,
0.03643994778394699,
0.0423525832593441,
0.1004100888967514,
0.02175852656364441,
-0.04500749334692955,
-0.04255859553813934,
0.273999959230423,
-0.014479662291705608,
-0.1015855222940445,
0.022977227345108986,
0.0059364717453718185,
0.01040671207010746,
0.043736059218645096,
-0.18044520914554596,
-0.04148627817630768,
0.028237223625183105,
0.039749693125486374,
0.05886780470609665,
-0.1573389619588852,
-0.0003756204096134752,
0.042143814265728,
-0.1229536309838295,
-0.19148573279380798,
0.035376567393541336,
-0.06924998760223389,
0.04090069606900215,
-0.08522205799818039,
0.008661358617246151,
-0.027695532888174057,
-0.04779260605573654,
-0.18378405272960663,
0.15165214240550995,
-0.08629368990659714,
-0.22067762911319733,
-0.1737615317106247,
0.10961830615997314,
0.06447527557611465,
0.038142189383506775,
0.09916606545448303,
-0.12450253963470459,
0.017892006784677505,
-0.022971825674176216,
0.11450657993555069,
0.02402796596288681,
-0.06234253570437431,
-0.02133648656308651,
0.06816604733467102,
0.055943988263607025,
-0.16287009418010712,
0.019718453288078308,
-0.017361152917146683,
-0.07588818669319153,
-0.02844194322824478,
-0.027993138879537582,
0.03376759588718414,
0.1724516898393631,
0.04490718990564346,
0.021303053945302963,
-0.042342789471149445,
0.14345389604568481,
-0.09250103682279587,
-0.028712112456560135,
0.24806922674179077,
-0.007997680455446243,
-0.018147679045796394,
0.03372000530362129,
0.026470186188817024,
-0.057945702224969864,
0.006534957326948643,
-0.029024513438344002,
-0.11341226100921631,
-0.2524932324886322,
-0.08690720051527023,
-0.05355909839272499,
-0.03283203765749931,
-0.0013818080769851804,
-0.009444675408303738,
0.04450329765677452,
0.021778583526611328,
-0.026569275185465813,
-0.07251884043216705,
0.09695898741483688,
0.016269246116280556,
0.018726861104369164,
-0.004716111347079277,
0.08698016405105591,
-0.052997659891843796,
-0.02217886596918106,
0.03837473690509796,
0.01716284640133381,
0.15669003129005432,
0.015122940763831139,
0.11529959738254547,
0.1005893424153328,
0.09757991135120392,
0.0839429423213005,
0.0989019125699997,
-0.01970689371228218,
-0.0070951879024505615,
0.0035745869390666485,
-0.0754963606595993,
-0.06839345395565033,
0.01220561284571886,
0.06760654598474503,
-0.03179427608847618,
-0.09756191074848175,
-0.01247145514935255,
0.02262560836970806,
0.15555797517299652,
0.03293546289205551,
-0.2054324895143509,
-0.0885455533862114,
-0.028828762471675873,
-0.011915910989046097,
0.002993997186422348,
0.03823431208729744,
0.1918729841709137,
-0.16202481091022491,
-0.01426671352237463,
-0.005671360529959202,
0.10131411999464035,
-0.030267031863331795,
0.009625817649066448,
-0.05806902423501015,
0.00570971705019474,
-0.0027032429352402687,
0.1015813946723938,
-0.2311556488275528,
0.19197556376457214,
-0.007925408892333508,
0.1355372816324234,
-0.06408070772886276,
0.014290244318544865,
0.013089265674352646,
0.02415172941982746,
0.14254136383533478,
0.00995122641324997,
0.026591993868350983,
-0.09062165766954422,
-0.08525174111127853,
0.04563545808196068,
-0.021546222269535065,
-0.01185451541095972,
0.031507231295108795,
0.0011425642296671867,
0.013045665808022022,
0.010522698983550072,
-0.05656125769019127,
-0.15584887564182281,
-0.07485541701316833,
0.008217252790927887,
0.08270461857318878,
0.10568764060735703,
-0.0477457270026207,
-0.08848318457603455,
-0.014519174583256245,
0.10865841805934906,
-0.13667801022529602,
-0.05458858609199524,
-0.09226938337087631,
0.012782730162143707,
0.05569789558649063,
-0.06687486916780472,
0.002143404446542263,
0.09280964732170105,
0.1182570606470108,
-0.03206431493163109,
-0.043128982186317444,
0.06898592412471771,
-0.13249598443508148,
-0.09869075566530228,
-0.03953424096107483,
0.1668776273727417,
0.09747177362442017,
0.05707195773720741,
0.03800949081778526,
-0.017302019521594048,
0.010180385783314705,
-0.051785990595817566,
-0.005026288330554962,
0.16702276468276978,
-0.0759495347738266,
-0.019149765372276306,
-0.04184592142701149,
-0.13241665065288544,
-0.10832499712705612,
-0.0747624933719635,
0.19279715418815613,
0.04570058360695839,
-0.06702995300292969,
0.15089033544063568,
0.18132849037647247,
-0.12558040022850037,
-0.20910431444644928,
0.0024042839650064707,
0.11786509305238724,
0.10093487054109573,
0.003970946650952101,
-0.24691219627857208,
0.022276513278484344,
0.02936377003788948,
-0.006983764003962278,
-0.05912093073129654,
-0.37777429819107056,
-0.14151401817798615,
0.1077491044998169,
0.020323656499385834,
0.1070069819688797,
-0.014335532672703266,
-0.023926354944705963,
-0.026204897090792656,
-0.03136056661605835,
0.01084114983677864,
-0.17514687776565552,
0.08480808138847351,
0.031843401491642,
0.02380623109638691,
0.036827072501182556,
-0.040178362280130386,
0.05998207628726959,
0.06305805593729019,
-0.018266374245285988,
-0.01376320794224739,
0.05124068632721901,
0.060116615146398544,
-0.0044504269026219845,
0.11372867971658707,
-0.047686632722616196,
0.02651248127222061,
-0.12025972455739975,
-0.09045596420764923,
-0.07689215987920761,
0.06927540898323059,
0.005128258373588324,
-0.030107535421848297,
0.03285295143723488,
-0.04330847039818764,
0.008434446528553963,
0.006598672829568386,
-0.07390984147787094,
-0.11497699469327927,
0.050966065376996994,
0.14688675105571747,
0.16858062148094177,
0.038626305758953094,
-0.09967377036809921,
0.0022129984572529793,
0.005211838521063328,
0.11282376945018768,
-0.1278807818889618,
0.029680496081709862,
0.061377961188554764,
0.039558518677949905,
0.11590512096881866,
0.034084975719451904,
-0.10606022924184799,
0.08452295511960983,
0.062032800167798996,
-0.04529887065291405,
-0.08951114863157272,
-0.06528344750404358,
-0.011292056180536747,
-0.042305197566747665,
0.0014367616968229413,
0.09813277423381805,
-0.08026856184005737,
-0.031723830848932266,
-0.028325682505965233,
0.0029599887784570456,
-0.11503078043460846,
0.20788811147212982,
0.033332012593746185,
0.060214195400476456,
-0.08511879295110703,
0.07003053277730942,
0.003623135620728135,
-0.047836463898420334,
0.037818003445863724,
0.0016985881375148892,
-0.09690240770578384,
-0.06452903896570206,
0.024493834003806114,
0.13377635180950165,
0.014002007432281971,
-0.12524576485157013,
-0.08952923864126205,
-0.0544925257563591,
-0.005655334331095219,
0.07195399701595306,
0.03459138795733452,
-0.0010333170648664236,
-0.10418775677680969,
-0.04433120787143707,
-0.10035843402147293,
0.051828790456056595,
0.06229113042354584,
-0.059532154351472855,
-0.05537794530391693,
0.23165734112262726,
0.11159082502126694,
-0.003196755424141884,
-0.02703639306128025,
-0.10011453926563263,
-0.03962704539299011,
0.08622416853904724,
-0.014522873796522617,
-0.013551035895943642,
-0.05469374358654022,
-0.0017682308098301291,
-0.01857360638678074,
-0.049565717577934265,
0.025337684899568558,
0.09357167780399323,
-0.07647716999053955,
0.03910673037171364,
-0.03893614932894707,
0.08473032712936401,
-0.09729256480932236,
0.011084353551268578,
0.018549326807260513,
-0.053462788462638855,
0.06691375374794006,
0.10352376103401184,
-0.08077280968427658,
0.10961990058422089,
-0.19219784438610077,
-0.014097824692726135,
0.062250107526779175,
0.03794366121292114,
-0.03134426102042198,
-0.09766614437103271,
0.06449069827795029,
0.0550389401614666,
0.047095708549022675,
-0.022570570930838585,
0.09633912891149521,
-0.06545368582010269,
0.00852553267031908,
0.009041668847203255,
-0.010542081668972969,
-0.020893149077892303,
0.08339116722345352,
0.05858789384365082,
0.11754142493009567,
0.12509022653102875,
-0.10187693685293198,
0.13425873219966888,
-0.14645658433437347,
-0.0008301748894155025,
-0.03159279748797417,
-0.001777313998900354,
-0.10330097377300262,
-0.07002586871385574,
0.05578318238258362,
-0.03268934041261673,
0.08746089041233063,
0.042259495705366135,
0.13081873953342438,
-0.029341697692871094,
-0.10080157220363617,
0.03872813656926155,
-0.010393447242677212,
0.21174781024456024,
0.04071575775742531,
0.039205633103847504,
-0.013126428239047527,
0.037454862147569656,
0.02422392927110195,
0.11963234096765518,
0.04716379567980766,
0.12481307983398438,
0.010562577284872532,
0.08476357161998749,
0.06091050058603287,
-0.05204417183995247,
-0.06821708381175995,
-0.0987299233675003,
-0.04879229515790939,
0.03304240107536316,
-0.08362431079149246,
0.10215940326452255,
0.15544988214969635,
-0.10165268182754517,
0.07873154431581497,
0.05206936597824097,
-0.09356630593538284,
-0.16113783419132233,
-0.12074878811836243,
-0.013792725279927254,
-0.14935104548931122,
0.011592122726142406,
-0.09296271950006485,
0.019382033497095108,
0.007535507902503014,
0.03980119526386261,
-0.037284206598997116,
0.17067782580852509,
0.024055765941739082,
-0.13260746002197266,
0.05599332973361015,
-0.07687245309352875,
0.03892068937420845,
-0.11191562563180923,
0.07667888700962067,
0.13187243044376373,
0.017847729846835136,
0.07415567338466644,
0.03207706660032272,
-0.05633830651640892,
0.025146933272480965,
-0.08046475052833557,
-0.05646047368645668,
-0.024721862748265266,
-0.006806850899010897,
0.05147182196378708,
0.18063876032829285,
0.11671160161495209,
-0.06872309744358063,
0.028810758143663406,
0.1424124836921692,
-0.03986631706357002,
-0.12858691811561584,
-0.16216617822647095,
0.1334638148546219,
0.0688621923327446,
0.005550450179725885,
-0.03771668300032616,
-0.04078691080212593,
-0.013024299405515194,
0.28363338112831116,
0.21375973522663116,
0.08543122559785843,
0.01856938935816288,
-0.012064527720212936,
-0.006094546057283878,
0.006132415030151606,
0.08575179427862167,
0.07625740021467209,
0.1906558722257614,
-0.032543983310461044,
0.014921920374035835,
-0.05821887403726578,
-0.07321030646562576,
-0.00784242432564497,
0.060805246233940125,
-0.08398615568876266,
-0.07281316071748734,
0.0069464766420423985,
0.13676738739013672,
-0.03489915654063225,
-0.11410532146692276,
-0.07567460089921951,
-0.05483074486255646,
-0.06828156858682632,
-0.025223804637789726,
0.003657461144030094,
0.1095678061246872,
0.04070957005023956,
-0.04072610288858414,
0.008399042300879955,
0.19715270400047302,
0.01756400056183338,
-0.08886634558439255,
-0.09757569432258606,
0.01196600403636694,
-0.1398863047361374,
0.0473705492913723,
-0.03618962690234184,
0.1283475011587143,
0.022994814440608025,
0.09607380628585815,
-0.04505814239382744,
0.11893611401319504,
-0.02555103972554207,
0.006962015759199858,
0.008539307862520218,
0.07469699531793594,
-0.05937239155173302,
0.08672574907541275,
-0.00921703688800335,
-0.13384267687797546,
0.07697923481464386,
-0.11968625336885452,
-0.004127994645386934,
-0.08177204430103302,
0.060967545956373215,
-0.04143040254712105,
0.08047723770141602,
0.09749279171228409,
-0.07815952599048615,
-0.05227695032954216,
-0.05397987738251686,
0.05749940127134323,
0.02571079134941101,
-0.006427780259400606,
-0.03047998808324337,
-0.22364823520183563,
-0.026588473469018936,
-0.0901840329170227,
-0.009026536718010902,
-0.20757712423801422,
-0.008592710830271244,
0.017786385491490364,
-0.10466734319925308,
0.0016498190816491842,
0.058938466012477875,
0.05271964520215988,
0.03673325851559639,
-0.002886279718950391,
-0.009938939474523067,
0.04763452336192131,
0.13707920908927917,
-0.15594346821308136,
-0.10505256056785583
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Romansh-Sursilvan
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Romansh Sursilvan using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "rm-sursilv", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-rm-sursilv")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-rm-sursilv")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Romansh Sursilvan test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "rm-sursilv", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("gchhablani/wav2vec2-large-xlsr-rm-sursilv")
model = Wav2Vec2ForCTC.from_pretrained("gchhablani/wav2vec2-large-xlsr-rm-sursilv")
model.to("cuda")
chars_to_ignore_regex = '[\\,\\?\\.\\!\\-\\;\\:\\"\\“\\%\\‘\\”\\�\\…\\«\\»\\–]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 25.16 %
## Training
The Common Voice `train` and `validation` datasets were used for training. The code can be found [here](https://colab.research.google.com/drive/1dpZr_GzRowCciUbzM3GnW04TNKnB7vrP?usp=sharing).
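For orientation, a minimal sketch of loading the splits mentioned above is given below. This assumes the same `datasets` API and the `rm-sursilv` configuration used in the snippets earlier; it is not the full training script, which is only available via the link above.
```python
from datasets import load_dataset

# Load the Common Voice Romansh Sursilvan splits referenced above.
train_dataset = load_dataset("common_voice", "rm-sursilv", split="train")
validation_dataset = load_dataset("common_voice", "rm-sursilv", split="validation")

print(len(train_dataset), len(validation_dataset))
```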
|
{"language": "rm-sursilv", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Wav2Vec2 Large 53 Romansh Sursilvan by Gunjan Chhablani", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice rm-sursilv", "type": "common_voice", "args": "rm-sursilv"}, "metrics": [{"type": "wer", "value": 25.16, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gchhablani/wav2vec2-large-xlsr-rm-sursilv
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"rm-sursilv"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Romansh-Sursilvan
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilvan using the Common Voice dataset.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Portuguese test data of Common Voice.
Test Result: 25.16 %
## Training
The Common Voice 'train' and 'validation' datasets were used for training. The code can be found here.
|
[
"# Wav2Vec2-Large-XLSR-53-Romansh-Sursilvan\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilvan using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 25.16 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Romansh-Sursilvan\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilvan using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 25.16 %",
"## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
78,
72,
20,
29,
30
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Romansh-Sursilvan\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Romansh Sursilvan using the Common Voice dataset. \nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result: 25.16 %## Training\n\nThe Common Voice 'train' and 'validation' datasets were used for training. The code can be found here."
] |
[
-0.18322943150997162,
0.10682005435228348,
-0.0020146274473518133,
0.007506316062062979,
0.10492363572120667,
-0.046422526240348816,
0.15865495800971985,
0.10089057683944702,
-0.061347514390945435,
-0.004828309174627066,
0.017275802791118622,
-0.015742303803563118,
0.07017204910516739,
0.09898306429386139,
0.0075531937181949615,
-0.14598767459392548,
0.01258398313075304,
-0.0022864907514303923,
0.043661195784807205,
0.11150206625461578,
0.09128198772668839,
-0.07769981771707535,
0.011994709260761738,
0.0678405836224556,
-0.1370883285999298,
0.06189054623246193,
0.03135683760046959,
-0.12153008580207825,
0.14184649288654327,
0.04266346991062164,
0.07719868421554565,
0.0574127733707428,
0.09320619702339172,
-0.1625429093837738,
0.021673832088708878,
0.042777642607688904,
0.015021049417555332,
0.005673462525010109,
0.06962171941995621,
-0.035206571221351624,
0.05962352082133293,
0.09312029927968979,
-0.03279183432459831,
0.07751308381557465,
-0.08150643855333328,
-0.2224694788455963,
-0.037602249532938004,
-0.061154138296842575,
0.06347254663705826,
0.1531275510787964,
-0.05572832375764847,
0.1472853124141693,
-0.12514598667621613,
0.08189086616039276,
0.09373815357685089,
-0.19491660594940186,
-0.006639695726335049,
0.08345388621091843,
0.05391360819339752,
0.0867161974310875,
-0.03602533042430878,
0.03826051950454712,
0.05681346729397774,
0.006141338963061571,
-0.003325534053146839,
-0.037021148949861526,
-0.2352759689092636,
-0.015223786234855652,
-0.11986808478832245,
-0.06383485347032547,
0.23132076859474182,
-0.029905756935477257,
-0.07128053158521652,
-0.10819616168737411,
-0.01833268627524376,
0.009863410145044327,
0.019809097051620483,
-0.037259750068187714,
-0.006729630287736654,
0.045323148369789124,
-0.040032707154750824,
-0.025376925244927406,
-0.09717298299074173,
-0.14701442420482635,
0.021706437692046165,
0.052708081901073456,
0.01858692616224289,
0.009763394482433796,
-0.11904773861169815,
0.08492650091648102,
-0.08353383839130402,
-0.09217037260532379,
-0.007545333821326494,
0.032533761113882065,
-0.0690869614481926,
0.012358549050986767,
-0.07956801354885101,
-0.14157038927078247,
0.0717921182513237,
0.010228916071355343,
0.04928342252969742,
-0.003295472590252757,
-0.0782829150557518,
0.037467651069164276,
0.010248341597616673,
0.11411364376544952,
-0.06489289551973343,
-0.04497254267334938,
0.027571072801947594,
-0.030771926045417786,
-0.019488641992211342,
-0.014700467698276043,
-0.07431913167238235,
-0.06632339954376221,
0.008679182268679142,
0.10484714806079865,
-0.02083694189786911,
-0.01961260288953781,
-0.03977392241358757,
-0.03810429573059082,
0.01950095035135746,
-0.10496459901332855,
-0.016808412969112396,
0.06363237649202347,
-0.010511020198464394,
0.07518066465854645,
0.02697148360311985,
0.04657396301627159,
-0.0980367586016655,
-0.048016633838415146,
0.014510465785861015,
0.034052249044179916,
-0.027074072510004044,
-0.11346342414617538,
0.025388330221176147,
-0.0808793231844902,
-0.021290462464094162,
-0.0925181582570076,
-0.12898994982242584,
-0.0893280953168869,
-0.03195120394229889,
0.01682666689157486,
0.029633015394210815,
-0.11850446462631226,
-0.022384241223335266,
-0.03288727626204491,
-0.03879870846867561,
0.03364893049001694,
-0.04937645420432091,
0.07507509738206863,
0.0626525953412056,
0.0622258186340332,
0.0004271834623068571,
0.0740400180220604,
-0.10934456437826157,
-0.06509244441986084,
0.059864312410354614,
0.13894356787204742,
-0.03449607640504837,
-0.05604347959160805,
-0.08992454409599304,
-0.053893059492111206,
-0.07159825414419174,
0.04990129545331001,
0.09500609338283539,
0.13272812962532043,
-0.31357258558273315,
-0.0957292690873146,
0.26006758213043213,
-0.15017475187778473,
-0.04954221844673157,
0.17930787801742554,
-0.002338218269869685,
0.10921608656644821,
0.13282200694084167,
0.21988698840141296,
0.11233940720558167,
-0.17819629609584808,
0.04664512723684311,
-0.04028034955263138,
-0.010702426545321941,
-0.08749257773160934,
0.09403146803379059,
-0.07986551523208618,
0.034483905881643295,
0.041044313460588455,
-0.08532334119081497,
0.07573206722736359,
-0.030297821387648582,
-0.07046584784984589,
-0.03240117430686951,
-0.07467064261436462,
0.05833375081419945,
0.05552418529987335,
0.011905037797987461,
-0.04600350558757782,
-0.04579330235719681,
0.005996375810354948,
0.1065182164311409,
-0.15250688791275024,
0.042562469840049744,
-0.10595270246267319,
0.09904072433710098,
-0.0806727334856987,
-0.012453150935471058,
-0.15578195452690125,
0.14427362382411957,
-0.007406430784612894,
0.11442936956882477,
0.025585368275642395,
0.12126031517982483,
0.020337501540780067,
0.016482573002576828,
-0.021023940294981003,
-0.01604386977851391,
-0.009630189277231693,
-0.026991914957761765,
-0.007774658501148224,
-0.06927520036697388,
-0.015331513248383999,
-0.057297442108392715,
0.12484954297542572,
-0.16677020490169525,
0.006538663990795612,
0.06140759214758873,
0.03656879439949989,
0.009402880445122719,
-0.03289850801229477,
0.052054036408662796,
0.10041636973619461,
-0.010470897890627384,
-0.022432735189795494,
0.03688014671206474,
0.02258879505097866,
-0.0026942139957100153,
0.09131832420825958,
-0.10317886620759964,
0.014860259369015694,
0.12841038405895233,
-0.035274896770715714,
0.004236625507473946,
0.04677663370966911,
0.0005968367331661284,
-0.002746821381151676,
-0.12241514027118683,
-0.015839481726288795,
0.2379167228937149,
-0.004096945282071829,
0.12674735486507416,
-0.11297125369310379,
0.018285777419805527,
0.03195875883102417,
-0.04979419708251953,
0.04029157757759094,
0.05584176257252693,
0.017283305525779724,
0.06855513155460358,
0.0339963436126709,
-0.026602240279316902,
-0.09228937327861786,
0.22121255099773407,
-0.048498161137104034,
-0.10235967487096786,
0.02435101754963398,
-0.0023494912311434746,
0.006547098513692617,
0.0510224886238575,
-0.20246686041355133,
-0.04796561598777771,
0.02869928628206253,
0.03451200947165489,
0.06385014951229095,
-0.14096210896968842,
0.014360015280544758,
0.012286284938454628,
-0.13659515976905823,
-0.17522838711738586,
0.05077661573886871,
-0.04732976481318474,
0.0389733724296093,
-0.1157936304807663,
-0.04774582013487816,
-0.009019490331411362,
-0.04202777147293091,
-0.1654418259859085,
0.12229837477207184,
-0.07980524748563766,
-0.20971877872943878,
-0.17209023237228394,
0.06681746244430542,
-0.0010407359804958105,
0.048986297100782394,
0.11169899255037308,
-0.11995960026979446,
0.008579245768487453,
-0.02300574816763401,
0.07449071109294891,
0.015238245017826557,
-0.024871187284588814,
-0.03155533969402313,
0.04283417388796806,
0.04866381362080574,
-0.15946705639362335,
0.020039409399032593,
-0.037064842879772186,
-0.07012851536273956,
0.002139241434633732,
-0.03977199271321297,
0.057823214679956436,
0.17594961822032928,
0.008301571942865849,
0.037119705229997635,
-0.018523503094911575,
0.16730724275112152,
-0.09060677140951157,
-0.0500127375125885,
0.2383277714252472,
0.01177039835602045,
-0.03343702480196953,
0.0878206118941307,
0.02034861408174038,
-0.07770700007677078,
-0.010650725103914738,
-0.02390143647789955,
-0.09153909236192703,
-0.2568102180957794,
-0.1248253807425499,
-0.06216239556670189,
-0.07861685007810593,
-0.02457083947956562,
-0.005032591056078672,
0.048441898077726364,
0.027841057628393173,
-0.023345503956079483,
-0.07254024595022202,
0.06462380290031433,
0.0011338477488607168,
0.1067846491932869,
0.010767344385385513,
0.08981004357337952,
-0.05354885011911392,
-0.047203727066516876,
0.04302414506673813,
0.027889106422662735,
0.1372942477464676,
0.036078017204999924,
0.120076484978199,
0.08182306587696075,
0.12617620825767517,
0.09248248487710953,
0.0840810164809227,
-0.017112018540501595,
-0.007691261824220419,
0.018364695832133293,
-0.07952205091714859,
-0.05921363830566406,
0.010156802833080292,
0.12411897629499435,
-0.034873321652412415,
-0.06608932465314865,
-0.048301488161087036,
0.02126600779592991,
0.1709548979997635,
0.10446370393037796,
-0.2558371424674988,
-0.06395604461431503,
-0.029775967821478844,
-0.013310309499502182,
-0.0006160287302918732,
0.04353371635079384,
0.18788716197013855,
-0.12830795347690582,
0.009098256938159466,
0.0031905958894640207,
0.09484884142875671,
-0.03573976084589958,
0.023788414895534515,
-0.08594575524330139,
0.0019852854311466217,
0.0064998809248209,
0.09552396833896637,
-0.25109946727752686,
0.20255137979984283,
0.011024932377040386,
0.13259275257587433,
-0.05312889441847801,
0.012872171588242054,
-0.00003480406303424388,
0.05176728218793869,
0.120947927236557,
-0.004447926767170429,
0.05113454535603523,
-0.08233653753995895,
-0.07166246324777603,
0.056186359375715256,
-0.004082341678440571,
0.010429502464830875,
0.042519453912973404,
0.009854219853878021,
0.0003512814873829484,
-0.008875786326825619,
-0.026664361357688904,
-0.17197461426258087,
-0.05959790572524071,
0.0017581398133188486,
0.18619970977306366,
0.12054462730884552,
-0.021073168143630028,
-0.09326193481683731,
-0.08623228222131729,
0.10888742655515671,
-0.09882374107837677,
-0.06143296882510185,
-0.07945886999368668,
-0.023972274735569954,
0.10631196200847626,
-0.050879813730716705,
-0.0008411801536567509,
0.09614332020282745,
0.1295504868030548,
-0.03373729810118675,
-0.022950313985347748,
0.06341847032308578,
-0.10482729226350784,
-0.09703056514263153,
-0.040243301540613174,
0.18825644254684448,
0.0792432352900505,
0.04811296612024307,
0.07785752415657043,
0.018381115049123764,
0.003137296764180064,
-0.04975508525967598,
0.012294132262468338,
0.11485118418931961,
-0.09661690145730972,
-0.0009870583890005946,
0.008566539734601974,
-0.17431506514549255,
-0.08982893079519272,
-0.04647006839513779,
0.1788271814584732,
0.06431227177381516,
-0.06229790672659874,
0.1485873907804489,
0.19345548748970032,
-0.09369222074747086,
-0.2209307998418808,
0.002084294566884637,
0.09477946162223816,
0.10861131548881531,
-0.01060139387845993,
-0.18648850917816162,
0.04870280250906944,
0.01404851395636797,
-0.0242008063942194,
-0.07292089611291885,
-0.30972930788993835,
-0.12330886721611023,
0.10551680624485016,
-0.015464437194168568,
0.1254161298274994,
-0.009423943236470222,
-0.05426224321126938,
-0.03917532041668892,
-0.07835374772548676,
0.014735681936144829,
-0.17367583513259888,
0.08258739113807678,
0.02985529787838459,
0.03210138529539108,
0.03674344718456268,
-0.025747543200850487,
0.0767691358923912,
0.0976409912109375,
-0.015488511882722378,
0.01167372241616249,
0.07405200600624084,
0.09980112314224243,
0.01747678406536579,
0.12136261910200119,
-0.03599526360630989,
0.04570251330733299,
-0.12631432712078094,
-0.08023617416620255,
-0.06368403881788254,
0.07715190947055817,
0.020075201988220215,
-0.019145917147397995,
0.05130860581994057,
-0.052757617086172104,
-0.006008420139551163,
0.010310524143278599,
-0.034198589622974396,
-0.0947747603058815,
0.053442373871803284,
0.16243210434913635,
0.17461317777633667,
-0.039127662777900696,
-0.03916381299495697,
-0.0038097614888101816,
-0.015809867531061172,
0.10285185277462006,
-0.056521330028772354,
0.05696472153067589,
0.06582359224557877,
0.05707674100995064,
0.08204269409179688,
0.026214947924017906,
-0.09898567944765091,
0.08955725282430649,
0.050647176802158356,
-0.05477248504757881,
-0.07163099199533463,
-0.048515502363443375,
-0.08103621006011963,
-0.005320177413523197,
0.07632197439670563,
0.13493533432483673,
-0.05428491532802582,
-0.031099356710910797,
-0.04141250625252724,
0.00018450251081958413,
-0.1018642783164978,
0.23879069089889526,
0.023702703416347504,
0.05967138335108757,
-0.13585993647575378,
0.04807515814900398,
0.014859877526760101,
-0.04631780460476875,
0.028914591297507286,
-0.03285524621605873,
-0.10915972292423248,
-0.06255801767110825,
-0.0018111462704837322,
0.0952925980091095,
0.01924395188689232,
-0.15797194838523865,
-0.08520810306072235,
-0.06236491724848747,
-0.007343311794102192,
0.07022809982299805,
0.0659039244055748,
0.028436284512281418,
-0.09060594439506531,
-0.08819618821144104,
-0.09826882928609848,
0.045725591480731964,
0.07874010503292084,
-0.008902029134333134,
-0.0783059224486351,
0.18542245030403137,
0.05629809945821762,
-0.0020027467980980873,
-0.04381570219993591,
-0.0870470181107521,
-0.034452155232429504,
0.09552221745252609,
-0.0627039223909378,
0.004354268778115511,
-0.048051487654447556,
0.002908306894823909,
-0.011359506286680698,
-0.07309607416391373,
-0.00671384995803237,
0.09713702648878098,
-0.0863216295838356,
0.05109158158302307,
-0.02603163942694664,
0.08060308545827866,
-0.11784105747938156,
0.03400878980755806,
0.003943079616874456,
-0.05271404981613159,
0.08091694861650467,
0.09961248934268951,
-0.09698683768510818,
0.11450108140707016,
-0.2290281504392624,
-0.06786740571260452,
0.061327219009399414,
0.05657519772648811,
-0.011950965970754623,
-0.07995332032442093,
0.06681855767965317,
0.11189066618680954,
0.043885596096515656,
0.0040983473882079124,
0.10230943560600281,
-0.04884825274348259,
0.0065039945766329765,
-0.0471099317073822,
-0.02774740941822529,
0.002559137064963579,
0.060174789279699326,
0.10293960571289062,
0.12333136051893234,
0.1535031795501709,
-0.0979299545288086,
0.08590377122163773,
-0.12898385524749756,
0.000955796567723155,
-0.05527737736701965,
-0.010179194621741772,
-0.16973412036895752,
-0.06279221922159195,
0.0731833279132843,
-0.03339477255940437,
0.102182537317276,
0.05247191712260246,
0.14453347027301788,
-0.03660060092806816,
-0.05797648802399635,
0.023900484666228294,
-0.009561516344547272,
0.25111302733421326,
0.06853997707366943,
0.0360809862613678,
-0.022391540929675102,
0.01247350126504898,
0.025037240236997604,
0.133638396859169,
0.0012011174112558365,
0.1369430273771286,
0.009933937340974808,
0.09325235337018967,
0.09151219576597214,
-0.041545502841472626,
-0.047181639820337296,
-0.051822323352098465,
-0.07652371376752853,
0.03494979441165924,
-0.06648952513933182,
0.1242186650633812,
0.11511977016925812,
-0.09721122682094574,
0.046893712133169174,
0.03190989047288895,
-0.07509689033031464,
-0.14564882218837738,
-0.11892572045326233,
-0.07028800249099731,
-0.16257941722869873,
0.022768668830394745,
-0.10144123435020447,
0.031823720782995224,
0.058289796113967896,
0.031495824456214905,
-0.04465688765048981,
0.1281844973564148,
0.020196398720145226,
-0.10816656798124313,
0.06082931160926819,
-0.08012507110834122,
0.03508028760552406,
-0.06476695835590363,
0.046369608491659164,
0.13388127088546753,
0.03337424620985985,
0.0727856457233429,
0.03021387942135334,
-0.02419230155646801,
0.005784572102129459,
-0.08761528879404068,
-0.05378378555178642,
-0.01149815320968628,
-0.011605365201830864,
0.05739586055278778,
0.1285802274942398,
0.13515372574329376,
-0.07663214951753616,
0.029813814908266068,
0.1369418352842331,
-0.04504421725869179,
-0.1364700049161911,
-0.16834700107574463,
0.10892587155103683,
0.06367861479520798,
0.0540340431034565,
-0.014492571353912354,
-0.07973181456327438,
-0.003493725322186947,
0.2559512257575989,
0.19394080340862274,
0.08075688779354095,
0.023811008781194687,
-0.003777395933866501,
-0.007402441464364529,
-0.012810228392481804,
0.0643799901008606,
0.05816732347011566,
0.2054906040430069,
-0.022917892783880234,
0.002372447168454528,
-0.07509669661521912,
-0.07754542678594589,
-0.022844575345516205,
0.05384579300880432,
-0.11948416382074356,
-0.11999979615211487,
0.017047733068466187,
0.1622275561094284,
-0.030092431232333183,
-0.08828722685575485,
-0.06652085483074188,
-0.09137453138828278,
-0.09599456191062927,
-0.012810316868126392,
0.0027584361378103495,
0.10701430588960648,
0.015913620591163635,
-0.05557837337255478,
0.031649839133024216,
0.16266819834709167,
0.002481962088495493,
-0.0747784897685051,
-0.09492112696170807,
0.023871775716543198,
-0.15751972794532776,
0.041816968470811844,
-0.02720826119184494,
0.16354046761989594,
0.036938443779945374,
0.10457400977611542,
-0.027657873928546906,
0.14606918394565582,
-0.014647653326392174,
-0.02422390505671501,
0.04418262839317322,
0.0827544629573822,
-0.04211428388953209,
0.10175928473472595,
0.016297349706292152,
-0.10443280637264252,
0.0750516876578331,
-0.09103687852621078,
0.025714579969644547,
-0.11099068075418472,
0.059661801904439926,
-0.04656533896923065,
0.08391658961772919,
0.05917724221944809,
-0.07157853245735168,
-0.021832432597875595,
-0.047515638172626495,
0.0760008841753006,
0.02226548083126545,
-0.022189611569046974,
-0.0366639718413353,
-0.25860676169395447,
-0.01675344444811344,
-0.06155525892972946,
-0.003867575665935874,
-0.18880097568035126,
0.006416701711714268,
-0.0050997380167245865,
-0.0805312991142273,
-0.004450078587979078,
0.047134920954704285,
0.09853212535381317,
0.018471648916602135,
-0.0072192298248410225,
-0.02636144310235977,
0.052127353847026825,
0.11767895519733429,
-0.15039384365081787,
-0.1112416684627533
] |
null | null |
transformers
|
# GreekSocialBERT
## Model description
A Greek language model based on [GreekBERT](https://huggingface.co/nlpaueb/bert-base-greek-uncased-v1)
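A minimal fill-mask sketch, assuming the standard Transformers `pipeline` API; the model id is this repository's id, and the Greek example sentence is purely illustrative, not part of the original card:
```python
from transformers import pipeline

# Fill-mask pipeline for the Greek social-media BERT model
# (model id taken from this repository).
fill_mask = pipeline(
    "fill-mask",
    model="gealexandri/greeksocialbert-base-greek-uncased-v1",
)

# BERT-style checkpoints use "[MASK]" as the mask token; the sentence is illustrative.
for prediction in fill_mask("η ελλαδα ειναι μια [MASK] χωρα."):
    print(prediction["token_str"], prediction["score"])
```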
## Training data
The training data is a corpus of 458,293 documents collected from Greek social media accounts.
The training corpus has been collected and provided by [Palo LTD](http://www.paloservices.com/)
## Eval results
### BibTeX entry and citation info
```bibtex
@Article{info12080331,
AUTHOR = {Alexandridis, Georgios and Varlamis, Iraklis and Korovesis, Konstantinos and Caridakis, George and Tsantilas, Panagiotis},
TITLE = {A Survey on Sentiment Analysis and Opinion Mining in Greek Social Media},
JOURNAL = {Information},
VOLUME = {12},
YEAR = {2021},
NUMBER = {8},
ARTICLE-NUMBER = {331},
URL = {https://www.mdpi.com/2078-2489/12/8/331},
ISSN = {2078-2489},
DOI = {10.3390/info12080331}
}
```
|
{"language": "el"}
|
fill-mask
|
gealexandri/greeksocialbert-base-greek-uncased-v1
|
[
"transformers",
"pytorch",
"tf",
"bert",
"fill-mask",
"el",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"el"
] |
TAGS
#transformers #pytorch #tf #bert #fill-mask #el #autotrain_compatible #endpoints_compatible #region-us
|
# GreekSocialBERT
## Model description
A Greek language model based on GreekBERT
## Training data
The training data is a corpus of 458,293 documents collected from Greek social media accounts.
The training corpus has been collected and provided by Palo LTD
## Eval results
### BibTeX entry and citation info
|
[
"# GreekSocialBERT",
"## Model description\n\nA Greek language model based on GreekBERT",
"## Training data\n\nThe training data is a corpus of 458,293 documents collected from Greek social media accounts. \n\nThe training corpus has been collected and provided by Palo LTD",
"## Eval results",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #fill-mask #el #autotrain_compatible #endpoints_compatible #region-us \n",
"# GreekSocialBERT",
"## Model description\n\nA Greek language model based on GreekBERT",
"## Training data\n\nThe training data is a corpus of 458,293 documents collected from Greek social media accounts. \n\nThe training corpus has been collected and provided by Palo LTD",
"## Eval results",
"### BibTeX entry and citation info"
] |
[
41,
5,
12,
37,
4,
11
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #fill-mask #el #autotrain_compatible #endpoints_compatible #region-us \n# GreekSocialBERT## Model description\n\nA Greek language model based on GreekBERT## Training data\n\nThe training data is a corpus of 458,293 documents collected from Greek social media accounts. \n\nThe training corpus has been collected and provided by Palo LTD## Eval results### BibTeX entry and citation info"
] |
[
-0.12229442596435547,
-0.14004483819007874,
-0.0013704488519579172,
0.10577995330095291,
0.18900305032730103,
0.010348910465836525,
0.1925252079963684,
0.09336590766906738,
0.052923783659935,
-0.048197511583566666,
0.09787634015083313,
0.11843466013669968,
0.03423197567462921,
0.1433657705783844,
-0.017770882695913315,
-0.30729082226753235,
-0.025683050975203514,
0.04092671722173691,
-0.06737968325614929,
0.07290452718734741,
0.09062784910202026,
-0.03188682347536087,
0.09549585729837418,
-0.057805053889751434,
-0.08065242320299149,
0.06485392153263092,
0.049690838903188705,
-0.11042934656143188,
0.1632211059331894,
0.08417307585477829,
0.0544225238263607,
-0.030972110107541084,
0.06893720477819443,
-0.0662413164973259,
0.030425719916820526,
0.0006427437183447182,
-0.033748798072338104,
0.035812534391880035,
0.03466007858514786,
-0.1026637852191925,
0.16899585723876953,
-0.015152441337704659,
0.057456400245428085,
-0.03146887198090553,
-0.1905154138803482,
-0.0532403402030468,
-0.04148812219500542,
0.01990765519440174,
0.11786429584026337,
0.13695786893367767,
-0.024628223851323128,
0.10315518081188202,
-0.11607484519481659,
0.08850749582052231,
0.07059264928102493,
-0.14482980966567993,
-0.07172349095344543,
0.11763542890548706,
0.039126597344875336,
0.05506852641701698,
-0.007112685590982437,
0.08684230595827103,
0.03539564833045006,
0.09059426188468933,
-0.0030874651856720448,
-0.013438157737255096,
-0.06447946280241013,
-0.014446116052567959,
-0.07914315909147263,
-0.016172613948583603,
0.2708178162574768,
0.013135474175214767,
-0.012183155864477158,
-0.02249380387365818,
0.05428505688905716,
0.11167740821838379,
0.019179625436663628,
-0.053630560636520386,
-0.05817735195159912,
-0.011482751928269863,
-0.03214358538389206,
-0.052838802337646484,
-0.10510377585887909,
-0.05401383340358734,
-0.13844621181488037,
0.16846589744091034,
0.04085187986493111,
0.058399591594934464,
-0.10994015634059906,
0.09618505835533142,
-0.078847236931324,
-0.0477156899869442,
0.07910902053117752,
0.032226018607616425,
0.09109354019165039,
-0.06450345367193222,
0.0007362360483966768,
-0.18301603198051453,
-0.007760887499898672,
0.10065039247274399,
0.011956321075558662,
-0.04862982779741287,
0.036811187863349915,
0.08201096951961517,
-0.026461109519004822,
0.057724107056856155,
0.036953236907720566,
-0.038944363594055176,
0.014298775233328342,
0.014707903377711773,
-0.055858198553323746,
0.030036788433790207,
-0.15518708527088165,
-0.036778971552848816,
0.03501661866903305,
0.016505207866430283,
-0.061602283269166946,
0.07909438014030457,
-0.003842003410682082,
-0.04433131590485573,
0.01049136370420456,
-0.08473281562328339,
-0.05347009748220444,
-0.10691885650157928,
-0.12341400980949402,
-0.03376220166683197,
0.12542733550071716,
-0.0006686609121970832,
-0.08564263582229614,
0.062000822275877,
0.03958853706717491,
-0.024102937430143356,
-0.010427228175103664,
-0.10373757779598236,
0.048560790717601776,
-0.10986903309822083,
0.07045581936836243,
-0.20045998692512512,
-0.21310557425022125,
-0.05569707229733467,
0.07044294476509094,
-0.04246227815747261,
-0.03386639431118965,
-0.06606724113225937,
0.04031503573060036,
-0.06207561492919922,
-0.009604062885046005,
0.0674094706773758,
-0.05969281122088432,
0.07378619909286499,
-0.07087860256433487,
0.060754988342523575,
-0.12729479372501373,
0.050520848482847214,
-0.09108999371528625,
-0.0787748172879219,
-0.08692071586847305,
0.05488927662372589,
0.011931074783205986,
0.015893809497356415,
-0.08878842741250992,
-0.03802793473005295,
-0.09153275936841965,
0.06930495798587799,
0.05783950909972191,
0.2283836454153061,
-0.1270555853843689,
-0.10926929116249084,
0.23732556402683258,
-0.04609832912683487,
-0.10758678615093231,
0.1372484415769577,
-0.07533249258995056,
0.19713342189788818,
0.1595684140920639,
0.13703274726867676,
-0.12240354716777802,
-0.1309107542037964,
0.015787668526172638,
-0.013780460692942142,
-0.010689030401408672,
-0.026164552196860313,
0.05917670950293541,
-0.004452497698366642,
0.012333273887634277,
0.06163119152188301,
0.012889540754258633,
0.09527085721492767,
-0.08245410025119781,
-0.05335771664977074,
0.028429469093680382,
-0.07323689758777618,
0.04184218496084213,
0.04589996114373207,
0.1570688635110855,
-0.09501020610332489,
-0.06041601300239563,
0.10342901945114136,
0.051983751356601715,
-0.013266205787658691,
0.002777848858386278,
-0.07417908310890198,
0.1272878497838974,
-0.015404026955366135,
-0.00416192039847374,
-0.10324324667453766,
0.028996460139751434,
-0.037602804601192474,
0.1711319237947464,
0.14558374881744385,
0.019083619117736816,
0.131490096449852,
0.03046516515314579,
-0.12295660376548767,
0.04114144667983055,
0.006636305246502161,
0.005180321168154478,
-0.12253522872924805,
-0.17456850409507751,
0.051032181829214096,
0.0072921570390462875,
0.07747901976108551,
-0.25895601511001587,
0.06734161078929901,
-0.03256160765886307,
0.12098850309848785,
0.01994260959327221,
0.003749681869521737,
-0.1034550592303276,
0.03377952426671982,
-0.044515226036310196,
-0.04418765380978584,
0.03541167452931404,
-0.04781695082783699,
-0.05720340833067894,
0.04839468002319336,
-0.0693676546216011,
0.16788068413734436,
0.1897088885307312,
-0.16914807260036469,
-0.09622393548488617,
-0.01023260597139597,
-0.022612806409597397,
-0.044044073671102524,
-0.09293182939291,
-0.012174492701888084,
0.2259749174118042,
-0.01672586053609848,
0.148405984044075,
-0.04096581041812897,
0.007222214713692665,
0.07481296360492706,
-0.11590638011693954,
0.015602885745465755,
0.12170316278934479,
0.052120134234428406,
-0.09951354563236237,
0.12136528640985489,
0.046511851251125336,
-0.0750570073723793,
0.2258327156305313,
0.027821410447359085,
-0.06819198280572891,
0.007675157394260168,
-0.039729125797748566,
0.056338775902986526,
0.1063423678278923,
-0.2036014050245285,
-0.0435381680727005,
-0.022871946915984154,
0.05322933569550514,
0.06926025450229645,
-0.13327360153198242,
-0.08455672115087509,
0.05699077621102333,
-0.026853427290916443,
0.03144929185509682,
0.05787073075771332,
-0.05317806079983711,
0.1212075874209404,
0.00765925133600831,
-0.11107110977172852,
0.0076492768712341785,
0.00006557833694387227,
-0.10245725512504578,
0.1855410486459732,
-0.010207303799688816,
-0.17577210068702698,
-0.16106516122817993,
-0.05740482360124588,
0.0020134663209319115,
0.04323361814022064,
0.013573378324508667,
-0.10441707819700241,
0.03548877313733101,
0.042947109788656235,
0.19647614657878876,
-0.02101939357817173,
-0.021902941167354584,
-0.16138947010040283,
0.06401528418064117,
-0.0679979920387268,
-0.04196460545063019,
-0.09278903156518936,
-0.11089854687452316,
-0.17405834794044495,
-0.0346786230802536,
-0.2613275647163391,
0.0075881448574364185,
0.08079921454191208,
-0.08780699968338013,
0.04479725658893585,
-0.07177891582250595,
0.1907012015581131,
-0.14440637826919556,
0.05676935613155365,
0.1326512098312378,
-0.0001504957617726177,
-0.03340110927820206,
0.045588210225105286,
0.036954376846551895,
-0.030771292746067047,
0.003041432239115238,
-0.0010970832081511617,
-0.1164131909608841,
-0.2984943687915802,
-0.12865574657917023,
-0.05426805838942528,
0.039746664464473724,
0.05616895109415054,
0.04353528842329979,
0.17923220992088318,
0.09290297329425812,
0.05401751399040222,
0.09132660180330276,
0.08917125314474106,
0.017530599609017372,
0.04045117273926735,
0.023185471072793007,
0.10180314630270004,
-0.01807195506989956,
-0.05716853216290474,
0.12071250379085541,
0.05067097023129463,
0.20253852009773254,
0.041029009968042374,
-0.0802106112241745,
0.03951466456055641,
0.04112958908081055,
-0.042447518557310104,
0.07960914820432663,
0.041304804384708405,
-0.015805961564183235,
-0.005954924505203962,
-0.033546045422554016,
0.007152987644076347,
0.07026059180498123,
-0.029721742495894432,
-0.032498303800821304,
-0.14483913779258728,
-0.022887149825692177,
0.06017180159687996,
0.15009209513664246,
0.026581088081002235,
-0.25989624857902527,
-0.1267963945865631,
0.007271833252161741,
-0.03404863178730011,
-0.06292124092578888,
0.027078447863459587,
0.041991960257291794,
-0.19269269704818726,
0.026820216327905655,
-0.04286865144968033,
0.10680343210697174,
-0.12614896893501282,
0.020642660558223724,
-0.012331446632742882,
-0.07970128953456879,
-0.009654604829847813,
0.10052143037319183,
-0.22800259292125702,
0.3352978527545929,
-0.023233208805322647,
0.1215418204665184,
-0.0843346118927002,
0.0033964887261390686,
0.0465877428650856,
0.12697623670101166,
0.16735851764678955,
0.03271760419011116,
0.09367568045854568,
-0.162380188703537,
-0.084859199821949,
0.018663404509425163,
0.0009851256618276238,
-0.1247088685631752,
-0.031774602830410004,
0.01713620312511921,
0.035310372710227966,
0.017490752041339874,
0.02777751162648201,
-0.08267854154109955,
-0.11285264790058136,
-0.02581845596432686,
-0.05106088146567345,
-0.06229311600327492,
-0.005846153479069471,
-0.1764817237854004,
-0.05474204197525978,
0.16310231387615204,
0.057401541620492935,
-0.04126804694533348,
-0.06060168519616127,
0.09278504550457001,
-0.024392545223236084,
-0.08748088777065277,
0.03477306663990021,
0.01034407876431942,
-0.081018827855587,
-0.08875492215156555,
-0.04794970899820328,
0.14866302907466888,
-0.08874758332967758,
-0.028888629749417305,
-0.06312128901481628,
0.08556745946407318,
0.11948712170124054,
0.024253331124782562,
0.05734814703464508,
-0.005835825111716986,
0.03726058080792427,
-0.0010217346716672182,
0.010204088874161243,
-0.050813790410757065,
0.06453416496515274,
0.07068338990211487,
-0.08785125613212585,
-0.1731913834810257,
-0.10609786957502365,
-0.09330715239048004,
0.1418713927268982,
0.11163825541734695,
-0.062254518270492554,
-0.052458640187978745,
0.08309106528759003,
-0.14285913109779358,
-0.17468245327472687,
0.08058102428913116,
0.013277637772262096,
0.12053745985031128,
0.005567133892327547,
-0.19423788785934448,
0.08610199391841888,
0.13676631450653076,
-0.025541674345731735,
-0.02766367979347706,
-0.19366155564785004,
-0.1597495675086975,
0.16434286534786224,
0.03196774423122406,
0.26702210307121277,
-0.07817724347114563,
0.0177399143576622,
-0.01565946638584137,
-0.04439368471503258,
0.05379471927881241,
-0.05642570182681084,
0.09283456951379776,
0.010242612101137638,
0.035421110689640045,
0.015624379739165306,
0.008945304900407791,
0.10565424710512161,
0.0581434890627861,
0.04163770005106926,
-0.08886295557022095,
-0.200489804148674,
0.05313925817608833,
-0.0021632371935993433,
0.09706996381282806,
0.02568010240793228,
-0.040055051445961,
-0.1899084448814392,
-0.06101817637681961,
-0.07070053368806839,
0.0677303671836853,
0.010904219932854176,
-0.0878773182630539,
0.03295446187257767,
0.0760936439037323,
-0.009971548803150654,
0.0189405158162117,
0.0758197233080864,
-0.07420462369918823,
0.2173815816640854,
0.04008672386407852,
0.16747596859931946,
0.030324388295412064,
-0.04726818576455116,
0.022067703306674957,
-0.04680018126964569,
0.03096149116754532,
-0.11849071085453033,
-0.031913768500089645,
0.13254597783088684,
-0.02401427924633026,
0.13459496200084686,
0.03455600142478943,
-0.08903005719184875,
0.01476235780864954,
0.10301381349563599,
-0.11661605536937714,
-0.07619500905275345,
-0.08761493116617203,
0.021263834089040756,
0.02924703061580658,
0.08792536705732346,
0.11744201928377151,
-0.11598284542560577,
-0.027133645489811897,
-0.038483232259750366,
-0.028941314667463303,
-0.1404889076948166,
0.14626456797122955,
0.057973712682724,
-0.01472856942564249,
-0.037249527871608734,
0.05079759284853935,
0.06251246482133865,
-0.06191970407962799,
0.05033969506621361,
0.09760771691799164,
-0.05556916072964668,
-0.029896104708313942,
0.0485980249941349,
0.14735813438892365,
-0.05676102265715599,
-0.07514238357543945,
-0.08587045967578888,
-0.11993227899074554,
0.011909609660506248,
0.14015859365463257,
0.1419377624988556,
-0.0024110942613333464,
-0.03633461892604828,
0.02692577801644802,
-0.07778679579496384,
0.05142184719443321,
0.18764743208885193,
-0.0374494306743145,
0.02210540696978569,
0.08811094611883163,
0.0651886910200119,
0.04824681952595711,
-0.06870321929454803,
-0.037236105650663376,
-0.10974635928869247,
0.07500378042459488,
-0.13266199827194214,
-0.022603707388043404,
-0.010986121371388435,
-0.03317500650882721,
-0.04175977036356926,
-0.05682838335633278,
-0.009942225180566311,
-0.012221329845488071,
-0.08802978694438934,
0.05500054731965065,
0.008040089160203934,
0.058875929564237595,
-0.03148491680622101,
-0.03725231811404228,
0.016010461375117302,
-0.0011406365083530545,
0.060638461261987686,
0.10481540113687515,
0.01998874917626381,
0.09230397641658783,
-0.1246444508433342,
0.07435067743062973,
0.05346306413412094,
0.04048794135451317,
0.06224394217133522,
0.016299281269311905,
0.02795891836285591,
0.011010225862264633,
0.10369907319545746,
0.08242084830999374,
0.013561548665165901,
-0.04997370019555092,
0.06664838641881943,
0.07399590313434601,
-0.06851014494895935,
-0.0461261011660099,
0.13968177139759064,
0.00034193001920357347,
0.05732466280460358,
-0.0008589171920903027,
-0.09836702048778534,
0.06368009001016617,
-0.16597488522529602,
0.0334949754178524,
-0.028866220265626907,
-0.14656847715377808,
-0.050520408898591995,
-0.017748182639479637,
0.07424493134021759,
0.0004263256851118058,
0.16602517664432526,
0.1777583807706833,
0.056862782686948776,
0.041981328278779984,
0.05279901251196861,
0.05289532244205475,
-0.0031835197005420923,
0.0068703461438417435,
0.012247451581060886,
-0.006326209753751755,
-0.1395949274301529,
0.05538502708077431,
0.002195445355027914,
0.07018356770277023,
0.02702823095023632,
0.10072395950555801,
-0.004911367315798998,
0.05796590447425842,
0.05870521068572998,
0.059397656470537186,
-0.08839000016450882,
-0.20403911173343658,
-0.031738247722387314,
0.06174110621213913,
-0.05784372612833977,
0.07881803810596466,
0.09346939623355865,
-0.14640870690345764,
0.03106575645506382,
-0.02325366623699665,
-0.11159736663103104,
-0.1621703952550888,
-0.21375000476837158,
-0.001761100022122264,
-0.014624053612351418,
0.0007743024616502225,
-0.09597186744213104,
-0.052403856068849564,
0.14913450181484222,
0.061454787850379944,
-0.07686421275138855,
0.10282357037067413,
-0.025006283074617386,
-0.03407564014196396,
0.13101065158843994,
-0.060459498316049576,
0.06388566642999649,
-0.180977925658226,
-0.023735569790005684,
-0.1313439905643463,
-0.009658187627792358,
-0.023904912173748016,
-0.015841709449887276,
-0.05211762711405754,
0.00844353437423706,
0.0018718542996793985,
-0.04945174604654312,
-0.019480345770716667,
-0.027530664578080177,
0.01509209256619215,
0.12613387405872345,
0.020277544856071472,
0.003915059380233288,
0.025893742218613625,
0.20878413319587708,
0.007326224818825722,
-0.047691695392131805,
-0.11681944876909256,
0.13259083032608032,
0.028555339202284813,
0.04588845744729042,
-0.02130851522088051,
-0.03006836585700512,
-0.028669502586126328,
0.3181982934474945,
0.3772626221179962,
0.03719174116849899,
-0.0029100298415869474,
0.05789882317185402,
0.0166173055768013,
0.04274832084774971,
0.07737398892641068,
-0.04738296568393707,
0.15301407873630524,
-0.12121430784463882,
0.026565883308649063,
-0.025927983224391937,
-0.008764583617448807,
0.041864410042762756,
-0.002666260115802288,
0.050342854112386703,
-0.005305303260684013,
-0.04445450007915497,
0.13037677109241486,
-0.14891134202480316,
-0.12027256935834885,
-0.08100252598524094,
-0.2600955367088318,
-0.17706407606601715,
-0.07569896429777145,
-0.04510203376412392,
0.06820043176412582,
0.15242162346839905,
-0.008503531105816364,
-0.0009548256057314575,
-0.04277917370200157,
0.02341518923640251,
-0.12648387253284454,
-0.13205072283744812,
0.14771614968776703,
0.07174766808748245,
0.11217384785413742,
-0.046531274914741516,
0.11118929833173752,
0.05918896198272705,
0.0350748673081398,
0.07788949459791183,
0.014005200937390327,
-0.009235350415110588,
0.18261310458183289,
-0.011374450288712978,
0.005985978990793228,
-0.07116780430078506,
0.014756003394722939,
0.037334226071834564,
-0.16728141903877258,
0.07192203402519226,
-0.13444507122039795,
-0.14525820314884186,
-0.07217484712600708,
0.09833623468875885,
-0.06556618958711624,
0.04177095741033554,
0.21531330049037933,
0.0094078304246068,
-0.03526294231414795,
-0.05674358829855919,
0.022070972248911858,
0.12652170658111572,
-0.06295253336429596,
-0.08901333063840866,
-0.14307790994644165,
-0.07673557102680206,
-0.11232390999794006,
0.011022835969924927,
-0.20057563483715057,
-0.056951139122247696,
-0.10029909014701843,
-0.10314098745584488,
-0.08503212034702301,
0.08073645085096359,
0.13806699216365814,
0.010364025831222534,
-0.02590523473918438,
-0.003959740977734327,
0.023130541667342186,
0.0685664638876915,
-0.1013014018535614,
-0.09490973502397537
] |
null | null |
transformers
|
# PaloBERT
## Model description
A Greek language model based on [RoBERTa](https://arxiv.org/abs/1907.11692)
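A minimal fill-mask sketch, assuming the standard Transformers `pipeline` API; the model id is this repository's id, the mask token is read from the tokenizer rather than hard-coded (RoBERTa-style checkpoints usually expose `<mask>` instead of `[MASK]`), and the Greek example sentence is purely illustrative:
```python
from transformers import pipeline

# Fill-mask pipeline for PaloBERT (model id taken from this repository).
fill_mask = pipeline(
    "fill-mask",
    model="gealexandri/palobert-base-greek-uncased-v1",
)

# Read the mask token from the tokenizer to avoid hard-coding the wrong string.
mask = fill_mask.tokenizer.mask_token
for prediction in fill_mask(f"η ελλαδα ειναι μια {mask} χωρα."):
    print(prediction["token_str"], prediction["score"])
```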
## Training data
The training data is a corpus of 458,293 documents collected from Greek social media accounts. It also contains a GPT-2 tokenizer trained from scratch on the same corpus.
The training corpus has been collected and provided by [Palo LTD](http://www.paloservices.com/)
## Eval results
### BibTeX entry and citation info
```bibtex
@Article{info12080331,
AUTHOR = {Alexandridis, Georgios and Varlamis, Iraklis and Korovesis, Konstantinos and Caridakis, George and Tsantilas, Panagiotis},
TITLE = {A Survey on Sentiment Analysis and Opinion Mining in Greek Social Media},
JOURNAL = {Information},
VOLUME = {12},
YEAR = {2021},
NUMBER = {8},
ARTICLE-NUMBER = {331},
URL = {https://www.mdpi.com/2078-2489/12/8/331},
ISSN = {2078-2489},
DOI = {10.3390/info12080331}
}
```
|
{"language": "el"}
|
fill-mask
|
gealexandri/palobert-base-greek-uncased-v1
|
[
"transformers",
"pytorch",
"tf",
"roberta",
"fill-mask",
"el",
"arxiv:1907.11692",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1907.11692"
] |
[
"el"
] |
TAGS
#transformers #pytorch #tf #roberta #fill-mask #el #arxiv-1907.11692 #autotrain_compatible #endpoints_compatible #region-us
|
# PaloBERT
## Model description
A Greek language model based on RoBERTa
## Training data
The training data is a corpus of 458,293 documents collected from Greek social media accounts. It also contains a GPT-2 tokenizer trained from scratch on the same corpus.
The training corpus has been collected and provided by Palo LTD
## Eval results
### BibTeX entry and citation info
|
[
"# PaloBERT",
"## Model description\n\nA Greek language model based on RoBERTa",
"## Training data\n\nThe training data is a corpus of 458,293 documents collected from Greek social media accounts. It also contains a GTP-2 tokenizer trained from scratch on the same corpus.\n\nThe training corpus has been collected and provided by Palo LTD",
"## Eval results",
"### BibTeX entry and citation info"
] |
[
"TAGS\n#transformers #pytorch #tf #roberta #fill-mask #el #arxiv-1907.11692 #autotrain_compatible #endpoints_compatible #region-us \n",
"# PaloBERT",
"## Model description\n\nA Greek language model based on RoBERTa",
"## Training data\n\nThe training data is a corpus of 458,293 documents collected from Greek social media accounts. It also contains a GTP-2 tokenizer trained from scratch on the same corpus.\n\nThe training corpus has been collected and provided by Palo LTD",
"## Eval results",
"### BibTeX entry and citation info"
] |
[
50,
4,
12,
57,
4,
11
] |
[
"passage: TAGS\n#transformers #pytorch #tf #roberta #fill-mask #el #arxiv-1907.11692 #autotrain_compatible #endpoints_compatible #region-us \n# PaloBERT## Model description\n\nA Greek language model based on RoBERTa## Training data\n\nThe training data is a corpus of 458,293 documents collected from Greek social media accounts. It also contains a GTP-2 tokenizer trained from scratch on the same corpus.\n\nThe training corpus has been collected and provided by Palo LTD## Eval results### BibTeX entry and citation info"
] |
[
-0.09938175231218338,
-0.03325428068637848,
-0.0015135178109630942,
0.0969533696770668,
0.18347424268722534,
0.022674141451716423,
0.16746632754802704,
0.11628696322441101,
0.011110933497548103,
-0.040914926677942276,
0.12647467851638794,
0.1774863302707672,
0.03799096867442131,
0.15391525626182556,
0.014604434370994568,
-0.30606234073638916,
-0.008937729522585869,
0.03651946783065796,
-0.0217747762799263,
0.08323545753955841,
0.0867006927728653,
-0.04003404825925827,
0.09623447060585022,
-0.04514194279909134,
-0.06766590476036072,
0.06076948344707489,
0.05154925957322121,
-0.11131228506565094,
0.1664174646139145,
0.06189742311835289,
0.05395786091685295,
-0.027731038630008698,
0.07101820409297943,
-0.09502702206373215,
0.03608554229140282,
-0.005706233903765678,
-0.059728819876909256,
0.04490628093481064,
0.029308723285794258,
-0.10663705319166183,
0.19650940597057343,
0.05211018770933151,
0.02934236079454422,
-0.03807126358151436,
-0.19937604665756226,
-0.0966024175286293,
-0.04520842060446739,
0.05744932219386101,
0.08685653656721115,
0.10734018683433533,
-0.0027082611341029406,
0.11490101367235184,
-0.08545347303152084,
0.08013173192739487,
0.11798471957445145,
-0.20020011067390442,
-0.07055601477622986,
0.11098956316709518,
0.0806947723031044,
0.06694197654724121,
0.022497639060020447,
0.06815196573734283,
0.05714432895183563,
0.09483404457569122,
0.008873038925230503,
-0.04264804348349571,
-0.055968597531318665,
-0.006464680656790733,
-0.08316292613744736,
-0.05609384551644325,
0.22048179805278778,
-0.03545891493558884,
-0.017694808542728424,
0.03319737687706947,
0.03440096974372864,
0.07103817909955978,
0.021885570138692856,
-0.06658612936735153,
-0.03346337378025055,
0.005062386393547058,
-0.024175919592380524,
-0.03722340241074562,
-0.09928219020366669,
-0.0707927867770195,
-0.13502207398414612,
0.12611716985702515,
0.046587225049734116,
0.06697224080562592,
-0.1263752430677414,
0.09773869067430496,
-0.1029699370265007,
-0.06150469556450844,
0.06558605283498764,
0.023507846519351006,
0.09670314192771912,
-0.041180383414030075,
-0.025059007108211517,
-0.09310369938611984,
0.005303225014358759,
0.18203352391719818,
0.026790466159582138,
-0.01606292650103569,
0.07343951612710953,
0.08303704112768173,
-0.006894142832607031,
0.06314250081777573,
0.020019102841615677,
-0.05950099974870682,
0.05559372901916504,
-0.016041090711951256,
-0.012498850002884865,
0.015509597025811672,
-0.19473737478256226,
-0.056962959468364716,
0.05948060005903244,
0.02513836696743965,
-0.020204227417707443,
0.08013232797384262,
0.0036309685092419386,
-0.022725922986865044,
0.022809850051999092,
-0.07421932369470596,
-0.04988710582256317,
-0.10951261967420578,
-0.0812116265296936,
-0.06917592883110046,
0.10031526535749435,
-0.007645214442163706,
-0.08542584627866745,
0.007520532235503197,
-0.00703764520585537,
-0.024259857833385468,
-0.029197275638580322,
-0.08891844004392624,
0.046094074845314026,
-0.07650427520275116,
0.04167668893933296,
-0.21187227964401245,
-0.22838439047336578,
-0.054228536784648895,
0.06675919890403748,
-0.05371323972940445,
-0.04828457906842232,
-0.06588908284902573,
0.04274186119437218,
-0.051885850727558136,
-0.022860197350382805,
0.048164527863264084,
-0.06070676073431969,
0.06515930593013763,
-0.020986108109354973,
0.09682987630367279,
-0.12307647615671158,
0.03345521539449692,
-0.08531920611858368,
-0.030157502740621567,
-0.19152642786502838,
0.023938460275530815,
0.014133390039205551,
0.014708403497934341,
-0.08780285716056824,
-0.0301017165184021,
-0.08947820216417313,
0.07411212474107742,
0.07588419318199158,
0.19488798081874847,
-0.13615168631076813,
-0.10024294257164001,
0.2224573791027069,
-0.0878649353981018,
-0.1198144480586052,
0.12751854956150055,
-0.05779920145869255,
0.20510928332805634,
0.13195976614952087,
0.16636429727077484,
-0.05885346606373787,
-0.07378659397363663,
0.037063661962747574,
-0.0015542813343927264,
-0.018562670797109604,
-0.10731831192970276,
0.05802760645747185,
0.0035927630960941315,
-0.027297599241137505,
0.0491328239440918,
-0.0030817140359431505,
0.07709945738315582,
-0.08686742186546326,
-0.0477847084403038,
0.035059526562690735,
-0.05967967212200165,
-0.01952201873064041,
0.01651271991431713,
0.14232003688812256,
-0.07370200753211975,
-0.05213563144207001,
0.011503095738589764,
0.05230892822146416,
-0.01413820218294859,
0.02387845143675804,
-0.07501812279224396,
0.1516268402338028,
-0.06082091107964516,
-0.004278830252587795,
-0.11940295249223709,
0.08141838014125824,
-0.031763169914484024,
0.11514438688755035,
0.09868159145116806,
0.013623077422380447,
0.09979227930307388,
0.0011613992974162102,
-0.09118802100419998,
0.040424227714538574,
0.03535253554582596,
-0.018719013780355453,
-0.1078483909368515,
-0.14582352340221405,
0.034220799803733826,
-0.017612293362617493,
-0.0044533442705869675,
-0.18872271478176117,
0.03959311544895172,
-0.09493958950042725,
0.09856433421373367,
-0.02050146833062172,
0.0070550888776779175,
-0.09010373800992966,
0.04071701318025589,
-0.05353726074099541,
-0.034731674939394,
0.053608134388923645,
0.004037107806652784,
-0.04182945191860199,
0.09835154563188553,
-0.10746755450963974,
0.17379239201545715,
0.16095206141471863,
-0.1847955584526062,
-0.11305342614650726,
-0.013273490592837334,
-0.0174568984657526,
-0.02782757766544819,
-0.07062740623950958,
0.005523039028048515,
0.19654446840286255,
-0.04537631571292877,
0.14282415807247162,
-0.056686852127313614,
0.006489911116659641,
0.05618606507778168,
-0.12431953847408295,
0.022573307156562805,
0.12821470201015472,
0.10309018194675446,
-0.07057856768369675,
0.09420376271009445,
0.03887231647968292,
-0.102435402572155,
0.17686258256435394,
0.047814201563596725,
-0.04694075882434845,
-0.037926677614450455,
-0.061089951545000076,
0.05504676327109337,
0.09111976623535156,
-0.17887713015079498,
-0.06146711856126785,
-0.008547087199985981,
0.03525160998106003,
0.08423181623220444,
-0.14474137127399445,
-0.07003733515739441,
0.04021143913269043,
-0.027039967477321625,
-0.006942493841052055,
0.03599594160914421,
-0.09032254666090012,
0.12124497443437576,
0.01405268907546997,
-0.09481032937765121,
0.01747909188270569,
0.00998938549309969,
-0.11213359236717224,
0.19040434062480927,
-0.029196279123425484,
-0.21001996099948883,
-0.1719651222229004,
-0.02519676834344864,
0.059788141399621964,
0.09421422332525253,
0.026059992611408234,
-0.08171454071998596,
0.03199189901351929,
0.03221555054187775,
0.14275115728378296,
-0.032310206443071365,
0.0058541265316307545,
-0.10114509612321854,
0.050160184502601624,
-0.04697796329855919,
-0.0387604683637619,
-0.08310634642839432,
-0.1102941632270813,
-0.16346780955791473,
0.009117974899709225,
-0.2362338900566101,
0.047856662422418594,
0.10673055052757263,
-0.05207204818725586,
0.05529553070664406,
-0.06004796549677849,
0.17552028596401215,
-0.10579226166009903,
0.023084772750735283,
0.1825501173734665,
-0.003445890499278903,
-0.01295517198741436,
0.030050775036215782,
0.009183328598737717,
-0.05199987813830376,
0.009885150007903576,
-0.04650352895259857,
-0.1318466067314148,
-0.26965582370758057,
-0.08915358781814575,
-0.03771403804421425,
0.0548548549413681,
0.05895920470356941,
0.03296991437673569,
0.11777721345424652,
0.10460925102233887,
0.050814464688301086,
0.1152639091014862,
0.02448582462966442,
0.029606392607092857,
0.06490468233823776,
0.015896055847406387,
0.11287220567464828,
-0.0403812974691391,
-0.06492333859205246,
0.09601710736751556,
0.06971264630556107,
0.18267513811588287,
0.053731519728899,
-0.05963701009750366,
0.040223076939582825,
0.04931677132844925,
-0.005857787095010281,
0.04662554711103439,
0.0524006262421608,
-0.027303995564579964,
-0.034018464386463165,
-0.018328117206692696,
-0.0846041664481163,
0.05687086284160614,
-0.032192666083574295,
-0.030139800161123276,
-0.13547244668006897,
0.027558252215385437,
0.03784159943461418,
0.10242976248264313,
0.011595383286476135,
-0.28350183367729187,
-0.10385660082101822,
0.012659191153943539,
-0.03609849512577057,
-0.026511531323194504,
0.052233416587114334,
0.06164385750889778,
-0.16766725480556488,
0.006877460982650518,
-0.012154117226600647,
0.0899340808391571,
-0.16164512932300568,
0.02609034813940525,
-0.05102372542023659,
-0.046345409005880356,
-0.011238442733883858,
0.08935441076755524,
-0.2718697488307953,
0.32752642035484314,
-0.020866943523287773,
0.10383973270654678,
-0.08677731454372406,
-0.009621309116482735,
0.05185560882091522,
0.0758836567401886,
0.17330938577651978,
0.022609077394008636,
-0.019385170191526413,
-0.12030158936977386,
-0.08465320616960526,
0.04251144826412201,
-0.002068710047751665,
-0.10454951226711273,
0.001573015353642404,
-0.005531315226107836,
0.046904608607292175,
0.015207227319478989,
0.08942385762929916,
-0.057225704193115234,
-0.07533689588308334,
-0.016816839575767517,
-0.034226905554533005,
-0.10046100616455078,
-0.01635408215224743,
-0.14877521991729736,
-0.05838082358241081,
0.13754390180110931,
0.051205556839704514,
-0.028273029252886772,
-0.05014517530798912,
0.055146194994449615,
-0.03145899623632431,
-0.1148005872964859,
0.0376732237637043,
0.020884616300463676,
-0.04021408036351204,
-0.07475581765174866,
-0.05945250019431114,
0.1395910382270813,
-0.11179069429636002,
-0.011851644143462181,
-0.07699120789766312,
0.04473847895860672,
0.1010679081082344,
0.04430025815963745,
0.06559201329946518,
-0.001984209753572941,
0.011335384100675583,
-0.020423879846930504,
0.031186005100607872,
-0.008004363626241684,
0.06835287064313889,
0.03550650179386139,
-0.08118505030870438,
-0.07381310313940048,
-0.05815201997756958,
-0.09323559701442719,
0.1389615684747696,
0.14351114630699158,
-0.05362314730882645,
-0.012190952897071838,
0.09529992192983627,
-0.15259455144405365,
-0.25889211893081665,
0.018827276304364204,
0.017223317176103592,
0.08007445931434631,
-0.01413913257420063,
-0.17789433896541595,
0.06031796708703041,
0.1164398193359375,
-0.03715655952692032,
-0.03458276763558388,
-0.21118959784507751,
-0.13383260369300842,
0.1896689534187317,
0.06535044312477112,
0.3485236167907715,
-0.10606507211923599,
-0.026991665363311768,
-0.05561548471450806,
-0.06991926580667496,
0.040366772562265396,
-0.04407123476266861,
0.09063868224620819,
-0.01846793293952942,
0.026690639555454254,
-0.003610332030802965,
-0.0033714103046804667,
0.10453850775957108,
0.07876316457986832,
0.033760055899620056,
-0.1016453206539154,
-0.14462201297283173,
0.07084657996892929,
-0.022086793556809425,
0.05158328264951706,
0.05630727484822273,
-0.01960393227636814,
-0.156705841422081,
-0.054731883108615875,
-0.07786135375499725,
0.04873378574848175,
0.020005982369184494,
-0.08195993304252625,
0.03983839228749275,
0.06783214956521988,
-0.012010092847049236,
0.030072825029492378,
0.11235826462507248,
-0.07567206770181656,
0.22058825194835663,
0.0644107311964035,
0.15534217655658722,
0.027860889211297035,
-0.009396921843290329,
0.0066220201551914215,
-0.06774922460317612,
0.04512179270386696,
-0.09674686938524246,
-0.015384235419332981,
0.09374570846557617,
-0.026424162089824677,
0.09870077669620514,
0.04920993372797966,
-0.0761537179350853,
0.021033329889178276,
0.13097447156906128,
-0.13724133372306824,
-0.04813213646411896,
-0.04454313963651657,
-0.011408255435526371,
0.02832202799618244,
0.08697371929883957,
0.12676666676998138,
-0.09376541525125504,
-0.04977334290742874,
-0.03491325303912163,
-0.025754641741514206,
-0.107518769800663,
0.17055048048496246,
0.0943693146109581,
-0.024935437366366386,
-0.05121595785021782,
0.07589702308177948,
0.04721635580062866,
-0.017824623733758926,
0.06431546807289124,
0.06741983443498611,
-0.04779835045337677,
-0.04921393096446991,
0.022561363875865936,
0.15776574611663818,
-0.13785332441329956,
-0.0914236530661583,
-0.14347511529922485,
-0.07159807533025742,
0.047991566359996796,
0.12570339441299438,
0.13625842332839966,
0.000563155161216855,
-0.03360865265130997,
0.0368439219892025,
-0.06623920798301697,
0.01770555041730404,
0.14837877452373505,
-0.018570076674222946,
0.001170105766505003,
0.15261320769786835,
0.02464957721531391,
0.061207931488752365,
-0.07068558782339096,
-0.037118300795555115,
-0.1568676233291626,
0.07699607312679291,
-0.18080471456050873,
0.013662275858223438,
-0.05056073144078255,
-0.044279541820287704,
-0.018650714308023453,
-0.04276604205369949,
-0.051932331174612045,
-0.00002094213414238766,
-0.10024500638246536,
0.05352674424648285,
-0.0048148855566978455,
0.051212504506111145,
-0.02187184989452362,
-0.03155529871582985,
0.02508845366537571,
0.0011874907650053501,
0.058378107845783234,
0.099512979388237,
0.02351243980228901,
0.0782073363661766,
-0.12410357594490051,
0.0638243705034256,
0.04978662729263306,
0.01645704172551632,
0.02653413452208042,
0.000796258042100817,
0.04200148954987526,
0.03240354359149933,
0.056027334183454514,
0.07400697469711304,
0.010920951142907143,
-0.07074052095413208,
0.08010982722043991,
0.04434271156787872,
-0.08111504465341568,
-0.026092417538166046,
0.10215088725090027,
0.008811208419501781,
0.0536501482129097,
0.002910657087340951,
-0.11312448978424072,
0.03418135270476341,
-0.16960643231868744,
0.03339337557554245,
-0.019983090460300446,
-0.1120210736989975,
-0.08220315724611282,
-0.039392467588186264,
0.06524496525526047,
0.026510227471590042,
0.18348848819732666,
0.16681745648384094,
0.025856448337435722,
0.016370929777622223,
0.09262850135564804,
0.06495263427495956,
-0.023653030395507812,
0.0331321582198143,
0.02915269136428833,
-0.022684548050165176,
-0.10411950200796127,
0.0834105908870697,
0.04698226600885391,
0.05280189961194992,
0.0742856040596962,
0.046115364879369736,
0.06541071087121964,
0.08613119274377823,
0.05745449289679527,
0.05272362381219864,
-0.05828458070755005,
-0.15808826684951782,
-0.03322947770357132,
0.04478580504655838,
-0.0013883351348340511,
0.014798242598772049,
0.13748782873153687,
-0.12211084365844727,
0.01580212451517582,
-0.030643748119473457,
-0.07354072481393814,
-0.1829398274421692,
-0.3067367672920227,
-0.03797687962651253,
-0.04651077091693878,
0.0046662138774991035,
-0.09713663905858994,
-0.05565265193581581,
0.17411556839942932,
0.03882144019007683,
-0.07758509367704391,
0.038759879767894745,
-0.01827266812324524,
-0.051929865032434464,
0.09735669940710068,
-0.04567582905292511,
0.07757291197776794,
-0.15726004540920258,
-0.03840281814336777,
-0.09165693074464798,
0.02752061001956463,
-0.004487065132707357,
-0.029262922704219818,
-0.008072707802057266,
-0.0037156343460083008,
-0.0066175321117043495,
-0.03735162317752838,
-0.03521566838026047,
-0.01374518871307373,
0.012391293421387672,
0.14267617464065552,
0.020544353872537613,
-0.01622163876891136,
0.05878137797117233,
0.22780568897724152,
-0.006862669251859188,
-0.039201948791742325,
-0.10943389683961868,
0.244988813996315,
0.026928644627332687,
0.06583227217197418,
-0.02575998567044735,
-0.011728759855031967,
-0.006784580647945404,
0.3193856477737427,
0.3446618914604187,
0.002430843422189355,
-0.0059226443991065025,
0.06977199018001556,
0.010934065096080303,
0.06143336743116379,
0.09643703699111938,
0.010563593357801437,
0.1476198434829712,
-0.11022627353668213,
-0.0170647781342268,
-0.034920766949653625,
0.010368322022259235,
0.038451697677373886,
0.01840853877365589,
0.018143082037568092,
0.0016699816333130002,
-0.037883006036281586,
0.0976414754986763,
-0.16697664558887482,
-0.12385427951812744,
-0.07995230704545975,
-0.20716053247451782,
-0.15940135717391968,
-0.040460020303726196,
-0.007463681045919657,
0.06792039424180984,
0.15391667187213898,
-0.019216734915971756,
0.006530435755848885,
0.0037618065252900124,
0.01985233463346958,
-0.13458296656608582,
-0.1379932165145874,
0.12368974089622498,
0.051994744688272476,
0.14360204339027405,
-0.05169593170285225,
0.09059491753578186,
0.08329199254512787,
0.05139197036623955,
0.05201157554984093,
0.04997915402054787,
-0.0007961545488797128,
0.13508076965808868,
0.0017019747756421566,
-0.01679808646440506,
-0.05583924800157547,
-0.014175977557897568,
0.05211733281612396,
-0.11264889687299728,
0.04997242987155914,
-0.08146698027849197,
-0.10554865002632141,
-0.07513494789600372,
0.07047101855278015,
-0.08274850249290466,
0.07528989017009735,
0.17009998857975006,
-0.00927093904465437,
-0.02244902402162552,
-0.05346881225705147,
0.02123180404305458,
0.10179999470710754,
-0.05371972173452377,
-0.05770699679851532,
-0.15096984803676605,
-0.07200209051370621,
-0.13402605056762695,
0.0012651820434257388,
-0.19960923492908478,
-0.05716116726398468,
-0.10305464267730713,
-0.11019659787416458,
-0.08720497041940689,
0.06908658146858215,
0.07859017699956894,
0.04179529845714569,
-0.03521871939301491,
0.05456873029470444,
0.010745909065008163,
0.06597737967967987,
-0.1114521399140358,
-0.12832637131214142
] |
null | null |
transformers
|
hello
|
{}
|
feature-extraction
|
geekfeed/gpt2_ja
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"feature-extraction",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #gpt2 #feature-extraction #endpoints_compatible #text-generation-inference #region-us
|
hello
|
[] |
[
"TAGS\n#transformers #pytorch #jax #gpt2 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
43
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt2 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.03398109972476959,
0.018888965249061584,
-0.007880806922912598,
0.02206742763519287,
0.1638207882642746,
0.037731513381004333,
0.014167925342917442,
0.15188084542751312,
0.02111278846859932,
-0.01889481209218502,
0.12038586288690567,
0.2157774418592453,
0.007423579227179289,
0.026820959523320198,
-0.04919186979532242,
-0.2778262794017792,
0.07768701016902924,
0.07007818669080734,
-0.015200494788587093,
0.11311005800962448,
0.06332482397556305,
-0.061867669224739075,
0.08461543172597885,
-0.0075836158357560635,
-0.16446669399738312,
0.02983976900577545,
0.03459633141756058,
-0.09017550945281982,
0.09742081165313721,
0.031538162380456924,
0.08061277121305466,
-0.0030193328857421875,
-0.09619250148534775,
-0.1701367348432541,
0.03636811301112175,
0.01950777694582939,
-0.07170717418193817,
0.05267986282706261,
0.09415104985237122,
-0.11167719960212708,
0.10849305242300034,
0.0750725269317627,
-0.037247199565172195,
0.03811614215373993,
-0.17764493823051453,
-0.15073971450328827,
-0.04758445918560028,
0.0533391609787941,
0.017745453864336014,
0.11936510354280472,
-0.012455670163035393,
0.07070692628622055,
-0.11889233440160751,
0.08903111517429352,
0.22376300394535065,
-0.30080661177635193,
0.0018852984067052603,
0.07588654011487961,
0.10191844403743744,
0.03295342996716499,
-0.03178124129772186,
0.040101271122694016,
-0.003247296903282404,
0.013320171274244785,
0.011266719549894333,
-0.08773378282785416,
-0.11182939261198044,
0.07106613367795944,
-0.09017190337181091,
-0.10721877217292786,
0.21817174553871155,
-0.03244520351290703,
0.05500418692827225,
-0.014300752431154251,
-0.09267299622297287,
-0.04172993078827858,
-0.03811154142022133,
0.030281148850917816,
-0.03982057794928551,
0.09436848014593124,
0.04030688852071762,
-0.09127417206764221,
-0.11697132140398026,
-0.04788464680314064,
-0.1479818969964981,
0.16025954484939575,
0.012929270043969154,
0.08378656208515167,
-0.20367296040058136,
0.09093020111322403,
-0.06377104669809341,
-0.09114819020032883,
0.0008944225264713168,
-0.09258513897657394,
0.05764925107359886,
0.030522892251610756,
-0.05159629508852959,
-0.058944087475538254,
0.08484308421611786,
0.08071741461753845,
-0.08752124011516571,
0.019243856891989708,
-0.027394913136959076,
0.12002275139093399,
0.034472789615392685,
0.10747028142213821,
-0.014223221689462662,
0.012611130252480507,
0.028984911739826202,
-0.1251607984304428,
-0.019202345982193947,
-0.06676150113344193,
-0.13292451202869415,
-0.043604638427495956,
0.044865842908620834,
0.08819202333688736,
0.03449966385960579,
0.056729692965745926,
-0.06338269263505936,
-0.03321978077292442,
0.0426153689622879,
-0.06894927471876144,
-0.010898982174694538,
0.013982950709760189,
0.02018221653997898,
0.13711854815483093,
-0.014758848585188389,
-0.008163164369761944,
-0.13247188925743103,
0.01187373697757721,
-0.09224829077720642,
0.04010491073131561,
-0.04569730535149574,
-0.04844955727458,
0.029080700129270554,
-0.11786365509033203,
0.015217216685414314,
-0.12803979218006134,
-0.19048921763896942,
0.021529939025640488,
0.022580565884709358,
-0.016545366495847702,
-0.007920068688690662,
-0.03488989919424057,
-0.048507750034332275,
0.03022949770092964,
-0.06396739929914474,
-0.04076502099633217,
-0.07272981107234955,
0.11143545061349869,
-0.045736029744148254,
0.057781461626291275,
-0.1116175726056099,
0.08935883641242981,
-0.12268462032079697,
0.02292976900935173,
-0.13628758490085602,
0.07401396334171295,
0.0003269376466050744,
0.1265856772661209,
-0.030688438564538956,
-0.05801154300570488,
-0.10415884852409363,
0.04890894889831543,
-0.05099237710237503,
0.18874415755271912,
-0.07314716279506683,
-0.132068932056427,
0.2825451195240021,
-0.08712209761142731,
-0.1619558483362198,
0.0821814239025116,
0.002288002287968993,
0.0011562600266188383,
0.07734411209821701,
0.26328665018081665,
0.04767141491174698,
0.012914770282804966,
0.05733843892812729,
0.14165394008159637,
-0.11889725178480148,
-0.08776232600212097,
0.04592924565076828,
-0.05467674508690834,
-0.06730940192937851,
0.03713081777095795,
0.008485463447868824,
0.08996264636516571,
-0.03478427976369858,
-0.03712758049368858,
-0.03343157842755318,
0.02333451248705387,
0.056347258388996124,
0.019738823175430298,
0.10667337477207184,
-0.026125825941562653,
-0.02497928962111473,
-0.034567736089229584,
-0.011004184372723103,
-0.059423934668302536,
0.053771961480379105,
-0.030248109251260757,
0.1906294971704483,
-0.06823684275150299,
0.05094831436872482,
-0.22435173392295837,
-0.06820753216743469,
-0.004211461171507835,
0.07073214650154114,
-0.039220795035362244,
0.0832938551902771,
0.06353346258401871,
-0.058927517384290695,
0.013046800158917904,
-0.01576404459774494,
0.06569082289934158,
-0.028422480449080467,
-0.024334972724318504,
-0.052377067506313324,
0.027739550918340683,
-0.061196014285087585,
-0.053170040249824524,
-0.03612736985087395,
0.010731925256550312,
0.07667560130357742,
0.05965448543429375,
0.007638424169272184,
0.0006284333067014813,
-0.015595687553286552,
0.007677802350372076,
-0.036700669676065445,
-0.012013350613415241,
0.0970931351184845,
-0.016269613057374954,
-0.05795665457844734,
0.21139363944530487,
-0.11804867535829544,
0.229364275932312,
0.1870921552181244,
-0.2910948693752289,
0.03751945123076439,
-0.06085366755723953,
-0.029841719195246696,
0.04603089764714241,
0.055164728313684464,
-0.03180317208170891,
0.15262430906295776,
0.006412658374756575,
0.1575840264558792,
-0.07221928238868713,
-0.04858708009123802,
-0.0020706183277070522,
-0.022634578868746758,
-0.005056939087808132,
0.06708059459924698,
0.10437332093715668,
-0.13821543753147125,
0.18204782903194427,
0.21798604726791382,
0.03703448921442032,
0.17130611836910248,
-0.03605227544903755,
-0.049429163336753845,
0.06596335768699646,
0.030548300594091415,
-0.0606444887816906,
-0.0224105603992939,
-0.2642643451690674,
-0.033112429082393646,
0.07041537761688232,
0.06370636820793152,
0.12677060067653656,
-0.12888126075267792,
-0.058353882282972336,
-0.010310319252312183,
-0.04592818394303322,
-0.08240234851837158,
0.06467947363853455,
0.058315861970186234,
0.09284047037363052,
0.020109934732317924,
0.019121259450912476,
0.10160421580076218,
0.011888816021382809,
-0.0834352895617485,
0.20919658243656158,
-0.10691818594932556,
-0.33072149753570557,
-0.11730197072029114,
-0.11191093176603317,
-0.018273120746016502,
0.035177819430828094,
0.10866014659404755,
-0.09546976536512375,
0.0023446951527148485,
0.008298582397401333,
0.09128335118293762,
-0.12794998288154602,
0.0025515230372548103,
-0.0684722512960434,
0.03973132371902466,
-0.08970900624990463,
-0.08900129795074463,
-0.06379032880067825,
-0.04358667880296707,
-0.05044618621468544,
0.10409896820783615,
-0.11525647342205048,
0.07005775719881058,
0.1644221395254135,
0.047608498483896255,
0.08080147951841354,
-0.03030022233724594,
0.18990203738212585,
-0.09846929460763931,
-0.0659480020403862,
0.2298540621995926,
-0.028370128944516182,
0.08424650132656097,
0.0628245621919632,
0.01597674936056137,
-0.07621397078037262,
-0.030295180156826973,
-0.0339762419462204,
-0.09855901449918747,
-0.2284165620803833,
-0.08408627659082413,
-0.14842261373996735,
0.07971551269292831,
0.0411498099565506,
0.062271494418382645,
0.1165519580245018,
0.06165260821580887,
-0.002714910078793764,
0.000054978627304080874,
0.02230384759604931,
0.061018384993076324,
0.15415942668914795,
-0.022541670128703117,
0.09027498215436935,
-0.04255199804902077,
-0.09858961403369904,
0.07433100789785385,
0.05764248967170715,
0.209857776761055,
0.04435478523373604,
0.07476995885372162,
0.03653637692332268,
0.10550344735383987,
0.11695322394371033,
0.12069892883300781,
-0.02038560062646866,
-0.012955543585121632,
-0.031216325238347054,
-0.01577240601181984,
-0.05866384506225586,
0.020419230684638023,
0.10001645237207413,
-0.12741778790950775,
-0.08219089359045029,
-0.15515148639678955,
0.11313740909099579,
0.0962664857506752,
0.04331975430250168,
-0.20465125143527985,
0.004914415068924427,
0.08205442875623703,
-0.012022766284644604,
-0.10245295614004135,
0.08327359706163406,
0.005149856675416231,
-0.14275036752223969,
0.030621670186519623,
-0.060265444219112396,
0.13418631255626678,
-0.057447563856840134,
0.06945997476577759,
-0.05253184959292412,
-0.09419374912977219,
0.035760682076215744,
0.11686872690916061,
-0.21788617968559265,
0.21613872051239014,
0.0026431537698954344,
-0.03310718387365341,
-0.08006592094898224,
0.019969036802649498,
0.015163341537117958,
0.10334154963493347,
0.16158930957317352,
0.019047625362873077,
-0.03775472193956375,
-0.07836414128541946,
0.009233962744474411,
0.049238868057727814,
0.12409283220767975,
-0.07148358970880508,
-0.010493466630578041,
-0.031920094043016434,
0.010391357354819775,
-0.045084841549396515,
0.009252982214093208,
0.0597233772277832,
-0.16483980417251587,
0.06186329945921898,
-0.03629123792052269,
0.09790697693824768,
0.0032208876218646765,
0.014882716350257397,
-0.03578269109129906,
0.2101571261882782,
-0.09025534987449646,
-0.10193339735269547,
-0.10476431995630264,
-0.04868541285395622,
0.07686401903629303,
-0.06545232981443405,
0.05183243378996849,
-0.058126915246248245,
0.005683800671249628,
-0.056190673261880875,
-0.21592795848846436,
0.12118171900510788,
-0.08577093482017517,
0.01022536214441061,
-0.011741469614207745,
0.19437716901302338,
-0.07446233928203583,
0.004525590687990189,
0.034225184470415115,
0.019535740837454796,
-0.11910305917263031,
-0.11822032928466797,
0.019935302436351776,
0.019267160445451736,
0.06433448195457458,
0.08427267521619797,
-0.05389643833041191,
0.03955468535423279,
-0.02731584571301937,
0.042394526302814484,
0.3187265992164612,
0.11196407675743103,
-0.0353592112660408,
0.16189156472682953,
0.09700864553451538,
-0.06664200127124786,
-0.28562334179878235,
-0.05837123095989227,
-0.10460895299911499,
-0.05291508510708809,
-0.04825713858008385,
-0.1972295343875885,
0.07408171147108078,
0.05559791252017021,
-0.003069472499191761,
0.16237546503543854,
-0.2998795211315155,
-0.045410264283418655,
0.09083554893732071,
0.0029976507648825645,
0.35047677159309387,
-0.13662292063236237,
-0.1086484044790268,
-0.009997251443564892,
-0.21793033182621002,
0.15688547492027283,
-0.05111567676067352,
0.10258376598358154,
-0.026647847145795822,
0.04984891042113304,
0.03732454776763916,
-0.06094549968838692,
0.10685287415981293,
0.03344356268644333,
0.025576556101441383,
-0.0670226588845253,
-0.05422757565975189,
0.07657074183225632,
0.002487692516297102,
0.030322445556521416,
-0.032831352204084396,
0.028334131464362144,
-0.17057624459266663,
-0.029128391295671463,
-0.12150215357542038,
0.055270057171583176,
0.052710097283124924,
-0.0523073710501194,
-0.007498493883758783,
-0.04918438941240311,
0.01702260784804821,
0.04285073280334473,
0.2606382668018341,
-0.046974942088127136,
0.137732595205307,
0.02088603377342224,
0.05576815456151962,
-0.14392180740833282,
-0.13369505107402802,
-0.0650082528591156,
-0.030967850238084793,
0.10930955410003662,
-0.16018564999103546,
0.06498073041439056,
0.10285113751888275,
-0.02876986563205719,
0.07274043560028076,
0.13819840550422668,
-0.00472244480624795,
0.01669573038816452,
0.112991563975811,
-0.2267450988292694,
-0.03186233714222908,
-0.07737206667661667,
-0.07555244117975235,
0.0703311413526535,
0.060993991792201996,
0.14077068865299225,
0.035168327391147614,
-0.040421005338430405,
-0.014719605445861816,
0.015234951861202717,
-0.06634876877069473,
0.043525055050849915,
0.010127517394721508,
0.026140758767724037,
-0.15715591609477997,
0.08541101217269897,
-0.005174722988158464,
-0.21036988496780396,
-0.005740928929299116,
0.13913199305534363,
-0.11823615431785583,
-0.11555139720439911,
-0.08236245065927505,
0.10451383888721466,
-0.11297609657049179,
-0.019400622695684433,
-0.05483614280819893,
-0.11052791029214859,
0.081906758248806,
0.12431976944208145,
0.06726083159446716,
0.10916067659854889,
-0.03168467432260513,
-0.026300154626369476,
-0.015285271219909191,
-0.03668741509318352,
-0.013741668313741684,
-0.008208153769373894,
-0.07847334444522858,
0.014990213327109814,
-0.008566166274249554,
0.16698555648326874,
-0.07026736438274384,
-0.0700295940041542,
-0.141659677028656,
0.04910919442772865,
-0.05881010368466377,
-0.04814119637012482,
-0.1198590025305748,
-0.05525754392147064,
-0.012653029523789883,
0.006583063397556543,
-0.04662506654858589,
-0.024548327550292015,
-0.1374368518590927,
0.0030091346707195044,
-0.052475202828645706,
0.008287256583571434,
-0.0802515521645546,
-0.004945927299559116,
0.0840594694018364,
-0.05329854041337967,
0.13421107828617096,
0.1870245784521103,
-0.06989633291959763,
0.14337709546089172,
-0.1629752367734909,
-0.1284065842628479,
0.11090626567602158,
0.013513484038412571,
0.022571967914700508,
0.08140571415424347,
0.04480253532528877,
0.0463559664785862,
-0.022272175177931786,
0.03338497504591942,
-0.05073343962430954,
-0.13974528014659882,
-0.004085231106728315,
-0.029691126197576523,
-0.1317378729581833,
-0.052524253726005554,
-0.029826022684574127,
0.07296694815158844,
0.04760735109448433,
0.10114012658596039,
-0.013234374113380909,
0.11676392704248428,
-0.062080830335617065,
0.015033978968858719,
0.015335427597165108,
-0.16618193686008453,
-0.006365602370351553,
-0.08356084674596786,
0.022899894043803215,
-0.007652601692825556,
0.2549842894077301,
0.038511525839567184,
0.01804349757730961,
0.004097143188118935,
0.057547327131032944,
0.081858329474926,
0.018365686759352684,
0.24671779572963715,
0.11142686009407043,
-0.07029286026954651,
-0.07353771477937698,
0.09077855199575424,
0.019628295674920082,
0.04568719491362572,
0.14286483824253082,
0.09640691429376602,
0.045025210827589035,
0.10824672132730484,
0.005120187532156706,
0.022424109280109406,
-0.07746665179729462,
-0.13389267027378082,
0.03348182886838913,
0.07298589497804642,
0.023791633546352386,
0.10110621899366379,
0.16689391434192657,
-0.033899758011102676,
0.07309817522764206,
0.021548748016357422,
-0.04892140254378319,
-0.14492210745811462,
-0.09998001903295517,
-0.049641843885183334,
-0.1533062905073166,
0.007883194833993912,
-0.12904952466487885,
0.03898053616285324,
0.11353705823421478,
0.036048900336027145,
-0.04310021549463272,
0.11286387592554092,
0.09136443585157394,
-0.12094449251890182,
0.091287761926651,
-0.0397527813911438,
0.04845031723380089,
0.03778262808918953,
-0.002082721097394824,
-0.0356530137360096,
-0.0725937932729721,
0.010915938764810562,
0.0591718889772892,
-0.052339401096105576,
0.03717206045985222,
-0.16546553373336792,
-0.11471116542816162,
-0.045449063181877136,
0.07239927351474762,
-0.07715584337711334,
0.13048553466796875,
0.01874645985662937,
-0.0397508442401886,
0.039839860051870346,
0.22882109880447388,
-0.06349016726016998,
0.013690697029232979,
-0.037353772670030594,
0.19437791407108307,
0.09217066317796707,
0.08464698493480682,
-0.006502367090433836,
-0.011976183392107487,
-0.03962603583931923,
0.302339106798172,
0.2769756615161896,
-0.05413399264216423,
0.021783562377095222,
0.0516604445874691,
0.03920990973711014,
0.12926049530506134,
0.09020253270864487,
0.10022936761379242,
0.2921079397201538,
-0.0817907303571701,
-0.03348951041698456,
-0.02699616551399231,
-0.015302911400794983,
-0.09093115478754044,
0.07280806452035904,
0.0739109143614769,
-0.08109041303396225,
-0.03497714549303055,
0.11526436358690262,
-0.2400989979505539,
0.12282750755548477,
-0.002933987881988287,
-0.1795569211244583,
-0.056720834225416183,
-0.04313552752137184,
0.10610103607177734,
0.03206976503133774,
0.11494471877813339,
-0.019204318523406982,
-0.1244814544916153,
0.04311351478099823,
0.05702316388487816,
-0.2793540060520172,
-0.059641994535923004,
0.06888436526060104,
-0.015986070036888123,
0.014617225155234337,
-0.02941036783158779,
0.017946835607290268,
0.059734512120485306,
0.05357978865504265,
-0.018864421173930168,
0.02963065542280674,
0.0022093323059380054,
-0.03737632557749748,
-0.02556825615465641,
0.054018568247556686,
0.008156261406838894,
-0.12621402740478516,
0.0650884360074997,
-0.12567831575870514,
0.025649594143033028,
-0.03809667006134987,
-0.035866763442754745,
-0.011961621232330799,
-0.0526297464966774,
-0.06428948789834976,
0.03214455768465996,
0.09690435975790024,
0.0145519208163023,
-0.016329284757375717,
-0.0796203538775444,
-0.005787264090031385,
0.01997840218245983,
-0.05855453759431839,
-0.1362132430076599,
-0.08323870599269867,
-0.10483372211456299,
0.10783778131008148,
-0.03394462540745735,
-0.14413024485111237,
0.010058521293103695,
-0.058960650116205215,
0.06401345878839493,
-0.1452757716178894,
0.06445631384849548,
0.057790156453847885,
0.025990890339016914,
0.005028141662478447,
-0.04339994862675667,
0.06329231709241867,
0.09186097234487534,
-0.10053910315036774,
-0.0816805437207222
] |
null | null | null |
https://dl.fbaipublicfiles.com/avhubert/model/lrs3_vox/vsr/base_vox_433h.pt
|
{}
| null |
g30rv17ys/avhubert
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
URL
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
[
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null |
transformers
|
# Please use 'Bert' related functions to load this model!
## Chinese BERT with Whole Word Masking Fix MLM Parameters
Parameters are initialized from https://huggingface.co/hfl/chinese-roberta-wwm-ext-large.
Missing MLM parameters issue: https://github.com/ymcui/Chinese-BERT-wwm/issues/98
Only the MLM parameters are trained; all other parameters are frozen.
More info on GitHub: https://github.com/genggui001/chinese_roberta_wwm_large_ext_fix_mlm
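A minimal sketch of loading this checkpoint with the BERT classes, as the note above requests, and running a single fill-mask prediction. The example sentence and the printed output are illustrative only and are not taken from the original card.
```python
# Minimal sketch: load the checkpoint with the standard BERT classes and
# predict the token at a single [MASK] position.
import torch
from transformers import BertTokenizer, BertForMaskedLM

model_id = "genggui001/chinese_roberta_wwm_large_ext_fix_mlm"
tokenizer = BertTokenizer.from_pretrained(model_id)
model = BertForMaskedLM.from_pretrained(model_id)

# Illustrative sentence (not from the card): "The capital of China is [MASK]jing."
text = "中国的首都是[MASK]京。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and print the highest-scoring token for it.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```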
|
{"language": ["zh"], "license": "apache-2.0", "tags": ["bert"]}
|
fill-mask
|
genggui001/chinese_roberta_wwm_large_ext_fix_mlm
|
[
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"zh",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"zh"
] |
TAGS
#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #zh #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# Please use 'Bert' related functions to load this model!
## Chinese BERT with Whole Word Masking Fix MLM Parameters
Init parameters by URL
miss mlm parameters issue URL
Only train MLM parameters and freeze other parameters
More info in github URL
|
[
"# Please use 'Bert' related functions to load this model!",
"## Chinese BERT with Whole Word Masking Fix MLM Parameters\n\nInit parameters by URL\n\nmiss mlm parameters issue URL\n\nOnly train MLM parameters and freeze other parameters\n\nMore info in github URL"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #zh #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Please use 'Bert' related functions to load this model!",
"## Chinese BERT with Whole Word Masking Fix MLM Parameters\n\nInit parameters by URL\n\nmiss mlm parameters issue URL\n\nOnly train MLM parameters and freeze other parameters\n\nMore info in github URL"
] |
[
57,
15,
47
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #safetensors #bert #fill-mask #zh #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# Please use 'Bert' related functions to load this model!## Chinese BERT with Whole Word Masking Fix MLM Parameters\n\nInit parameters by URL\n\nmiss mlm parameters issue URL\n\nOnly train MLM parameters and freeze other parameters\n\nMore info in github URL"
] |
[
-0.09657584130764008,
0.014937174506485462,
-0.00013858656166121364,
0.08321996033191681,
0.15014703571796417,
-0.0393122173845768,
0.08162843436002731,
0.054948292672634125,
0.06704584509134293,
-0.027084313333034515,
0.11910072714090347,
0.10603979229927063,
0.03329062834382057,
0.10985727608203888,
-0.026906006038188934,
-0.21341829001903534,
0.05269964784383774,
0.018591051921248436,
-0.06543781608343124,
0.07530587911605835,
0.07597199082374573,
-0.05097714066505432,
0.16944816708564758,
0.05451894924044609,
-0.040385652333498,
0.01364955399185419,
0.0013797236606478691,
-0.10320337116718292,
0.0682741329073906,
0.05931353196501732,
0.005219019018113613,
0.014095436781644821,
0.0154954195022583,
-0.06991001218557358,
0.043227534741163254,
-0.008759918622672558,
-0.02268681488931179,
0.05923936143517494,
0.09599623829126358,
0.045343391597270966,
-0.04179689288139343,
-0.05975354462862015,
-0.04345576465129852,
0.05105319619178772,
-0.02177104540169239,
0.06791005283594131,
-0.09891175478696823,
0.04406121000647545,
0.1063317358493805,
0.0848664790391922,
0.02165343426167965,
0.24182720482349396,
-0.06952402740716934,
0.08407330513000488,
0.23103687167167664,
-0.3744494915008545,
0.010536257177591324,
0.1440165638923645,
0.03856809064745903,
-0.13306476175785065,
-0.012201428413391113,
0.10887439548969269,
0.04257219284772873,
-0.0007271781214512885,
0.0792469009757042,
-0.07259795814752579,
0.0717727467417717,
-0.004769572056829929,
-0.129052072763443,
0.027090078219771385,
0.11787810176610947,
0.007888921536505222,
-0.05261492356657982,
-0.03233960270881653,
-0.016248973086476326,
0.11521845310926437,
-0.04842352494597435,
0.03356530889868736,
0.006432632450014353,
0.03651784360408783,
-0.0451216995716095,
-0.06104887276887894,
-0.05135304480791092,
-0.01382110733538866,
-0.10834715515375137,
0.1362115740776062,
0.04441653564572334,
0.024125637486577034,
-0.14768296480178833,
-0.0064612808637320995,
-0.13584686815738678,
-0.1342565268278122,
-0.03144511207938194,
-0.05624491348862648,
0.0484783910214901,
-0.03476090729236603,
-0.05885599926114082,
-0.0820905789732933,
0.11320488154888153,
0.017868392169475555,
-0.01351193618029356,
0.04466462880373001,
0.04596681892871857,
0.035261910408735275,
0.047691237181425095,
0.09518437087535858,
-0.16620521247386932,
-0.027252966538071632,
0.15670780837535858,
-0.07818544656038284,
0.0343736857175827,
-0.021738355979323387,
-0.09869406372308731,
-0.01179395243525505,
0.03278611600399017,
0.08385586738586426,
0.00047059825737960637,
0.132233127951622,
-0.021631497889757156,
-0.018599966540932655,
0.05459883064031601,
-0.10020039975643158,
-0.00593297416344285,
-0.02454741671681404,
-0.024639086797833443,
-0.0346841886639595,
0.059278473258018494,
0.017467226833105087,
-0.06811930239200592,
0.09916167706251144,
-0.11322375386953354,
-0.011369609273970127,
-0.03289690613746643,
-0.08260856568813324,
0.08294074982404709,
-0.0881267860531807,
0.060903679579496384,
-0.16503678262233734,
-0.17898309230804443,
0.0895274356007576,
0.048101671040058136,
-0.006561225280165672,
0.014372190460562706,
0.06977656483650208,
-0.04978403076529503,
-0.004421835765242577,
-0.015038857236504555,
0.06074687838554382,
-0.0529966838657856,
0.02100421115756035,
0.003520377678796649,
0.05934092402458191,
-0.08611361682415009,
0.02704494260251522,
-0.09592200070619583,
0.04376194626092911,
-0.0912327989935875,
0.020364562049508095,
-0.0290698129683733,
0.10608115792274475,
-0.10675112903118134,
-0.026729905977845192,
0.060970108956098557,
0.006562802940607071,
0.08219712972640991,
0.20662221312522888,
-0.09835507720708847,
-0.03967885673046112,
0.1260351538658142,
-0.08784148097038269,
-0.13953576982021332,
0.10648848861455917,
-0.0013129723956808448,
-0.05531037226319313,
0.08981325477361679,
0.11516550928354263,
-0.08783110976219177,
-0.178773894906044,
-0.009696725755929947,
0.0651983916759491,
-0.06539896875619888,
-0.10450460761785507,
0.08717894554138184,
0.026711633428931236,
-0.2348553091287613,
0.04726128280162811,
0.0034956110175698996,
0.07881040126085281,
-0.053008005023002625,
-0.06201741099357605,
-0.04136822372674942,
-0.07276356965303421,
-0.003843032754957676,
-0.05711536481976509,
0.07059638947248459,
-0.029015861451625824,
-0.004217176698148251,
0.04702858626842499,
0.04123968631029129,
0.04633979871869087,
0.002895824844017625,
-0.11001784354448318,
0.044920001178979874,
-0.09260254353284836,
0.07035883516073227,
-0.10141842067241669,
-0.09955228865146637,
-0.004276891238987446,
-0.025882236659526825,
0.06495313346385956,
-0.0626540556550026,
0.0848526656627655,
0.06739258766174316,
-0.03648288920521736,
0.012910221703350544,
0.11308980733156204,
0.03756696358323097,
-0.09409507364034653,
-0.17028458416461945,
-0.019926967099308968,
-0.039917293936014175,
-0.015552055090665817,
0.02633637748658657,
0.0021314111072570086,
0.054443467408418655,
0.05815456062555313,
-0.0024851092603057623,
0.004598981235176325,
0.10433535277843475,
-0.0037253363989293575,
-0.02199205383658409,
-0.020817961543798447,
0.042670223861932755,
0.03093715012073517,
-0.10916714370250702,
0.20050419867038727,
-0.03485068306326866,
0.09129111468791962,
0.18241846561431885,
0.016789333894848824,
-0.0065140267834067345,
-0.003171975491568446,
0.007918808609247208,
-0.013932783156633377,
-0.014281253330409527,
0.0030459112022072077,
0.11647859215736389,
-0.002153227338567376,
0.14734996855258942,
-0.07776414602994919,
-0.015068543143570423,
0.026773186400532722,
-0.09358105808496475,
-0.1281954050064087,
0.07532815635204315,
0.11339034140110016,
-0.1538490653038025,
0.12571106851100922,
0.12266530096530914,
-0.023012196645140648,
0.16680221259593964,
0.02258407138288021,
-0.000331343908328563,
-0.06264621019363403,
-0.029975419864058495,
0.030422374606132507,
0.0438104085624218,
-0.11906123906373978,
-0.004533152095973492,
0.04591444879770279,
-0.03269970789551735,
0.05462952330708504,
-0.07720143347978592,
-0.04159404709935188,
0.027881985530257225,
0.018351996317505836,
-0.05052776634693146,
0.07543611526489258,
-0.05312255024909973,
0.0569864921271801,
0.010376937687397003,
-0.03330453857779503,
0.07029958814382553,
0.018408408388495445,
-0.11016593873500824,
0.16448494791984558,
-0.12186060845851898,
-0.2916882634162903,
-0.11853484064340591,
-0.2282705008983612,
-0.14647558331489563,
-0.0024357924703508615,
0.05330474302172661,
-0.12379321455955505,
-0.11613146960735321,
-0.04923342540860176,
-0.011587300337851048,
0.009979442693293095,
0.07105941325426102,
0.041176605969667435,
-0.010192700661718845,
0.013676798902451992,
-0.09922710061073303,
-0.04163896292448044,
-0.0005062131094746292,
-0.017578886821866035,
0.1013030931353569,
-0.049197517335414886,
0.07584225386381149,
0.06461423635482788,
0.006407574284821749,
0.055948685854673386,
-0.003946491051465273,
0.23323370516300201,
0.0164374690502882,
0.033615097403526306,
0.18185213208198547,
-0.00605816813185811,
0.04988984018564224,
0.1058925986289978,
0.02615499496459961,
-0.06285348534584045,
0.04732730612158775,
-0.02244916372001171,
-0.10698965936899185,
-0.18591099977493286,
-0.09866407513618469,
-0.08991699665784836,
-0.005281220655888319,
0.019757602363824844,
0.04585054889321327,
0.00826955121010542,
0.08201119303703308,
0.019646404311060905,
0.040956348180770874,
-0.035128798335790634,
0.06984846293926239,
-0.04187596216797829,
0.018057331442832947,
0.05624953657388687,
-0.0009484196198172867,
-0.028722604736685753,
0.11862750351428986,
-0.015758488327264786,
0.11037003248929977,
0.02921174094080925,
0.03733447939157486,
0.07589203119277954,
0.11813602596521378,
0.0858585312962532,
0.1465580016374588,
-0.08089591562747955,
-0.040852807462215424,
-0.03133681043982506,
-0.12767134606838226,
-0.03569617494940758,
0.03927375748753548,
-0.012352222576737404,
-0.07179244607686996,
-0.07320445030927658,
0.07900427281856537,
0.034816939383745193,
0.1908089965581894,
0.021199822425842285,
-0.20195169746875763,
-0.027513660490512848,
0.022808300331234932,
0.014353970065712929,
-0.004562463145703077,
0.02287190407514572,
0.00913733709603548,
-0.09884940832853317,
0.11237946152687073,
-0.013692251406610012,
0.1191372200846672,
0.06637269258499146,
0.03599855676293373,
-0.08343961089849472,
0.09928236901760101,
-0.03383052721619606,
0.08230041712522507,
-0.25072821974754333,
0.20172743499279022,
0.057255376130342484,
0.014757406897842884,
-0.10358022153377533,
0.01609918847680092,
0.09989552944898605,
0.1245117112994194,
0.11491654813289642,
0.039160702377557755,
0.04344695433974266,
-0.14631804823875427,
-0.1258472502231598,
0.023477155715227127,
-0.014967991970479488,
0.058580897748470306,
0.013944987207651138,
-0.009836441837251186,
-0.06556038558483124,
-0.0006956707802601159,
0.131492480635643,
-0.1298029124736786,
-0.07058214396238327,
0.01593460515141487,
0.08771531283855438,
0.014849203638732433,
-0.06285537034273148,
-0.06265737116336823,
-0.06864885240793228,
0.22264865040779114,
0.10987504571676254,
-0.06584649533033371,
-0.14333780109882355,
-0.04313330724835396,
0.052552610635757446,
-0.1112402155995369,
0.033138468861579895,
-0.07859606295824051,
0.11317916214466095,
-0.07266855239868164,
-0.17086774110794067,
0.11132838577032089,
-0.15723413228988647,
0.01152857206761837,
0.0018006520112976432,
0.12783700227737427,
-0.041395965963602066,
0.03526819869875908,
0.07885471731424332,
-0.02445421740412712,
-0.0010166834108531475,
-0.10354658216238022,
-0.04059736058115959,
0.021842731162905693,
0.11801981180906296,
0.08382220566272736,
-0.156023308634758,
-0.10229721665382385,
0.005449859891086817,
-0.05099089816212654,
0.17416897416114807,
0.26149502396583557,
-0.02886013686656952,
0.08149582147598267,
0.2593616247177124,
0.004925131797790527,
-0.3048955798149109,
-0.05537409335374832,
-0.05234770476818085,
-0.009685220196843147,
0.0034958638716489077,
-0.0940445140004158,
0.1417676955461502,
0.00412795040756464,
-0.021359462291002274,
-0.0015660596545785666,
-0.08627034723758698,
-0.1308320164680481,
0.1403779834508896,
0.0812269076704979,
0.2960852384567261,
-0.1123242899775505,
-0.07330050319433212,
-0.10911885648965836,
-0.04832100495696068,
0.07413745671510696,
-0.06982095539569855,
0.11196524649858475,
0.022999970242381096,
0.039407841861248016,
0.012253115884959698,
-0.09796695411205292,
0.10493840277194977,
-0.019965212792158127,
0.017240770161151886,
-0.07372424751520157,
-0.046471260488033295,
0.017387544736266136,
-0.02774488739669323,
0.1640423685312271,
-0.12152901291847229,
0.05658172070980072,
-0.018506992608308792,
-0.04794571176171303,
-0.042601171880960464,
0.10532649606466293,
0.0175783671438694,
-0.0894927904009819,
-0.006873581558465958,
0.07142969965934753,
0.03279026970267296,
-0.03056296892464161,
0.07901066541671753,
-0.004251774400472641,
-0.0847267359495163,
0.2349385768175125,
0.03640555217862129,
-0.2572192847728729,
0.01168602891266346,
-0.016326434910297394,
-0.08541767299175262,
0.10859896242618561,
-0.02724689058959484,
0.054566312581300735,
0.022108888253569603,
0.02374977432191372,
0.09641839563846588,
0.029175560921430588,
-0.026904573664069176,
-0.051015209406614304,
0.08247390389442444,
-0.16598227620124817,
-0.010299180634319782,
-0.10366304218769073,
0.07919304072856903,
0.025448041036725044,
0.03611773997545242,
0.12689445912837982,
-0.04997461661696434,
-0.01596699096262455,
-0.001270693144761026,
0.03035109117627144,
-0.05651301518082619,
0.12488909810781479,
-0.0038198502734303474,
0.03342735767364502,
-0.1721128672361374,
0.04465767368674278,
-0.01090244296938181,
0.05473979562520981,
0.06915812194347382,
0.07288806140422821,
-0.13612310588359833,
-0.0913078710436821,
0.03603778034448624,
0.1637980341911316,
-0.03152499347925186,
-0.03121103346347809,
-0.029095593839883804,
-0.07092340290546417,
0.03957090154290199,
0.13980638980865479,
0.09165248274803162,
-0.015490551479160786,
-0.004698721691966057,
-0.10476242750883102,
-0.06482146680355072,
0.02559036947786808,
0.055473826825618744,
0.019128357991576195,
-0.12133771926164627,
0.07733040302991867,
0.0013535722391679883,
0.13479620218276978,
-0.08664920181035995,
-0.014718115329742432,
-0.17296764254570007,
-0.0128286462277174,
-0.1515144407749176,
0.07329875230789185,
-0.13061314821243286,
-0.0387987457215786,
-0.026943618431687355,
-0.04101255536079407,
-0.05122196674346924,
0.056255925446748734,
-0.08133886754512787,
-0.035547979176044464,
-0.02608325704932213,
0.04917473718523979,
-0.08315104246139526,
-0.056970320641994476,
0.07124370336532593,
-0.02393740601837635,
0.067350834608078,
0.07734814286231995,
-0.05915144458413124,
0.059297315776348114,
-0.0904000774025917,
-0.13009750843048096,
0.08933408558368683,
0.023050377145409584,
0.07775334268808365,
-0.09990835934877396,
0.01959593966603279,
0.030791396275162697,
-0.007329442072659731,
0.009445780888199806,
0.18548732995986938,
-0.12301038950681686,
-0.08359726518392563,
-0.008522910065948963,
-0.038092754781246185,
-0.0103214206174016,
-0.01733418181538582,
0.14983491599559784,
0.011388805694878101,
0.177156001329422,
-0.06402318179607391,
0.009229161776602268,
-0.05379653349518776,
0.02822105586528778,
-0.04767250269651413,
-0.11633127182722092,
-0.12318183481693268,
-0.01706894487142563,
0.016438178718090057,
-0.03632618114352226,
0.1504037231206894,
0.028964655473828316,
-0.0931103453040123,
0.0346171110868454,
0.12326937168836594,
0.10302083194255829,
-0.02432297356426716,
0.19437922537326813,
-0.0008588695200160146,
0.04072872921824455,
-0.025991644710302353,
0.08037369698286057,
0.08219127357006073,
-0.05376812815666199,
0.10604795068502426,
0.007560450583696365,
0.029811566695570946,
0.08922675251960754,
0.05782898887991905,
0.022690357640385628,
-0.059444673359394073,
-0.14055681228637695,
-0.06729429215192795,
0.08783508092164993,
0.0050935084000229836,
0.03039439767599106,
0.1699099838733673,
-0.08946126699447632,
0.06534728407859802,
0.012898021377623081,
-0.04701438918709755,
-0.13914109766483307,
-0.19439351558685303,
-0.10204159468412399,
-0.005798848811537027,
0.03935721889138222,
-0.07807891070842743,
-0.034895382821559906,
0.06288080662488937,
0.060827963054180145,
0.00026800172054208815,
0.17422939836978912,
-0.16405637562274933,
-0.03514893725514412,
0.001809591893106699,
-0.01749492436647415,
-0.07192688435316086,
0.06465321034193039,
-0.03919408097863197,
-0.035359758883714676,
-0.03286362439393997,
-0.030647993087768555,
0.029219146817922592,
0.005812290124595165,
0.08786036819219589,
-0.0017884145490825176,
-0.08054936677217484,
-0.07115963101387024,
-0.02124135196208954,
0.07487579435110092,
0.10423769801855087,
0.014879556372761726,
-0.03143767639994621,
0.03084038943052292,
0.10778369009494781,
-0.03472728282213211,
-0.08230628818273544,
-0.10087224841117859,
0.1544203907251358,
0.017579423263669014,
0.008485387079417706,
0.007880287244915962,
-0.021380294114351273,
-0.035867542028427124,
0.3025079071521759,
0.26873987913131714,
-0.11048271507024765,
0.030512606725096703,
-0.09478974342346191,
0.02677311934530735,
0.01710430160164833,
0.1357365846633911,
0.05251891538500786,
0.13625271618366241,
-0.043837837874889374,
0.04702688008546829,
-0.08059968799352646,
-0.053277913480997086,
-0.1602313369512558,
0.027389967814087868,
0.06925853341817856,
-0.10281603038311005,
-0.011795255355536938,
0.07309921085834503,
-0.13590498268604279,
-0.0755874291062355,
-0.08936423808336258,
-0.08412931114435196,
-0.07175356894731522,
-0.01608634926378727,
-0.05779014900326729,
0.08181946724653244,
0.0844084769487381,
-0.01912613958120346,
0.09721868485212326,
-0.03515325486660004,
0.041738249361515045,
-0.17492417991161346,
-0.09655970335006714,
0.14622420072555542,
0.1488049477338791,
0.16527970135211945,
0.04288867861032486,
-0.010215920396149158,
0.07120759040117264,
0.013690588064491749,
-0.0830928236246109,
0.08852940797805786,
-0.01734819822013378,
-0.1502721607685089,
0.020574066787958145,
-0.01996886171400547,
-0.05628018453717232,
-0.028867848217487335,
-0.0547311007976532,
-0.026304082944989204,
-0.02084246091544628,
0.007071769330650568,
0.025512706488370895,
-0.06048503518104553,
0.12298916280269623,
-0.0952778309583664,
0.06709598004817963,
0.10640235245227814,
-0.04179181531071663,
-0.0313866063952446,
-0.06937964260578156,
0.09332847595214844,
0.011254659853875637,
-0.21634772419929504,
-0.09950727224349976,
-0.12192348390817642,
0.028006840497255325,
-0.08000063896179199,
0.026129815727472305,
-0.23037348687648773,
-0.04637816920876503,
-0.13177764415740967,
-0.0011951631167903543,
-0.15946990251541138,
0.06273354589939117,
0.1764431744813919,
0.002906900132074952,
0.010569503530859947,
-0.10997109115123749,
0.04067154973745346,
-0.007189431227743626,
-0.14438821375370026,
-0.1276816576719284
] |
null | null |
transformers
|
# xls-asr-vi-40h-1B
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on 40 hours of FPT Open Speech Dataset (FOSD) and Common Voice 7.0.
### Benchmark WER result:
| | [VIVOS](https://huggingface.co/datasets/vivos) | [COMMON VOICE 7.0](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0) | [COMMON VOICE 8.0](https://huggingface.co/datasets/mozilla-foundation/common_voice_8_0) |
|---|---|---|---|
|without LM| 25.93 | 34.21 |
|with 4-grams LM| 24.11 | 25.84 | 31.158 |
### Benchmark CER result:
| | [VIVOS](https://huggingface.co/datasets/vivos) | [COMMON VOICE 7.0](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0) | [COMMON VOICE 8.0](https://huggingface.co/datasets/mozilla-foundation/common_voice_8_0) |
|---|---|---|---|
|without LM| 9.24 | 19.94 |
|with 4-grams LM| 10.37 | 12.96 | 16.179 |
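The "without LM" and "with 4-grams LM" rows differ only in how the CTC logits are decoded. The sketch below shows both decoding paths; the silent placeholder audio and the assumption that the repository bundles the pyctcdecode/KenLM files expected by `Wav2Vec2ProcessorWithLM` are illustrative and not confirmed by this card.
```python
# Sketch of the two decoding modes compared in the benchmark tables above.
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor, Wav2Vec2ProcessorWithLM

model_id = "geninhu/xls-asr-vi-40h-1B"
model = Wav2Vec2ForCTC.from_pretrained(model_id)
processor = Wav2Vec2Processor.from_pretrained(model_id)

# Placeholder input: one second of silence at 16 kHz (replace with real speech).
speech = np.zeros(16_000, dtype=np.float32)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, time, vocab)

# "without LM": greedy argmax over the CTC logits.
greedy_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(greedy_ids))

# "with 4-grams LM": beam search through a KenLM n-gram. This assumes the repo
# ships the language-model files that Wav2Vec2ProcessorWithLM expects.
processor_lm = Wav2Vec2ProcessorWithLM.from_pretrained(model_id)
print(processor_lm.batch_decode(logits.numpy()).text)
```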
## Evaluation
Please use the `eval.py` file to run the evaluation:
```python
python eval.py --model_id geninhu/xls-asr-vi-40h-1B --dataset mozilla-foundation/common_voice_7_0 --config vi --split test --log_outputs
```
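For quick transcription outside the evaluation script, the generic ASR pipeline can also be used. This is a hedged usage sketch, not part of `eval.py`; `sample.wav` is a hypothetical 16 kHz audio file supplied by the user.
```python
# Minimal usage sketch with the transformers automatic-speech-recognition pipeline.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="geninhu/xls-asr-vi-40h-1B")
print(asr("sample.wav")["text"])  # "sample.wav" is a placeholder path
```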
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- num_epochs: 10.0
- mixed_precision_training: Native AMP
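A hedged sketch of how the values above map onto `transformers.TrainingArguments`. Only the listed values are taken from this card; the output directory and anything not listed (save/eval cadence, logging, the training script itself) are assumed placeholders.
```python
# Rough reconstruction of the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./xls-asr-vi-40h-1B",  # assumed placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,     # gives the effective train batch size of 32
    seed=42,
    warmup_steps=1500,
    num_train_epochs=10.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                         # "Native AMP" mixed-precision training
)
```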
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.6222 | 1.85 | 1500 | 5.9479 | 0.5474 |
| 1.1362 | 3.7 | 3000 | 7.9799 | 0.5094 |
| 0.7814 | 5.56 | 4500 | 5.0330 | 0.4724 |
| 0.6281 | 7.41 | 6000 | 2.3484 | 0.5020 |
| 0.5472 | 9.26 | 7500 | 2.2495 | 0.4793 |
| 0.4827 | 11.11 | 9000 | 1.1530 | 0.4768 |
| 0.4327 | 12.96 | 10500 | 1.6160 | 0.4646 |
| 0.3989 | 14.81 | 12000 | 3.2633 | 0.4703 |
| 0.3522 | 16.67 | 13500 | 2.2337 | 0.4708 |
| 0.3201 | 18.52 | 15000 | 3.6879 | 0.4565 |
| 0.2899 | 20.37 | 16500 | 5.4389 | 0.4599 |
| 0.2776 | 22.22 | 18000 | 3.5284 | 0.4537 |
| 0.2574 | 24.07 | 19500 | 2.1759 | 0.4649 |
| 0.2378 | 25.93 | 21000 | 3.3901 | 0.4448 |
| 0.217 | 27.78 | 22500 | 1.1632 | 0.4565 |
| 0.2115 | 29.63 | 24000 | 1.7441 | 0.4232 |
| 0.1959 | 31.48 | 25500 | 3.4992 | 0.4304 |
| 0.187 | 33.33 | 27000 | 3.6163 | 0.4369 |
| 0.1748 | 35.19 | 28500 | 3.6038 | 0.4467 |
| 0.17 | 37.04 | 30000 | 2.9708 | 0.4362 |
| 0.159 | 38.89 | 31500 | 3.2045 | 0.4279 |
| 0.153 | 40.74 | 33000 | 3.2427 | 0.4287 |
| 0.1463 | 42.59 | 34500 | 3.5439 | 0.4270 |
| 0.139 | 44.44 | 36000 | 3.9381 | 0.4150 |
| 0.1352 | 46.3 | 37500 | 4.1744 | 0.4092 |
| 0.1369 | 48.15 | 39000 | 4.2279 | 0.4154 |
| 0.1273 | 50.0 | 40500 | 4.1691 | 0.4133 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
|
{"language": ["vi"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "common-voice", "hf-asr-leaderboard", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_7_0"], "model-index": [{"name": "xls-asr-vi-40h-1B", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 7.0", "type": "mozilla-foundation/common_voice_7_0", "args": "vi"}, "metrics": [{"type": "wer", "value": 25.846, "name": "Test WER (with LM)"}, {"type": "cer", "value": 12.961, "name": "Test CER (with LM)"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 8.0", "type": "mozilla-foundation/common_voice_8_0", "args": "vi"}, "metrics": [{"type": "wer", "value": 31.158, "name": "Test WER (with LM)"}, {"type": "cer", "value": 16.179, "name": "Test CER (with LM)"}]}]}]}
|
automatic-speech-recognition
|
geninhu/xls-asr-vi-40h-1B
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"common-voice",
"hf-asr-leaderboard",
"robust-speech-event",
"vi",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"vi"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common-voice #hf-asr-leaderboard #robust-speech-event #vi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
xls-asr-vi-40h-1B
=================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on 40 hours of FPT Open Speech Dataset (FOSD) and Common Voice 7.0.
### Benchmark WER result:
### Benchmark CER result:
Evaluation
----------
Please use the URL file to run the evaluation
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1500
* num\_epochs: 10.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
|
[
"### Benchmark WER result:",
"### Benchmark CER result:\n\n\n\nEvaluation\n----------\n\n\nPlease use the URL file to run the evaluation\n\n\nTraining procedure\n------------------",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common-voice #hf-asr-leaderboard #robust-speech-event #vi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Benchmark WER result:",
"### Benchmark CER result:\n\n\n\nEvaluation\n----------\n\n\nPlease use the URL file to run the evaluation\n\n\nTraining procedure\n------------------",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] |
[
98,
9,
25,
160,
4,
41
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common-voice #hf-asr-leaderboard #robust-speech-event #vi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Benchmark WER result:### Benchmark CER result:\n\n\n\nEvaluation\n----------\n\n\nPlease use the URL file to run the evaluation\n\n\nTraining procedure\n------------------### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 10.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] |
[
-0.08645769208669662,
0.11235358566045761,
-0.003629423677921295,
0.05070352926850319,
0.1305888593196869,
0.023242780938744545,
0.108546681702137,
0.12181372195482254,
-0.06051018089056015,
0.09309239685535431,
0.07602793723344803,
0.11011772602796555,
0.06748021394014359,
0.07952335476875305,
-0.03547321632504463,
-0.24307599663734436,
0.012254377827048302,
-0.013110860250890255,
-0.09246627986431122,
0.11242163926362991,
0.07265817373991013,
-0.11109045892953873,
0.0364060215651989,
0.024927597492933273,
-0.11268691718578339,
0.028092870488762856,
-0.03647970035672188,
-0.016231048852205276,
0.09720960259437561,
0.046008527278900146,
0.07455704361200333,
0.029322084039449692,
0.09958541393280029,
-0.2977120578289032,
0.012829623185098171,
0.08299090713262558,
0.02178795263171196,
0.0467359721660614,
0.0830119252204895,
0.002957546152174473,
0.15435245633125305,
-0.044664230197668076,
0.08047182112932205,
0.02882884442806244,
-0.09989416599273682,
-0.26471471786499023,
-0.07725134491920471,
0.016470152884721756,
0.12808649241924286,
0.10266765207052231,
-0.04035601764917374,
0.09893184155225754,
-0.08220133930444717,
0.0903669148683548,
0.2042102813720703,
-0.22890637814998627,
-0.07141339778900146,
-0.006671210750937462,
0.04039190337061882,
0.05782536417245865,
-0.10385343432426453,
-0.029949437826871872,
0.010995757766067982,
0.01889965869486332,
0.08748427778482437,
0.004427218344062567,
-0.0032943447586148977,
-0.006239715497940779,
-0.13675859570503235,
-0.051166608929634094,
0.08628330379724503,
0.08422962576150894,
-0.006950353737920523,
-0.062375880777835846,
-0.029490439221262932,
-0.16173525154590607,
-0.03795299306511879,
-0.007228440139442682,
0.05049052834510803,
-0.02585887536406517,
-0.05706518888473511,
0.04775754362344742,
-0.06753891706466675,
-0.05553145334124565,
0.047320686280727386,
0.12500926852226257,
0.04299309104681015,
-0.030993400141596794,
-0.014361471869051456,
0.07804066687822342,
0.00798949133604765,
-0.1596687287092209,
-0.005643187556415796,
0.051693204790353775,
-0.10544989258050919,
-0.03964998200535774,
-0.028826281428337097,
-0.06345747411251068,
0.03946658596396446,
0.1839977651834488,
0.0071895793080329895,
0.1222907304763794,
-0.010693677701056004,
0.005139909218996763,
-0.0812702402472496,
0.17305345833301544,
-0.04951660335063934,
-0.08749682456254959,
-0.04205307364463806,
0.11462057381868362,
0.0054734330624341965,
-0.006346965674310923,
-0.05758374556899071,
0.07480885088443756,
0.0810118094086647,
0.03118734061717987,
-0.026882313191890717,
0.026555446907877922,
-0.06242721527814865,
-0.01107538677752018,
-0.003851977176964283,
-0.10052124410867691,
0.0586983785033226,
0.04873456433415413,
-0.043050818145275116,
-0.0286333616822958,
-0.0013965173857286572,
0.04505731537938118,
-0.009714464657008648,
0.16853904724121094,
-0.06509172171354294,
0.009222906082868576,
-0.08999545127153397,
-0.10838949680328369,
0.031029287725687027,
-0.006744137965142727,
0.017277156934142113,
-0.058707091957330704,
-0.07229999452829361,
-0.08039398491382599,
0.0840291902422905,
-0.042690496891736984,
-0.05078808590769768,
-0.06704597920179367,
-0.05799989774823189,
0.05458961799740791,
-0.008972310461103916,
0.18260125815868378,
-0.0720904991030693,
0.07420987635850906,
0.025558186694979668,
0.030303534120321274,
0.04432669281959534,
0.056148212403059006,
-0.04748497158288956,
0.044074252247810364,
-0.13132277131080627,
0.04475166276097298,
-0.08725831657648087,
0.08124729245901108,
-0.13028135895729065,
-0.09477470815181732,
-0.03906365856528282,
-0.010605931282043457,
0.10740029066801071,
0.07688003033399582,
-0.1776442676782608,
-0.08382450044155121,
0.15732958912849426,
-0.05159464478492737,
-0.11714684218168259,
0.1351575404405594,
-0.036658622324466705,
-0.03450315445661545,
0.03654494881629944,
0.1697348803281784,
0.08354571461677551,
-0.09568766504526138,
-0.012503717094659805,
-0.10439880937337875,
0.12592901289463043,
0.045035380870103836,
0.07751942425966263,
-0.042186982929706573,
0.058944810181856155,
-0.016150791198015213,
-0.029010314494371414,
0.04551129788160324,
-0.06827405840158463,
-0.08115962147712708,
-0.01256327424198389,
-0.0745464563369751,
-0.052289485931396484,
0.06760422885417938,
0.0072487820871174335,
-0.08976341784000397,
-0.13153578341007233,
0.03950927406549454,
0.08309699594974518,
-0.10008278489112854,
0.03069784864783287,
-0.07832734286785126,
0.043111804872751236,
-0.023726116865873337,
0.0019772148225456476,
-0.1652512103319168,
-0.0050501395016908646,
0.02210124023258686,
-0.07690034806728363,
0.03049813024699688,
-0.03487991541624069,
0.07212304323911667,
0.03970123827457428,
-0.041369058191776276,
-0.029262786731123924,
-0.046919241547584534,
0.01421736553311348,
-0.08780694007873535,
-0.23001134395599365,
-0.05025957152247429,
-0.01037481240928173,
0.18101269006729126,
-0.244625061750412,
-0.0031516319140791893,
0.012737766839563847,
0.10756083577871323,
0.009487568400800228,
-0.06602262705564499,
-0.04193110764026642,
0.06585667282342911,
-0.023342354223132133,
-0.06413295865058899,
0.0298543069511652,
0.004182328935712576,
-0.09921203553676605,
-0.05335570126771927,
-0.17585299909114838,
0.0655449777841568,
0.10468672215938568,
-0.02440214902162552,
-0.06580469757318497,
-0.0067061190493404865,
-0.04442176967859268,
-0.05809647589921951,
-0.0304512120783329,
0.0035695102997124195,
0.205145001411438,
0.021967431530356407,
0.10387534648180008,
-0.06250914931297302,
-0.053655631840229034,
0.026680296286940575,
0.0261894129216671,
0.003464187728241086,
0.14100414514541626,
0.12631647288799286,
-0.09033504128456116,
0.08027344197034836,
0.06427756696939468,
-0.0788746029138565,
0.11732976138591766,
-0.0440494567155838,
-0.08257731050252914,
-0.03621227666735649,
0.0021934208925813437,
0.03247962146997452,
0.09889095276594162,
-0.17818227410316467,
0.01050016563385725,
0.012858239002525806,
0.009799198247492313,
0.013014651834964752,
-0.1876262128353119,
0.0007399911992251873,
0.06161496788263321,
-0.048548802733421326,
-0.051379621028900146,
-0.03775130212306976,
-0.01412428729236126,
0.07923360913991928,
0.007868227548897266,
-0.053253140300512314,
-0.044932082295417786,
-0.03654104843735695,
-0.09225908666849136,
0.1773398369550705,
-0.07374913990497589,
-0.11049552261829376,
-0.07716963440179825,
-0.04730688035488129,
-0.00891247671097517,
-0.021623214706778526,
0.059971366077661514,
-0.10111378878355026,
-0.06904784590005875,
-0.08136151731014252,
-0.017933107912540436,
-0.036164019256830215,
0.018185315653681755,
0.04882822558283806,
-0.014158179983496666,
0.0726422592997551,
-0.1004583016037941,
-0.002125188009813428,
-0.019382700324058533,
-0.004783526994287968,
0.004017723724246025,
0.04412250965833664,
0.10768207907676697,
0.1471455693244934,
0.04196864366531372,
0.03207050636410713,
-0.003989299759268761,
0.20557768642902374,
-0.10981723666191101,
-0.025744536891579628,
0.08912581950426102,
-0.036351412534713745,
0.04409117251634598,
0.1546631008386612,
0.058121971786022186,
-0.09522779285907745,
0.01875894144177437,
0.07159050554037094,
-0.030948104336857796,
-0.21436665952205658,
-0.012599975802004337,
-0.07109000533819199,
-0.08874131739139557,
0.1121666207909584,
0.030140908434987068,
-0.02808375656604767,
0.01312352903187275,
-0.00225817016325891,
0.000003796966211666586,
0.030551666393876076,
0.06596727669239044,
0.0828235074877739,
0.07489735633134842,
0.12451804429292679,
-0.014396119862794876,
-0.03044809214770794,
0.024081604555249214,
-0.03595452755689621,
0.2288791686296463,
0.0333523266017437,
0.15784361958503723,
0.06631722301244736,
0.11435027420520782,
-0.017686305567622185,
0.04042719304561615,
-0.0015160266775637865,
-0.030968086794018745,
0.020929615944623947,
-0.05424527823925018,
-0.008823497220873833,
0.03609322011470795,
0.07139574736356735,
0.009628665633499622,
-0.06990113854408264,
-0.014443171210587025,
0.028155239298939705,
0.28654277324676514,
0.06586935371160507,
-0.2885471284389496,
-0.05220824107527733,
0.004781648516654968,
-0.07078883796930313,
-0.026860661804676056,
0.008675385266542435,
0.061659086495637894,
-0.06935212016105652,
0.08559156209230423,
-0.06260918080806732,
0.1042381003499031,
-0.057855136692523956,
0.02152353711426258,
0.1057058647274971,
0.07617463171482086,
-0.012641863897442818,
0.04787693917751312,
-0.2694299817085266,
0.2525704801082611,
-0.015072508715093136,
0.09313814342021942,
-0.04818737506866455,
0.030265623703598976,
0.029710931703448296,
-0.04315957799553871,
0.04038234055042267,
-0.01030877884477377,
-0.09364787489175797,
-0.174949049949646,
-0.07109890878200531,
0.044525809586048126,
0.10261572152376175,
-0.053653158247470856,
0.13457228243350983,
-0.03700914978981018,
-0.029756661504507065,
0.059848248958587646,
0.005667181219905615,
-0.1308118999004364,
-0.12070105224847794,
0.03128715232014656,
0.033565420657396317,
0.03552282601594925,
-0.06710421293973923,
-0.09457240253686905,
-0.07844773679971695,
0.1681121289730072,
-0.03800816461443901,
-0.013151765801012516,
-0.1269448846578598,
0.07961656898260117,
0.1271532028913498,
-0.06547901779413223,
0.031493380665779114,
0.021382570266723633,
0.09723855555057526,
0.041207049041986465,
-0.023558566346764565,
0.1148933693766594,
-0.07905364781618118,
-0.17243357002735138,
-0.059727977961301804,
0.13457432389259338,
0.045182306319475174,
0.06660586595535278,
-0.015252656303346157,
0.057822734117507935,
-0.023687178269028664,
-0.08162704855203629,
0.04246104881167412,
0.01756388507783413,
0.04533650726079941,
0.018703460693359375,
-0.016471700742840767,
0.018164075911045074,
-0.07525631785392761,
-0.06742050498723984,
0.10097856819629669,
0.2979944944381714,
-0.06011170521378517,
0.01729699596762657,
0.012590309605002403,
-0.07130108773708344,
-0.10946524888277054,
0.0061233979649841785,
0.12312877178192139,
0.014474865980446339,
0.04573257267475128,
-0.18144366145133972,
0.07052131742238998,
0.08163474500179291,
-0.015231814235448837,
0.06711587309837341,
-0.31057921051979065,
-0.14783227443695068,
0.09732650965452194,
0.10388436168432236,
-0.015210965648293495,
-0.1710037738084793,
-0.051442693918943405,
-0.05319758504629135,
-0.1492210328578949,
0.09989786893129349,
-0.03192593529820442,
0.10492919385433197,
0.010302110575139523,
0.03224935010075569,
0.017007900401949883,
-0.05458017438650131,
0.1427781581878662,
0.001735708094201982,
0.05163935199379921,
-0.022248288616538048,
0.03184318169951439,
-0.028282828629016876,
-0.05235753580927849,
-0.006513987667858601,
-0.10343974828720093,
0.0057637752033770084,
-0.11542434990406036,
-0.034420520067214966,
-0.07522237300872803,
-0.008833901025354862,
-0.050836581736803055,
-0.016691816970705986,
-0.016324197873473167,
0.021952873095870018,
0.11120045930147171,
0.007631569169461727,
0.11512088775634766,
-0.05276695266366005,
0.1394202560186386,
0.07777825742959976,
0.10602203011512756,
-0.01708674430847168,
-0.09287694096565247,
-0.00640451442450285,
0.009163384325802326,
0.027251653373241425,
-0.09987995028495789,
0.05235596001148224,
0.15376058220863342,
0.03463244065642357,
0.16109143197536469,
0.058499835431575775,
-0.07027368247509003,
0.00458771176636219,
0.06571908295154572,
-0.08539655804634094,
-0.13640141487121582,
0.01142388116568327,
-0.0035124097485095263,
-0.14246338605880737,
-0.04737037420272827,
0.09846208989620209,
-0.04801008105278015,
-0.002685901941731572,
0.019841693341732025,
0.0417788065969944,
-0.03908541798591614,
0.22328242659568787,
0.0006947101792320609,
0.06972423195838928,
-0.09321939200162888,
0.05783641338348389,
0.05911965295672417,
-0.11368166655302048,
0.02999759279191494,
0.07176419347524643,
-0.041019588708877563,
-0.012681744992733002,
0.04593035206198692,
0.1291704922914505,
0.013235724531114101,
-0.044550660997629166,
-0.13922147452831268,
-0.11744027584791183,
0.07936743646860123,
0.07137662172317505,
0.03298443928360939,
0.006992476060986519,
-0.023107068613171577,
0.04354085773229599,
-0.11215771734714508,
0.10091108083724976,
0.07917647808790207,
0.07033255696296692,
-0.10340077430009842,
0.08179660886526108,
0.015458845533430576,
-0.047632161527872086,
0.013553254306316376,
0.020711779594421387,
-0.11688670516014099,
0.02059420756995678,
-0.12974022328853607,
-0.025595344603061676,
-0.025982236489653587,
-0.000995655544102192,
0.01796763390302658,
-0.058263953775167465,
-0.0799124538898468,
0.01974605582654476,
-0.1116270050406456,
-0.06265082210302353,
-0.004568329080939293,
0.06219283118844032,
-0.1255750209093094,
-0.004532551858574152,
0.05233706906437874,
-0.11166564375162125,
0.0674942284822464,
0.03639284148812294,
0.021818583831191063,
0.02850891277194023,
-0.13425162434577942,
-0.006060142070055008,
0.029834436252713203,
0.011291993781924248,
0.0400066114962101,
-0.17535556852817535,
-0.01319765392690897,
-0.006577851250767708,
0.025430336594581604,
-0.001563221332617104,
0.047682516276836395,
-0.11119644343852997,
-0.02299782633781433,
-0.042023301124572754,
-0.08351080119609833,
-0.014476542361080647,
0.024537386372685432,
0.048308320343494415,
0.0422864705324173,
0.1834123581647873,
-0.07342590391635895,
0.04979640617966652,
-0.2388366311788559,
-0.008892285637557507,
0.007321381941437721,
-0.05575892701745033,
-0.0492374412715435,
-0.03822389990091324,
0.09319020062685013,
-0.06907172501087189,
0.08599528670310974,
-0.021450206637382507,
0.07318488508462906,
0.046321380883455276,
-0.05372147634625435,
0.011554201133549213,
0.03233237937092781,
0.17135705053806305,
0.05063323676586151,
-0.027280904352664948,
0.08011668175458908,
-0.013667894527316093,
0.06096554547548294,
0.10752598196268082,
0.18695108592510223,
0.1714790016412735,
0.056580789387226105,
0.05506456643342972,
0.0427444763481617,
-0.12422103434801102,
-0.13752025365829468,
0.12874042987823486,
-0.07689075171947479,
0.1660289168357849,
-0.015944620594382286,
0.23452670872211456,
0.08911749720573425,
-0.19667170941829681,
0.050073087215423584,
-0.07354553788900375,
-0.08340149372816086,
-0.10819793492555618,
-0.06292968988418579,
-0.09321624040603638,
-0.15411971509456635,
0.023743508383631706,
-0.1083420068025589,
0.07715234160423279,
0.08671709895133972,
0.03744203969836235,
0.026265127584338188,
0.11793401837348938,
0.04096323251724243,
0.002189264167100191,
0.07068528234958649,
-0.0009594826842658222,
-0.03915099799633026,
-0.03970777988433838,
-0.06955959647893906,
0.04447165131568909,
-0.033065199851989746,
0.07240389287471771,
-0.03218726068735123,
-0.10524658858776093,
0.05726727098226547,
0.01777271367609501,
-0.1106262058019638,
0.03216206654906273,
-0.018951814621686935,
0.07885482907295227,
0.06795141100883484,
0.02191637083888054,
0.02818976156413555,
-0.015952955931425095,
0.22398525476455688,
-0.06797830015420914,
-0.04107541963458061,
-0.1131153479218483,
0.19150583446025848,
0.02012842893600464,
-0.010647568851709366,
0.021293306723237038,
-0.07696697860956192,
0.004920457024127245,
0.15875060856342316,
0.10522875189781189,
-0.05240916460752487,
-0.014210276305675507,
0.016474200412631035,
-0.005816313438117504,
-0.028837554156780243,
0.08647195994853973,
0.11810003966093063,
0.03302035853266716,
-0.046603817492723465,
-0.01830587349832058,
-0.03871959447860718,
-0.060488488525152206,
-0.007504473906010389,
0.06365958601236343,
0.051299482583999634,
-0.0059141297824680805,
-0.015739163383841515,
0.1180272325873375,
-0.0201724823564291,
-0.15742872655391693,
0.024959133937954903,
-0.18707922101020813,
-0.197760671377182,
-0.02240007556974888,
0.10319983959197998,
0.008299116045236588,
0.03880229592323303,
-0.009204890578985214,
-0.04581218212842941,
0.09719862043857574,
-0.003262386890128255,
-0.02142145112156868,
-0.12912800908088684,
0.06974484771490097,
-0.1042529046535492,
0.2098517268896103,
-0.03767974302172661,
0.036533042788505554,
0.12838147580623627,
0.06470408290624619,
-0.07186517119407654,
0.0530589297413826,
0.0758705660700798,
-0.15314647555351257,
0.05303621292114258,
0.16189610958099365,
-0.05833509936928749,
0.12033534049987793,
0.05262696370482445,
-0.09658287465572357,
0.03302144631743431,
-0.09505768120288849,
-0.04583501070737839,
-0.043754346668720245,
-0.016794303432106972,
-0.06653069704771042,
0.1418137401342392,
0.2163422852754593,
-0.05623919889330864,
0.002153589390218258,
-0.05268555507063866,
0.026285873726010323,
0.02888166531920433,
0.16653519868850708,
-0.06360316276550293,
-0.28604549169540405,
0.043678123503923416,
0.0015121691394597292,
0.011364252306520939,
-0.2272232174873352,
-0.08787646144628525,
0.033116698265075684,
-0.06622794270515442,
-0.06491579860448837,
0.11410225927829742,
0.08287786692380905,
0.03750669211149216,
-0.07662533968687057,
-0.15877464413642883,
-0.03040402941405773,
0.18407166004180908,
-0.1596408635377884,
-0.06560196727514267
] |
null | null |
transformers
|
# xls-asr-vi-40h
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the Common Voice 7.0 Vietnamese (vi) dataset and a private dataset.
It achieves the following results on the evaluation set (without a language model):
- Loss: 1.1177
- Wer: 60.58
## Evaluation
Please run the evaluation script (`eval_custom.py`) as follows:
```bash
python eval_custom.py --model_id geninhu/xls-asr-vi-40h --dataset mozilla-foundation/common_voice_7_0 --config vi --split test
```
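For quick transcription outside the evaluation script, a minimal inference sketch with the `transformers` ASR pipeline is shown below. It assumes the checkpoint loads as a standard wav2vec2 CTC model; `sample.wav` is only an example path for a 16 kHz mono recording.
```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an automatic-speech-recognition pipeline
asr = pipeline("automatic-speech-recognition", model="geninhu/xls-asr-vi-40h")

# Transcribe a local audio file (16 kHz mono works best for wav2vec2 models)
print(asr("sample.wav")["text"])
```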
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1500
- num_epochs: 50.0
- mixed_precision_training: Native AMP
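These settings correspond roughly to the `TrainingArguments` sketch below; field names follow the standard `transformers` API, the output directory is an assumption, and anything not listed above keeps its default value.
```python
from transformers import TrainingArguments

# Sketch of the configuration listed above; unlisted fields keep their defaults
training_args = TrainingArguments(
    output_dir="xls-asr-vi-40h",        # assumed output directory
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1500,
    num_train_epochs=50.0,
    fp16=True,                          # mixed-precision training (native AMP)
)
```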
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 23.3878 | 0.93 | 1500 | 21.9179 | 1.0 |
| 8.8862 | 1.85 | 3000 | 6.0599 | 1.0 |
| 4.3701 | 2.78 | 4500 | 4.3837 | 1.0 |
| 4.113 | 3.7 | 6000 | 4.2698 | 0.9982 |
| 3.9666 | 4.63 | 7500 | 3.9726 | 0.9989 |
| 3.5965 | 5.56 | 9000 | 3.7124 | 0.9975 |
| 3.3944 | 6.48 | 10500 | 3.5005 | 1.0057 |
| 3.304 | 7.41 | 12000 | 3.3710 | 1.0043 |
| 3.2482 | 8.33 | 13500 | 3.4201 | 1.0155 |
| 3.212 | 9.26 | 15000 | 3.3732 | 1.0151 |
| 3.1778 | 10.19 | 16500 | 3.2763 | 1.0009 |
| 3.1027 | 11.11 | 18000 | 3.1943 | 1.0025 |
| 2.9905 | 12.04 | 19500 | 2.8082 | 0.9703 |
| 2.7095 | 12.96 | 21000 | 2.4993 | 0.9302 |
| 2.4862 | 13.89 | 22500 | 2.3072 | 0.9140 |
| 2.3271 | 14.81 | 24000 | 2.1398 | 0.8949 |
| 2.1968 | 15.74 | 25500 | 2.0594 | 0.8817 |
| 2.111 | 16.67 | 27000 | 1.9404 | 0.8630 |
| 2.0387 | 17.59 | 28500 | 1.8895 | 0.8497 |
| 1.9504 | 18.52 | 30000 | 1.7961 | 0.8315 |
| 1.9039 | 19.44 | 31500 | 1.7433 | 0.8213 |
| 1.8342 | 20.37 | 33000 | 1.6790 | 0.7994 |
| 1.7824 | 21.3 | 34500 | 1.6291 | 0.7825 |
| 1.7359 | 22.22 | 36000 | 1.5783 | 0.7706 |
| 1.7053 | 23.15 | 37500 | 1.5248 | 0.7492 |
| 1.6504 | 24.07 | 39000 | 1.4930 | 0.7406 |
| 1.6263 | 25.0 | 40500 | 1.4572 | 0.7348 |
| 1.5893 | 25.93 | 42000 | 1.4202 | 0.7161 |
| 1.5669 | 26.85 | 43500 | 1.3987 | 0.7143 |
| 1.5277 | 27.78 | 45000 | 1.3512 | 0.6991 |
| 1.501 | 28.7 | 46500 | 1.3320 | 0.6879 |
| 1.4781 | 29.63 | 48000 | 1.3112 | 0.6788 |
| 1.4477 | 30.56 | 49500 | 1.2850 | 0.6657 |
| 1.4483 | 31.48 | 51000 | 1.2813 | 0.6633 |
| 1.4065 | 32.41 | 52500 | 1.2475 | 0.6541 |
| 1.3779 | 33.33 | 54000 | 1.2244 | 0.6503 |
| 1.3788 | 34.26 | 55500 | 1.2116 | 0.6407 |
| 1.3428 | 35.19 | 57000 | 1.1938 | 0.6352 |
| 1.3453 | 36.11 | 58500 | 1.1927 | 0.6340 |
| 1.3137 | 37.04 | 60000 | 1.1699 | 0.6252 |
| 1.2984 | 37.96 | 61500 | 1.1666 | 0.6229 |
| 1.2927 | 38.89 | 63000 | 1.1585 | 0.6188 |
| 1.2919 | 39.81 | 64500 | 1.1618 | 0.6190 |
| 1.293 | 40.74 | 66000 | 1.1479 | 0.6181 |
| 1.2853 | 41.67 | 67500 | 1.1423 | 0.6202 |
| 1.2687 | 42.59 | 69000 | 1.1315 | 0.6131 |
| 1.2603 | 43.52 | 70500 | 1.1333 | 0.6128 |
| 1.2577 | 44.44 | 72000 | 1.1191 | 0.6079 |
| 1.2435 | 45.37 | 73500 | 1.1177 | 0.6079 |
| 1.251 | 46.3 | 75000 | 1.1211 | 0.6092 |
| 1.2482 | 47.22 | 76500 | 1.1177 | 0.6060 |
| 1.2422 | 48.15 | 78000 | 1.1227 | 0.6097 |
| 1.2485 | 49.07 | 79500 | 1.1187 | 0.6071 |
| 1.2425 | 50.0 | 81000 | 1.1177 | 0.6058 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
|
{"language": ["vi"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "common-voice", "hf-asr-leaderboard", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_7_0"], "model-index": [{"name": "xls-asr-vi-40h", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice 7.0", "type": "mozilla-foundation/common_voice_7_0", "args": "vi"}, "metrics": [{"type": "wer", "value": 56.57, "name": "Test WER (with Language model)"}]}]}]}
|
automatic-speech-recognition
|
geninhu/xls-asr-vi-40h
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"common-voice",
"hf-asr-leaderboard",
"robust-speech-event",
"vi",
"dataset:mozilla-foundation/common_voice_7_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"vi"
] |
TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common-voice #hf-asr-leaderboard #robust-speech-event #vi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
xls-asr-vi-40h
==============
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the Common Voice 7.0 Vietnamese (vi) dataset and a private dataset.
It achieves the following results on the evaluation set (without a language model):
* Loss: 1.1177
* Wer: 60.58
Evaluation
----------
Please run the evaluation script (eval_custom.py).
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-06
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1500
* num\_epochs: 50.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common-voice #hf-asr-leaderboard #robust-speech-event #vi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] |
[
98,
132,
4,
41
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #common-voice #hf-asr-leaderboard #robust-speech-event #vi #dataset-mozilla-foundation/common_voice_7_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] |
[
-0.12921930849552155,
0.09582560509443283,
-0.005140830297023058,
0.03635082766413689,
0.08545821160078049,
0.007314354181289673,
0.0993737280368805,
0.15782701969146729,
-0.07599297165870667,
0.10504039376974106,
0.09170394390821457,
0.07240106165409088,
0.08512310683727264,
0.1305723488330841,
-0.03631847724318504,
-0.26266300678253174,
0.04102582111954689,
0.006078404374420643,
-0.07277694344520569,
0.10571437329053879,
0.09994204342365265,
-0.10214182734489441,
0.022685162723064423,
0.028025124222040176,
-0.0907118171453476,
0.006293793674558401,
-0.023827001452445984,
-0.07070723176002502,
0.1030157282948494,
0.047349970787763596,
0.06639938801527023,
0.04097655415534973,
0.0654931291937828,
-0.25281471014022827,
0.016612710431218147,
0.06959305703639984,
0.03459213674068451,
0.07193673402070999,
0.09586033225059509,
-0.005584809463471174,
0.1050708070397377,
-0.06745044887065887,
0.05480118468403816,
0.05087483301758766,
-0.08389591425657272,
-0.2768101096153259,
-0.08299433439970016,
0.054892316460609436,
0.09915917366743088,
0.065074123442173,
-0.028354397043585777,
0.07510611414909363,
-0.05223170667886734,
0.10472702234983444,
0.2042669802904129,
-0.2208995372056961,
-0.06960584968328476,
-0.04005465656518936,
0.05444422736763954,
0.04353923350572586,
-0.09708338230848312,
-0.012882975861430168,
0.012483219616115093,
0.03452666103839874,
0.0974368005990982,
0.006579717621207237,
-0.02064603753387928,
-0.015352151356637478,
-0.13437482714653015,
-0.063388392329216,
0.11888650059700012,
0.06874822825193405,
-0.018405787646770477,
-0.10624171793460846,
-0.04847043752670288,
-0.15850555896759033,
-0.054252829402685165,
-0.013904215767979622,
0.026951026171445847,
-0.02494574896991253,
-0.04623197391629219,
0.0007179887033998966,
-0.07090982049703598,
-0.0692066177725792,
0.0258487518876791,
0.1435794234275818,
0.048119474202394485,
-0.023777589201927185,
-0.01993919536471367,
0.08137115836143494,
0.024609973654150963,
-0.1487445831298828,
-0.024127425625920296,
0.04194743186235428,
-0.08263900876045227,
-0.012962279841303825,
-0.017919324338436127,
-0.051376823335886,
0.051635418087244034,
0.11034386605024338,
-0.04619130864739418,
0.11397615075111389,
-0.010841081850230694,
0.01833983324468136,
-0.08307944238185883,
0.1567326784133911,
-0.04061860218644142,
-0.04655936360359192,
-0.007979840971529484,
0.12015139311552048,
0.019247345626354218,
-0.010128794237971306,
-0.0603100061416626,
0.010419465601444244,
0.11941079050302505,
0.04757833480834961,
-0.005689555313438177,
0.015895137563347816,
-0.06632374227046967,
-0.02564116008579731,
0.0002561075962148607,
-0.12250064313411713,
0.03428452089428902,
0.05731828883290291,
-0.05246197432279587,
0.019929591566324234,
0.01039100531488657,
0.003889190498739481,
-0.05383020266890526,
0.06561343371868134,
-0.04109078645706177,
-0.004093020688742399,
-0.06727022677659988,
-0.0934988334774971,
0.04559021070599556,
-0.04551389440894127,
-0.018985463306307793,
-0.07450687885284424,
-0.09481441229581833,
-0.05933082103729248,
0.05341033264994621,
-0.04433325678110123,
-0.06698959320783615,
-0.07993996143341064,
-0.07533705979585648,
0.059311430901288986,
-0.02971329353749752,
0.1608654260635376,
-0.05907350778579712,
0.096988245844841,
0.019909143447875977,
0.046519674360752106,
0.06126025691628456,
0.07157717645168304,
-0.012966245412826538,
0.04621414095163345,
-0.13871873915195465,
0.11023346334695816,
-0.1035492792725563,
0.008775644935667515,
-0.14198100566864014,
-0.09637437760829926,
-0.007703059818595648,
0.0005597439594566822,
0.11086972802877426,
0.11599578708410263,
-0.16961419582366943,
-0.09736768901348114,
0.16404710710048676,
-0.07368874549865723,
-0.06673435866832733,
0.13443201780319214,
-0.016790492460131645,
-0.03668556362390518,
0.048970215022563934,
0.19543857872486115,
0.11264784634113312,
-0.10893294215202332,
0.0076538389548659325,
-0.0481436587870121,
0.1004703938961029,
0.027389055117964745,
0.07640516012907028,
-0.043761979788541794,
0.022920725867152214,
0.0017914874479174614,
-0.02433609962463379,
0.05754814296960831,
-0.0709652379155159,
-0.07118892669677734,
-0.03543304279446602,
-0.06629087030887604,
0.009731595404446125,
0.046360839158296585,
0.013723723590373993,
-0.0951918512582779,
-0.1260402947664261,
0.018876081332564354,
0.11955444514751434,
-0.11063499003648758,
0.038236502557992935,
-0.09689544141292572,
0.06442918628454208,
-0.01883476786315441,
0.0015901937149465084,
-0.1647253781557083,
0.023408185690641403,
0.03940902650356293,
-0.07214462757110596,
0.01845579408109188,
-0.0070493039675056934,
0.07690697908401489,
0.02936316467821598,
-0.037634529173374176,
-0.06708075106143951,
-0.04420303553342819,
0.0013167874421924353,
-0.05842078849673271,
-0.22391046583652496,
-0.06207550689578056,
-0.03267884626984596,
0.1501985490322113,
-0.18184269964694977,
0.00949402991682291,
0.06451418250799179,
0.13479165732860565,
0.028963392600417137,
-0.04374378174543381,
0.02311260998249054,
0.06475379317998886,
-0.00818665698170662,
-0.06922055035829544,
0.03418191522359848,
0.008262470364570618,
-0.11543212831020355,
0.030756236985325813,
-0.14489905536174774,
0.06835639476776123,
0.09228009730577469,
0.02124013938009739,
-0.05950695648789406,
-0.03230789676308632,
-0.048192985355854034,
-0.059304989874362946,
-0.01924222707748413,
0.0009672157466411591,
0.18868966400623322,
0.029355114325881004,
0.11480159312486649,
-0.07970767468214035,
-0.03860078379511833,
0.033834390342235565,
0.021488193422555923,
-0.01989220455288887,
0.14133162796497345,
0.03661681339144707,
-0.05394434556365013,
0.08074156194925308,
0.05244322866201401,
-0.06673219054937363,
0.15086236596107483,
-0.08101557195186615,
-0.08863827586174011,
-0.024457041174173355,
0.027092959731817245,
0.01900477334856987,
0.09955613315105438,
-0.17326588928699493,
-0.006621692329645157,
0.027510397136211395,
0.02215612307190895,
0.022536233067512512,
-0.19166533648967743,
0.022813279181718826,
0.03593282774090767,
-0.09739789366722107,
-0.007747363299131393,
0.012856206856667995,
0.007249746005982161,
0.0803348645567894,
-0.004504700656980276,
-0.076241634786129,
-0.01063124742358923,
-0.0343964658677578,
-0.0942002609372139,
0.16180458664894104,
-0.11059083044528961,
-0.17120787501335144,
-0.11406541615724564,
-0.019018379971385002,
-0.021891525015234947,
-0.022146686911582947,
0.060113199055194855,
-0.09930378943681717,
-0.04464387148618698,
-0.06242380291223526,
0.025731591507792473,
-0.03433602675795555,
0.018125610426068306,
0.02712017297744751,
-0.015147852711379528,
0.08563809841871262,
-0.11947425454854965,
0.008540390990674496,
-0.011400454677641392,
-0.00841277465224266,
0.011984463781118393,
0.02909025177359581,
0.06332641839981079,
0.1567821502685547,
0.03940818831324577,
0.031922321766614914,
-0.041972432285547256,
0.174235001206398,
-0.11034466326236725,
0.001668052515015006,
0.13403207063674927,
-0.008023533970117569,
0.04867953807115555,
0.131742924451828,
0.03647204861044884,
-0.06952349841594696,
-0.006001651752740145,
0.014717658050358295,
-0.00785854272544384,
-0.2428174614906311,
-0.034916751086711884,
-0.06795517355203629,
-0.04259743168950081,
0.08835143595933914,
0.038177601993083954,
-0.005137791391462088,
0.014214372262358665,
-0.032901715487241745,
-0.009100011549890041,
0.039328187704086304,
0.052046455442905426,
0.14707715809345245,
0.03454501926898956,
0.10808394849300385,
-0.021844398230314255,
-0.018336093053221703,
0.03440335765480995,
0.00257870857603848,
0.20798668265342712,
0.009215275757014751,
0.18855232000350952,
0.04121636971831322,
0.14536282420158386,
0.015990229323506355,
0.03741816058754921,
0.005638800095766783,
0.011670581996440887,
0.019053025171160698,
-0.05452755466103554,
-0.05134586617350578,
0.014452609233558178,
0.1125936508178711,
0.025800613686442375,
-0.09308072179555893,
0.002358578145503998,
0.012025394476950169,
0.3595947027206421,
0.06167171522974968,
-0.2742220461368561,
-0.08723434805870056,
0.006266193930059671,
-0.09158821403980255,
-0.053618431091308594,
0.025337256491184235,
0.12834419310092926,
-0.08342844247817993,
0.08540914207696915,
-0.05962800979614258,
0.09282765537500381,
-0.06934083998203278,
0.007574557326734066,
0.03231736645102501,
0.12176435440778732,
0.011225070804357529,
0.047678519040346146,
-0.2709769308567047,
0.26561808586120605,
-0.0007416065782308578,
0.09745465964078903,
-0.04166797921061516,
0.044219665229320526,
0.04654628038406372,
-0.016059963032603264,
0.05588279664516449,
-0.014890288934111595,
-0.11114299297332764,
-0.12676209211349487,
-0.112175852060318,
0.02598528377711773,
0.09072136878967285,
-0.018239831551909447,
0.09541697055101395,
-0.02980082482099533,
-0.0344371534883976,
0.04222419857978821,
-0.07136692106723785,
-0.13131214678287506,
-0.09361666440963745,
0.031159797683358192,
0.07503556460142136,
0.07308486104011536,
-0.09048595279455185,
-0.10551334917545319,
-0.05969042703509331,
0.12144220620393753,
-0.11233042180538177,
-0.028929024934768677,
-0.12157563120126724,
0.05158839002251625,
0.14661632478237152,
-0.07028575241565704,
0.0402592271566391,
0.008736890740692616,
0.13711166381835938,
0.011766849085688591,
-0.02837684005498886,
0.09137089550495148,
-0.07790470123291016,
-0.19104914367198944,
-0.04012249782681465,
0.1801808625459671,
0.031056761741638184,
0.06841646879911423,
-0.015583249740302563,
0.032633472234010696,
-0.011195892468094826,
-0.05476181581616402,
0.07105939835309982,
0.07855876535177231,
-0.009677937254309654,
0.04250074923038483,
-0.011904249899089336,
-0.06329316645860672,
-0.07116740196943283,
-0.05861533060669899,
0.13905209302902222,
0.2548410892486572,
-0.07126061618328094,
0.07679283618927002,
0.06021152064204216,
-0.06802839785814285,
-0.1472318470478058,
-0.00682780472561717,
0.12963518500328064,
0.04774974659085274,
-0.016930045560002327,
-0.20962083339691162,
-0.0011507287854328752,
0.060515958815813065,
-0.02168053202331066,
0.07729532569646835,
-0.3129158616065979,
-0.1385566145181656,
0.11282259970903397,
0.07176773995161057,
0.00927567295730114,
-0.1326429545879364,
-0.0641416534781456,
-0.011409426108002663,
-0.08221668750047684,
0.046322356909513474,
-0.0247177891433239,
0.1246228814125061,
-0.005264870822429657,
0.05765097588300705,
0.018450023606419563,
-0.03966255858540535,
0.1337670534849167,
-0.022233324125409126,
0.0355469174683094,
-0.009259209968149662,
0.05620884150266647,
-0.01632501184940338,
-0.07103443890810013,
0.022387854754924774,
-0.08875420689582825,
0.02215263806283474,
-0.1265544444322586,
-0.021333441138267517,
-0.08435571193695068,
0.032819028943777084,
-0.033871766179800034,
-0.002936192089691758,
-0.017921781167387962,
0.01617381162941456,
0.07392831891775131,
0.005409633740782738,
0.12182993441820145,
-0.061321698129177094,
0.12230566889047623,
0.1340378373861313,
0.12776991724967957,
-0.022142115980386734,
-0.10153466463088989,
-0.014526383019983768,
-0.015126723796129227,
0.056091491132974625,
-0.11454766243696213,
0.03776746243238449,
0.12972813844680786,
0.02774711698293686,
0.1475890576839447,
0.037460267543792725,
-0.09715639054775238,
0.020625941455364227,
0.044870197772979736,
-0.08860673010349274,
-0.16336576640605927,
-0.01464766450226307,
0.05379033833742142,
-0.11614669114351273,
-0.005002709571272135,
0.128911554813385,
-0.05064086243510246,
-0.010895886458456516,
0.013690243475139141,
0.03825618699193001,
-0.03596037998795509,
0.2146073281764984,
0.029222870245575905,
0.07728971540927887,
-0.10006430000066757,
0.07999877631664276,
0.05190572887659073,
-0.15050120651721954,
0.06167528033256531,
0.09678681939840317,
-0.04390646889805794,
-0.020281504839658737,
0.03615576773881912,
0.1099342331290245,
0.054886288940906525,
-0.0745840072631836,
-0.11120680719614029,
-0.14350879192352295,
0.09630316495895386,
0.12597313523292542,
0.02204110659658909,
0.03177318349480629,
-0.037117816507816315,
0.032196301966905594,
-0.08597030490636826,
0.10327745974063873,
0.09310555458068848,
0.05797259509563446,
-0.12143532186746597,
0.11751432716846466,
0.019612153992056847,
-0.0001290283544221893,
-0.002213866449892521,
-0.02522185631096363,
-0.08326780796051025,
0.021046075969934464,
-0.1440035104751587,
0.015816202387213707,
-0.04547019675374031,
0.0055748834274709225,
0.013030605390667915,
-0.06312267482280731,
-0.05144547298550606,
0.042092014104127884,
-0.10987138748168945,
-0.03331806883215904,
-0.027887562289834023,
0.06623007357120514,
-0.10966457426548004,
-0.013413612730801105,
0.02750532142817974,
-0.11843030899763107,
0.0958719253540039,
0.058466386049985886,
-0.010657471604645252,
0.028841372579336166,
-0.10670790076255798,
-0.02474972791969776,
0.03322326019406319,
0.012634510174393654,
0.01893361657857895,
-0.18135912716388702,
-0.0016162380343303084,
-0.005169184412807226,
0.01297331228852272,
-0.012083995155990124,
0.05915585160255432,
-0.10593343526124954,
-0.02753313072025776,
-0.04075270891189575,
-0.04541069641709328,
-0.05236072093248367,
0.06408382952213287,
0.07670531421899796,
0.037723835557699203,
0.16017256677150726,
-0.09584778547286987,
0.052539825439453125,
-0.214065283536911,
0.015373238362371922,
-0.020989984273910522,
-0.07464155554771423,
-0.07556121796369553,
-0.019441818818449974,
0.10258057713508606,
-0.05949237197637558,
0.04979550838470459,
-0.03960627689957619,
0.033205412328243256,
0.03452586755156517,
-0.11885502189397812,
-0.027013838291168213,
0.04062090069055557,
0.1635345220565796,
0.03936326131224632,
-0.013255071826279163,
0.08294682204723358,
-0.03389635309576988,
0.06289716064929962,
0.13752686977386475,
0.13507220149040222,
0.1641046106815338,
0.05621650815010071,
0.10723114013671875,
0.07359906286001205,
-0.0947713553905487,
-0.11851027607917786,
0.13729643821716309,
-0.05491563305258751,
0.14798474311828613,
-0.017200203612446785,
0.22104744613170624,
0.08627039194107056,
-0.17849132418632507,
0.06623644381761551,
-0.04197041317820549,
-0.07614050060510635,
-0.1100221574306488,
-0.07656578719615936,
-0.08795180916786194,
-0.17408870160579681,
0.008478846400976181,
-0.10244033485651016,
0.07002852857112885,
0.027731111273169518,
0.04836354777216911,
0.01777523197233677,
0.09471315145492554,
0.0462234765291214,
-0.01355016604065895,
0.12385562062263489,
0.008232451044023037,
-0.020231755450367928,
-0.05866916850209236,
-0.10390511155128479,
0.07275921106338501,
-0.012409222312271595,
0.06825020164251328,
-0.01723938249051571,
-0.07302656024694443,
0.05502147600054741,
-0.000656501273624599,
-0.10558518767356873,
0.0328686349093914,
-0.012886861339211464,
0.06622941046953201,
0.09775371104478836,
0.04594484716653824,
-0.0102658960968256,
-0.007214676588773727,
0.19471074640750885,
-0.07820000499486923,
-0.06477836519479752,
-0.12870506942272186,
0.1843818873167038,
-0.000208022422157228,
-0.015210597775876522,
0.03235707804560661,
-0.06387028098106384,
-0.0220834631472826,
0.15915097296237946,
0.11959078907966614,
-0.003539466764777899,
-0.014228113926947117,
0.0015629807021468878,
-0.012551993131637573,
-0.034921251237392426,
0.07804851233959198,
0.12137160450220108,
0.04094824194908142,
-0.04419289901852608,
-0.01930605247616768,
-0.032177411019802094,
-0.051522232592105865,
-0.03275608643889427,
0.06320884823799133,
-0.011609723791480064,
-0.026334097608923912,
-0.012386542744934559,
0.09380096942186356,
-0.04502009227871895,
-0.15922312438488007,
0.010673745535314083,
-0.1762053519487381,
-0.17697879672050476,
-0.03341049700975418,
0.09040577709674835,
0.04525652155280113,
0.029402878135442734,
-0.004053321201354265,
-0.013657920062541962,
0.1338253617286682,
-0.008677374571561813,
-0.038360413163900375,
-0.10840950161218643,
0.07945915311574936,
-0.12443725764751434,
0.15505348145961761,
-0.031051356345415115,
0.05018695071339607,
0.10592906922101974,
0.07195962220430374,
-0.08211217820644379,
0.03830019012093544,
0.06692869961261749,
-0.12216033786535263,
0.03995834290981293,
0.19260963797569275,
-0.03626728057861328,
0.14172735810279846,
0.040342509746551514,
-0.09136060625314713,
0.004331603180617094,
-0.071822389960289,
-0.05617082491517067,
-0.0438409149646759,
0.002587549854069948,
-0.04538764804601669,
0.12515898048877716,
0.17968346178531647,
-0.059171684086322784,
-0.024069173261523247,
-0.06139843910932541,
0.014312204904854298,
0.03485066816210747,
0.10913939774036407,
-0.03552111238241196,
-0.2674100995063782,
0.017558535560965538,
-0.015917997807264328,
0.026055876165628433,
-0.20350854098796844,
-0.09955164045095444,
0.007918242365121841,
-0.04945630207657814,
-0.06875605136156082,
0.11288122087717056,
0.05681051313877106,
0.04501071199774742,
-0.051234643906354904,
-0.0757930651307106,
-0.020369242876768112,
0.179460808634758,
-0.1637953519821167,
-0.0455336757004261
] |
null | null |
transformers
|
# MechDistilGPT2
## Table of Contents
- [Model Details](#model-details)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Environmental Impact](#environmental-impact)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
## Model Details
- **Model Description:**
This model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.
- **Developed by:** [Ashwin](https://huggingface.co/geralt)
- **Model Type:** Causal Language modeling
- **Language(s):** English
- **License:** [More Information Needed]
- **Parent Model:** See the [DistilGPT2 model](https://huggingface.co/distilgpt2) for more information about the distilled GPT-2 base model.
- **Resources for more information:**
- [Research Paper](https://arxiv.org/abs/2105.09680)
- [GitHub Repo](https://github.com/huggingface/notebooks/blob/master/examples/language_modeling.ipynb)
## Uses
#### Direct Use
The model can be used for tasks including topic classification, Causal Language modeling and text generation
#### Misuse and Out-of-scope Use
The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
## Risks, Limitations and Biases
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
## Training
#### Training Data
This model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.
#### Training Procedure
###### Fine-Tuning
* Default Training Args
* Epochs = 3
* Training set = 200k sentences
* Validation set = 40k sentences
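Concretely, a run with these settings could look like the sketch below. The tiny in-memory corpus stands in for the ~200k training / 40k validation sentences (which are not available here), and everything except the epoch count uses the `Trainer` defaults.
```python
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

# Tiny stand-in corpus; the actual run used ~200k training / 40k validation
# sentences scraped from mechanical/automotive PDF books.
corpus = Dataset.from_dict({"text": [
    "The crankshaft converts the pistons' linear motion into rotation.",
    "Disc brakes dissipate kinetic energy as heat in the rotor.",
]})

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

model = AutoModelForCausalLM.from_pretrained("distilgpt2")
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)  # causal LM, no masking

# Default training arguments apart from the epoch count stated above
args = TrainingArguments(output_dir="MechDistilGPT2", num_train_epochs=3)
Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator).train()
```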
###### Framework versions
* Transformers 4.7.0.dev0
* Pytorch 1.8.1+cu111
* Datasets 1.6.2
* Tokenizers 0.10.2
## Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More information needed]
- **Hours used:** [More information needed]
- **Cloud Provider:** [More information needed]
- **Compute Region:** [More information needed]
- **Carbon Emitted:** [More information needed]
## How to Get Started With the Model
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("geralt/MechDistilGPT2")
model = AutoModelForCausalLM.from_pretrained("geralt/MechDistilGPT2")
```
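Building on the snippet above, a short generation call is shown below; the prompt and decoding settings are illustrative and not taken from the card.
```python
inputs = tokenizer("The four-stroke engine cycle begins with", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```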
|
{"tags": ["Causal Language modeling", "text-generation", "CLM"], "model_index": [{"name": "MechDistilGPT2", "results": [{"task": {"name": "Causal Language modeling", "type": "Causal Language modeling"}}]}]}
|
text-generation
|
geralt/MechDistilGPT2
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"Causal Language modeling",
"CLM",
"arxiv:2105.09680",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2105.09680",
"1910.09700"
] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #Causal Language modeling #CLM #arxiv-2105.09680 #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# MechDistilGPT2
## Table of Contents
- Model Details
- Uses
- Risks, Limitations and Biases
- Training
- Environmental Impact
- How to Get Started With the Model
## Model Details
- Model Description:
This model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.
- Developed by: Ashwin
- Model Type: Causal Language modeling
- Language(s): English
- License:
- Parent Model: See the DistilGPT2 model for more information about the distilled GPT-2 base model.
- Resources for more information:
- Research Paper
- GitHub Repo
## Uses
#### Direct Use
The model can be used for tasks including topic classification, Causal Language modeling and text generation
#### Misuse and Out-of-scope Use
The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
## Risks, Limitations and Biases
CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.
Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).
## Training
#### Training Data
This model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.
#### Training Procedure
###### Fine-Tuning
* Default Training Args
* Epochs = 3
* Training set = 200k sentences
* Validation set = 40k sentences
###### Framework versions
* Transformers 4.7.0.dev0
* Pytorch 1.8.1+cu111
* Datasets 1.6.2
* Tokenizers 0.10.2
## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: [More information needed]
- Hours used: [More information needed]
- Cloud Provider: [More information needed]
- Compute Region: [More information needed]
- Carbon Emitted: [More information needed]
## How to Get Started With the Model
|
[
"# MechDistilGPT2",
"## Table of Contents\n- Model Details \n- Uses\n- Risks, Limitations and Biases\n- Training\n- Environmental Impact\n- How to Get Started With the Model",
"## Model Details\n- Model Description: \nThis model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.\n\n\n- Developed by: Ashwin\n\n- Model Type: Causal Language modeling\n- Language(s): English\n- License: \n- Parent Model: See the DistilGPT2model for more information about the Distilled-GPT2 base model.\n- Resources for more information:\n - Research Paper\n - GitHub Repo",
"## Uses",
"#### Direct Use\n\nThe model can be used for tasks including topic classification, Causal Language modeling and text generation",
"#### Misuse and Out-of-scope Use\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.",
"## Risks, Limitations and Biases\n\nCONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).",
"## Training",
"#### Training Data\n\nThis model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.",
"#### Training Procedure",
"###### Fine-Tuning\n\n* Default Training Args\n* Epochs = 3\n* Training set = 200k sentences\n* Validation set = 40k sentences",
"###### Framework versions\n\n* Transformers 4.7.0.dev0\n* Pytorch 1.8.1+cu111\n* Datasets 1.6.2\n* Tokenizers 0.10.2",
"# Environmental Impact\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: [More information needed]\n- Hours used: [More information needed]\n- Cloud Provider: [More information needed]\n- Compute Region: [More information needed\"]\n- Carbon Emitted: [More information needed]\n",
"## How to Get Started With the Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #Causal Language modeling #CLM #arxiv-2105.09680 #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# MechDistilGPT2",
"## Table of Contents\n- Model Details \n- Uses\n- Risks, Limitations and Biases\n- Training\n- Environmental Impact\n- How to Get Started With the Model",
"## Model Details\n- Model Description: \nThis model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.\n\n\n- Developed by: Ashwin\n\n- Model Type: Causal Language modeling\n- Language(s): English\n- License: \n- Parent Model: See the DistilGPT2model for more information about the Distilled-GPT2 base model.\n- Resources for more information:\n - Research Paper\n - GitHub Repo",
"## Uses",
"#### Direct Use\n\nThe model can be used for tasks including topic classification, Causal Language modeling and text generation",
"#### Misuse and Out-of-scope Use\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.",
"## Risks, Limitations and Biases\n\nCONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).",
"## Training",
"#### Training Data\n\nThis model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.",
"#### Training Procedure",
"###### Fine-Tuning\n\n* Default Training Args\n* Epochs = 3\n* Training set = 200k sentences\n* Validation set = 40k sentences",
"###### Framework versions\n\n* Transformers 4.7.0.dev0\n* Pytorch 1.8.1+cu111\n* Datasets 1.6.2\n* Tokenizers 0.10.2",
"# Environmental Impact\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: [More information needed]\n- Hours used: [More information needed]\n- Cloud Provider: [More information needed]\n- Compute Region: [More information needed\"]\n- Carbon Emitted: [More information needed]\n",
"## How to Get Started With the Model"
] |
[
75,
8,
35,
98,
3,
25,
76,
85,
2,
27,
5,
38,
37,
79,
9
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #Causal Language modeling #CLM #arxiv-2105.09680 #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# MechDistilGPT2## Table of Contents\n- Model Details \n- Uses\n- Risks, Limitations and Biases\n- Training\n- Environmental Impact\n- How to Get Started With the Model## Model Details\n- Model Description: \nThis model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.\n\n\n- Developed by: Ashwin\n\n- Model Type: Causal Language modeling\n- Language(s): English\n- License: \n- Parent Model: See the DistilGPT2model for more information about the Distilled-GPT2 base model.\n- Resources for more information:\n - Research Paper\n - GitHub Repo## Uses#### Direct Use\n\nThe model can be used for tasks including topic classification, Causal Language modeling and text generation#### Misuse and Out-of-scope Use\n\nThe model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.## Risks, Limitations and Biases\n\nCONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).## Training#### Training Data\n\nThis model is fine-tuned on text scraped from 100+ Mechanical/Automotive pdf books.#### Training Procedure###### Fine-Tuning\n\n* Default Training Args\n* Epochs = 3\n* Training set = 200k sentences\n* Validation set = 40k sentences"
] |
[
-0.04021003097295761,
0.056853581219911575,
-0.0034831170924007893,
0.036520346999168396,
0.11358537524938583,
-0.002769047161564231,
0.08211992681026459,
0.07953162491321564,
-0.024698685854673386,
0.044010721147060394,
-0.004428686574101448,
0.022166211158037186,
0.11463095992803574,
0.021637730300426483,
0.08870045840740204,
-0.23361124098300934,
0.07317283004522324,
-0.058575406670570374,
0.08635813742876053,
0.10549803823232651,
0.12726321816444397,
-0.03095654398202896,
0.09422872960567474,
0.012892508879303932,
-0.08082586526870728,
-0.03636214882135391,
-0.016533350571990013,
-0.04981333017349243,
0.11044789850711823,
0.073710598051548,
0.08603641390800476,
-0.034583985805511475,
0.07758603245019913,
-0.18426856398582458,
0.006483540870249271,
0.09402696043252945,
-0.024654125794768333,
0.06506643444299698,
0.07265398651361465,
-0.01496586762368679,
0.14205652475357056,
-0.07221589237451553,
0.0820116475224495,
0.07898098975419998,
-0.13786691427230835,
-0.1673230528831482,
-0.09090597927570343,
0.09736036509275436,
0.11437103897333145,
0.14351129531860352,
-0.06573238223791122,
0.04794418066740036,
-0.02081737481057644,
0.03502709046006203,
0.1372065246105194,
-0.09525734931230545,
-0.042084112763404846,
0.08608623594045639,
0.04824269562959671,
0.08165552467107773,
-0.0708656832575798,
-0.013651207089424133,
0.004274445120245218,
0.058737386018037796,
0.05400235578417778,
-0.014187787659466267,
-0.026036789640784264,
-0.01855994388461113,
-0.0975637435913086,
-0.02040216512978077,
0.11636944115161896,
0.004447640851140022,
-0.08466051518917084,
-0.18221351504325867,
-0.026882704347372055,
0.0771314725279808,
0.042891137301921844,
-0.0986250564455986,
0.018848147243261337,
0.018473336473107338,
0.0324837788939476,
-0.09692367166280746,
-0.10255902260541916,
0.0001081598165910691,
-0.026199333369731903,
0.15378481149673462,
0.006119360215961933,
0.028849637135863304,
-0.015036869794130325,
0.12545040249824524,
-0.054936639964580536,
-0.061877064406871796,
-0.08078408986330032,
-0.06670729070901871,
-0.11136005818843842,
-0.03891221806406975,
-0.04572887346148491,
-0.06442523002624512,
-0.05018455162644386,
0.016862653195858,
-0.1014518141746521,
0.04360930621623993,
-0.0019356401171535254,
0.03357761353254318,
0.07254161685705185,
0.14029362797737122,
-0.07172060012817383,
0.0030822311528027058,
-0.017484821379184723,
-0.005453433375805616,
0.008818604983389378,
-0.00537698483094573,
-0.06771504878997803,
-0.04811381176114082,
0.034488189965486526,
0.05290743708610535,
0.0009437095141038299,
0.057250991463661194,
-0.07165136188268661,
-0.04252389818429947,
0.040219444781541824,
-0.10260352492332458,
0.02077576145529747,
0.004684238228946924,
-0.05059877783060074,
0.015538856387138367,
0.045114923268556595,
0.014431966468691826,
-0.06600669771432877,
0.013093877583742142,
-0.044306810945272446,
0.012585924938321114,
-0.12403649836778641,
-0.0981006845831871,
0.014390455558896065,
0.04592214524745941,
-0.07263265550136566,
-0.07784212380647659,
-0.16940025985240936,
-0.09901706874370575,
0.07373596727848053,
-0.05517539754509926,
-0.024134866893291473,
-0.06335703283548355,
-0.027911866083741188,
-0.02079927548766136,
0.03674957901239395,
-0.03889145329594612,
0.0036653855349868536,
0.03769891336560249,
-0.0606955848634243,
0.04417891427874565,
0.06140120327472687,
0.05188450589776039,
-0.06049574911594391,
0.0025680623948574066,
-0.23669973015785217,
0.14528171718120575,
-0.039610523730516434,
-0.00523405522108078,
-0.09678566455841064,
-0.055742066353559494,
0.002073598327115178,
0.0712423324584961,
0.021929951384663582,
0.12253876030445099,
-0.1698385775089264,
-0.030308084562420845,
0.1849503368139267,
-0.08492495864629745,
-0.027966884896159172,
0.08978645503520966,
-0.0711449533700943,
0.1642250418663025,
0.12941902875900269,
-0.03897085413336754,
-0.002792239421978593,
-0.027819087728857994,
0.04167782887816429,
-0.04780369997024536,
-0.055475663393735886,
0.21407143771648407,
0.029940331354737282,
-0.07870439440011978,
-0.001271408167667687,
0.009028487838804722,
-0.04519196227192879,
-0.021085573360323906,
0.024089520797133446,
-0.04698731377720833,
0.009078252129256725,
0.02706283889710903,
0.06055031344294548,
-0.037718530744314194,
-0.05581069737672806,
-0.017381373792886734,
-0.1354418247938156,
-0.01829259842634201,
0.07542447000741959,
-0.006675477139651775,
0.023809857666492462,
-0.04607401415705681,
-0.019543077796697617,
0.03505373373627663,
0.01332911103963852,
-0.17249324917793274,
-0.0730365663766861,
0.024419914931058884,
-0.07888287305831909,
0.09347806125879288,
0.10085022449493408,
0.040142860263586044,
0.023547610267996788,
-0.06976702064275742,
-0.007742525078356266,
-0.03268227353692055,
-0.017544614151120186,
-0.05148827284574509,
-0.10600919276475906,
0.025056064128875732,
-0.031461168080568314,
0.14217154681682587,
-0.16119349002838135,
0.03717312216758728,
0.05972924828529358,
0.055059824138879776,
0.028014814481139183,
-0.0509101003408432,
0.010870260186493397,
-0.014257877133786678,
0.003347113961353898,
-0.06259242445230484,
0.03637707233428955,
0.024528907611966133,
-0.05523906648159027,
0.07482890039682388,
-0.1896848976612091,
-0.1220289021730423,
0.059603117406368256,
0.0473996102809906,
-0.1980735808610916,
-0.046519920229911804,
0.003208923852071166,
0.004536756314337254,
-0.12502022087574005,
-0.09302064031362534,
0.2654433846473694,
0.01682277023792267,
0.030088694766163826,
-0.08020871877670288,
-0.07277537882328033,
-0.03054407425224781,
-0.022973936051130295,
-0.022187067195773125,
0.03931765258312225,
-0.04813503846526146,
-0.23628486692905426,
0.1097245067358017,
0.03224620595574379,
-0.03002318926155567,
0.0798766165971756,
0.06534350663423538,
-0.045611634850502014,
-0.0492401197552681,
0.018231553956866264,
-0.021692121401429176,
0.08331290632486343,
-0.005199492443352938,
-0.007548299618065357,
0.01948319748044014,
0.009761378169059753,
-0.008152027614414692,
-0.1221969798207283,
0.05699566379189491,
0.06694962084293365,
-0.011928174644708633,
0.040632233023643494,
-0.025620169937610626,
0.015242690220475197,
0.1056833490729332,
0.037519197911024094,
0.02683272957801819,
0.04831099510192871,
-0.03308525308966637,
-0.16843825578689575,
0.15789981186389923,
-0.1109975129365921,
-0.24462909996509552,
-0.10142534971237183,
0.037814874202013016,
-0.01310014445334673,
0.010415027849376202,
0.022094599902629852,
-0.08182055503129959,
-0.05317653343081474,
-0.06547588855028152,
0.0853143185377121,
-0.03264276683330536,
-0.14226536452770233,
-0.01650678925216198,
-0.004537203814834356,
0.024176981300115585,
-0.10775118321180344,
-0.006206775549799204,
0.03571590036153793,
-0.12330233305692673,
0.038467127829790115,
0.02003110945224762,
0.014397463761270046,
0.10088968276977539,
0.036985550075769424,
-0.021046746522188187,
-0.026252558454871178,
0.20706872642040253,
-0.1262310892343521,
0.1002432331442833,
0.089960977435112,
-0.06068702042102814,
0.08750615268945694,
0.1317475587129593,
0.03211789205670357,
-0.029686739668250084,
0.03702884167432785,
0.034453801810741425,
-0.03764946386218071,
-0.22335897386074066,
-0.06089932098984718,
-0.019413575530052185,
-0.09150070697069168,
0.03995111584663391,
0.0340774767100811,
0.1621207296848297,
0.09201010316610336,
-0.08392928540706635,
-0.023922843858599663,
0.12909522652626038,
0.10264214873313904,
0.036147553473711014,
-0.029634157195687294,
0.06502646207809448,
0.0003961194306612015,
-0.03314422816038132,
0.05427224934101105,
-0.04061068221926689,
0.2600196897983551,
0.021233832463622093,
0.028458066284656525,
0.031269945204257965,
-0.043794505298137665,
0.0058619678020477295,
0.04457348585128784,
-0.03344176709651947,
0.016345679759979248,
-0.054647669196128845,
-0.04891043156385422,
-0.015181849710643291,
0.11187869310379028,
0.09288030862808228,
-0.025194518268108368,
-0.01923956163227558,
-0.08332783728837967,
0.08389033377170563,
0.13968124985694885,
-0.033702924847602844,
-0.14364634454250336,
-0.056217972189188004,
0.06321431696414948,
-0.012161624617874622,
-0.12586642801761627,
0.0017107705352827907,
0.05526547133922577,
-0.20721982419490814,
0.03542614355683327,
-0.03640291094779968,
0.076665960252285,
-0.07677492499351501,
0.002147952327504754,
0.008460533805191517,
0.11349538713693619,
-0.03285268321633339,
0.08577676862478256,
-0.18116043508052826,
0.12431453168392181,
0.003032323205843568,
0.04503920301795006,
-0.08106869459152222,
0.03985583409667015,
0.028662681579589844,
-0.006514126900583506,
0.1528593748807907,
0.054379209876060486,
-0.12651318311691284,
-0.05215327814221382,
0.0431407168507576,
0.0010493822628632188,
0.07858239859342575,
-0.08230361342430115,
0.07526259869337082,
-0.0006791189662180841,
0.02731599286198616,
-0.010491321794688702,
0.06407583504915237,
-0.12582412362098694,
-0.15379959344863892,
0.05190280079841614,
-0.07920219749212265,
0.1242293268442154,
-0.08123506605625153,
-0.033506687730550766,
-0.09467688202857971,
0.19887153804302216,
-0.08528012782335281,
-0.12421026825904846,
-0.11285664886236191,
-0.06115369498729706,
0.06778405606746674,
-0.07198683172464371,
0.016896046698093414,
0.008703019469976425,
0.13767656683921814,
-0.05449330434203148,
-0.034036409109830856,
0.07144421339035034,
-0.04834986478090286,
-0.12709292769432068,
-0.051909465342760086,
0.10055897384881973,
0.16156317293643951,
0.062235645949840546,
0.004470381885766983,
0.03740028291940689,
0.01286284625530243,
-0.10169762372970581,
0.022009631618857384,
0.22835397720336914,
0.0398360975086689,
0.07383216917514801,
-0.10761801898479462,
-0.08612826466560364,
-0.13040503859519958,
-0.07659229636192322,
0.14958412945270538,
0.09728787839412689,
-0.053518977016210556,
0.11736764013767242,
0.20451441407203674,
-0.10917633771896362,
-0.17827758193016052,
0.0491790845990181,
0.047007087618112564,
0.04576478153467178,
0.011586934328079224,
-0.25121942162513733,
0.00001678221087786369,
0.0576339028775692,
0.020100241526961327,
0.027793413028120995,
-0.18595407903194427,
-0.10991747677326202,
0.12687374651432037,
0.024767786264419556,
-0.030351171270012856,
-0.08241089433431625,
-0.030817771330475807,
-0.03256578743457794,
-0.12116966396570206,
0.09884357452392578,
-0.07674352824687958,
0.06952428817749023,
0.03359856829047203,
0.12600956857204437,
0.03657756745815277,
-0.01430295780301094,
0.14673559367656708,
-0.004027682822197676,
0.038459330797195435,
-0.13113752007484436,
-0.09272661060094833,
0.04486915469169617,
-0.0585634745657444,
0.11534660309553146,
0.028382519260048866,
-0.00206344248726964,
-0.05777111276984215,
-0.030377298593521118,
-0.11947754770517349,
0.012067317962646484,
-0.06811971962451935,
-0.08553226292133331,
-0.08934551477432251,
0.12394025176763535,
0.1017034649848938,
-0.06441833823919296,
-0.06308048218488693,
-0.07202912867069244,
0.11980193108320236,
0.04619049280881882,
0.16674594581127167,
0.07470636069774628,
-0.12516340613365173,
-0.007643573917448521,
-0.014650067314505577,
0.08244946599006653,
-0.06974516063928604,
0.008849918842315674,
0.07816872000694275,
-0.012512109242379665,
0.19494839012622833,
0.017885658890008926,
-0.16294437646865845,
-0.012911820784211159,
0.0016550053842365742,
-0.04710175842046738,
-0.16479472815990448,
-0.052579134702682495,
0.1519542932510376,
-0.13827446103096008,
-0.08243530988693237,
0.10409299284219742,
-0.05263899639248848,
-0.050900962203741074,
0.017341524362564087,
0.09711884707212448,
-0.03160468116402626,
0.062417980283498764,
-0.021471630781888962,
0.04428251460194588,
-0.05838324874639511,
0.08053255826234818,
0.06425007432699203,
-0.03784165158867836,
0.05434828996658325,
0.10134361684322357,
-0.08721616864204407,
-0.0414278507232666,
0.011959431692957878,
0.0913979634642601,
-0.09211646020412445,
-0.03693461790680885,
0.0816669762134552,
-0.18278944492340088,
0.01563618704676628,
0.06742684543132782,
0.016279054805636406,
0.02527819573879242,
-0.007328606676310301,
0.04522152990102768,
-0.014351880177855492,
0.03404100611805916,
0.03821560740470886,
0.012251059524714947,
-0.0241936556994915,
0.11906544864177704,
0.02867496944963932,
0.008259712718427181,
-0.007604176644235849,
-0.06394419074058533,
-0.14327172935009003,
-0.0010419490281492472,
-0.14265869557857513,
-0.04529331624507904,
-0.08700363337993622,
-0.0009862114675343037,
-0.011494084261357784,
-0.0029247920028865337,
0.06395388394594193,
0.03508353605866432,
-0.07603916525840759,
-0.048013996332883835,
-0.020757898688316345,
0.047067269682884216,
-0.09007465839385986,
-0.017814213410019875,
0.07780197262763977,
-0.07337375730276108,
0.07860340923070908,
0.04460890218615532,
-0.047882385551929474,
-0.005353660322725773,
-0.20272475481033325,
0.06553715467453003,
-0.02587893046438694,
-0.017756899818778038,
0.022006439045071602,
-0.2655273973941803,
-0.005942304152995348,
-0.03522487357258797,
-0.03143000602722168,
0.004092542454600334,
0.0002627215289976448,
-0.045607876032590866,
0.05651487410068512,
0.0006891177035868168,
-0.05039753019809723,
-0.08147836476564407,
0.05349737033247948,
0.05782625079154968,
0.020000625401735306,
0.10344469547271729,
-0.04154619202017784,
0.10027968138456345,
-0.1222744956612587,
0.00941743515431881,
0.03123178333044052,
0.024769112467765808,
0.048376668244600296,
-0.07999660819768906,
0.048012472689151764,
-0.007708278018981218,
0.11546657234430313,
-0.018352486193180084,
-0.060516033321619034,
0.042160771787166595,
0.01992003060877323,
-0.07329397648572922,
-0.004676405806094408,
-0.06033801659941673,
-0.05537401884794235,
-0.028525564819574356,
-0.03727731108665466,
-0.011601299047470093,
-0.039248351007699966,
-0.14482812583446503,
0.13477258384227753,
0.05395982414484024,
0.13781803846359253,
-0.0016267284518107772,
0.068557508289814,
-0.06031075865030289,
-0.01763605698943138,
0.00037282274570316076,
0.010375384241342545,
-0.050441544502973557,
-0.07431185245513916,
0.11809670180082321,
0.11499667167663574,
-0.13163334131240845,
0.15043190121650696,
-0.06224016845226288,
-0.05942843481898308,
-0.050340283662080765,
-0.1812853366136551,
-0.02439139597117901,
-0.00965964887291193,
-0.02640860714018345,
-0.09526079893112183,
0.04663621634244919,
0.009683202020823956,
0.004290648270398378,
-0.03599705547094345,
0.14680792391300201,
-0.0600169375538826,
-0.15372267365455627,
0.07470608502626419,
0.026439769193530083,
0.05067269876599312,
0.012545451521873474,
0.030400175601243973,
0.011155635118484497,
0.08137309551239014,
0.07453823834657669,
0.06606844067573547,
0.01996871456503868,
-0.02458568662405014,
-0.023057419806718826,
-0.009872527793049812,
0.01960107311606407,
-0.0027913276571780443,
-0.014736711978912354,
0.18872180581092834,
0.041007738560438156,
-0.017608875408768654,
0.004950494971126318,
0.1329483836889267,
-0.08248741924762726,
-0.12448601424694061,
-0.14497680962085724,
0.23802635073661804,
-0.0149042047560215,
-0.027146635577082634,
0.041964881122112274,
-0.1004183292388916,
0.05985286831855774,
0.12784284353256226,
0.17968884110450745,
-0.043795645236968994,
0.012068390846252441,
-0.014919710345566273,
0.003849203698337078,
0.024412618950009346,
0.029840724542737007,
0.06564367562532425,
0.23671209812164307,
-0.10139975696802139,
0.22125199437141418,
-0.01719229854643345,
-0.005267994478344917,
-0.022204404696822166,
0.10958362370729446,
-0.015192036516964436,
0.026087692007422447,
-0.06885181367397308,
0.10675613582134247,
-0.07663796097040176,
-0.21587595343589783,
-0.05476555973291397,
-0.03109039179980755,
-0.06666023284196854,
0.008182482793927193,
0.0010287076001986861,
0.029605289921164513,
0.10696499049663544,
0.045792803168296814,
-0.017161713913083076,
0.17364662885665894,
0.026799188926815987,
-0.04800315573811531,
-0.04758671671152115,
0.08374399691820145,
-0.06627806276082993,
0.17497198283672333,
0.02212666906416416,
0.06216522306203842,
0.09241059422492981,
-0.007167183794081211,
-0.06383069604635239,
0.07003746181726456,
0.0003556244191713631,
-0.04866398125886917,
-0.0021842012647539377,
0.15468183159828186,
0.0005348807899281383,
0.05226729065179825,
0.08591124415397644,
-0.021922463551163673,
0.09756199270486832,
0.06037723645567894,
-0.024342292919754982,
-0.09739477187395096,
0.10978855937719345,
-0.10497357696294785,
0.1309775561094284,
0.18918082118034363,
-0.027434637770056725,
-0.023563547059893608,
-0.06449409574270248,
-0.005842267069965601,
0.004462642595171928,
0.07985293865203857,
-0.018102401867508888,
-0.11002728343009949,
0.018056370317935944,
0.037664927542209625,
0.0931667611002922,
-0.1872759610414505,
-0.1044016182422638,
0.0454641655087471,
-0.04321925342082977,
-0.0009334649075753987,
0.06908278912305832,
0.037169259041547775,
0.024733396247029305,
0.0004384252242743969,
-0.08004851639270782,
-0.0015366500010713935,
0.14265114068984985,
-0.07447333633899689,
-0.0026267676148563623
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# biobert_v1.1_pubmed-finetuned-squad
This model is a fine-tuned version of [gerardozq/biobert_v1.1_pubmed-finetuned-squad](https://huggingface.co/gerardozq/biobert_v1.1_pubmed-finetuned-squad) on the squad_v2 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
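Since the card itself gives no usage details, here is a minimal, hedged inference sketch (the question and context strings are made-up illustrations, not taken from the training data):

```python
from transformers import pipeline

# Extractive question answering with the checkpoint named in this card.
qa = pipeline(
    "question-answering",
    model="gerardozq/biobert_v1.1_pubmed-finetuned-squad",
)

result = qa(
    question="What was BioBERT pre-trained on?",
    context="BioBERT is a biomedical language representation model "
            "pre-trained on PubMed abstracts and PMC full-text articles.",
)
print(result["answer"], round(result["score"], 3))
```

Because the fine-tuning data is SQuAD v2, which includes unanswerable questions, passing `handle_impossible_answer=True` to the pipeline call may be useful when the context might not contain an answer.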
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
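As a rough illustration only (not the author's actual training script), the hyperparameters above map onto `transformers.TrainingArguments` approximately as follows; the Adam betas/epsilon and the linear scheduler listed are already the library defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="biobert_v1.1_pubmed-finetuned-squad",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=1,
    lr_scheduler_type="linear",  # default, shown to mirror the list above
    seed=42,
)
```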
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad_v2"], "model-index": [{"name": "biobert_v1.1_pubmed-finetuned-squad", "results": []}]}
|
question-answering
|
gerardozq/biobert_v1.1_pubmed-finetuned-squad
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad_v2",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad_v2 #endpoints_compatible #region-us
|
# biobert_v1.1_pubmed-finetuned-squad
This model is a fine-tuned version of gerardozq/biobert_v1.1_pubmed-finetuned-squad on the squad_v2 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
|
[
"# biobert_v1.1_pubmed-finetuned-squad\n\nThis model is a fine-tuned version of gerardozq/biobert_v1.1_pubmed-finetuned-squad on the squad_v2 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1",
"### Framework versions\n\n- Transformers 4.12.3\n- Pytorch 1.9.0+cu111\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad_v2 #endpoints_compatible #region-us \n",
"# biobert_v1.1_pubmed-finetuned-squad\n\nThis model is a fine-tuned version of gerardozq/biobert_v1.1_pubmed-finetuned-squad on the squad_v2 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1",
"### Framework versions\n\n- Transformers 4.12.3\n- Pytorch 1.9.0+cu111\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
49,
56,
6,
12,
8,
3,
90,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #question-answering #generated_from_trainer #dataset-squad_v2 #endpoints_compatible #region-us \n# biobert_v1.1_pubmed-finetuned-squad\n\nThis model is a fine-tuned version of gerardozq/biobert_v1.1_pubmed-finetuned-squad on the squad_v2 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1### Framework versions\n\n- Transformers 4.12.3\n- Pytorch 1.9.0+cu111\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
-0.08447594195604324,
0.10482409596443176,
-0.002368863672018051,
0.07820463925600052,
0.14654402434825897,
0.02948390506207943,
0.13207519054412842,
0.1037808209657669,
-0.06803998351097107,
0.07994741201400757,
0.07733793556690216,
0.050932876765728,
0.037311896681785583,
0.08603612333536148,
-0.022094137966632843,
-0.25816187262535095,
0.0010628366144374013,
0.03722606226801872,
-0.08427954465150833,
0.1011868417263031,
0.08447283506393433,
-0.12127042561769485,
0.0649983212351799,
0.017991166561841965,
-0.16659234464168549,
0.04367659613490105,
-0.011777333915233612,
-0.026127031072974205,
0.11425893753767014,
0.013944440521299839,
0.12076196819543839,
0.008621707558631897,
0.13595405220985413,
-0.21114540100097656,
0.0074106911197304726,
0.07355096191167831,
0.026759563013911247,
0.074317067861557,
0.04163716360926628,
0.025787770748138428,
0.10751038044691086,
-0.14002026617527008,
0.08676812052726746,
0.01651313342154026,
-0.06710784882307053,
-0.14715608954429626,
-0.06391297280788422,
0.07720255106687546,
0.06789553910493851,
0.08794939517974854,
-0.004442108329385519,
0.1405286341905594,
-0.10002733021974564,
0.07436642795801163,
0.20135673880577087,
-0.2661706507205963,
-0.08334562182426453,
0.059913475066423416,
0.06505422294139862,
0.09004682302474976,
-0.10684046149253845,
-0.007945162244141102,
0.0461040697991848,
0.028535708785057068,
0.08653423935174942,
-0.033371452242136,
-0.0549672394990921,
0.009105253033339977,
-0.15040455758571625,
-0.010884715244174004,
0.14855621755123138,
0.03262089565396309,
-0.018311290070414543,
-0.07601296901702881,
-0.07181999087333679,
-0.054297178983688354,
-0.015917818993330002,
-0.07071057707071304,
0.03175247460603714,
-0.06758566945791245,
-0.07488318532705307,
-0.04925943911075592,
-0.07616816461086273,
-0.06858149915933609,
-0.014509719796478748,
0.11387010663747787,
0.055548492819070816,
0.0134284608066082,
-0.05060514435172081,
0.0948064774274826,
-0.01240076869726181,
-0.1120428517460823,
0.005928013473749161,
0.00900530070066452,
-0.08366253226995468,
-0.062379900366067886,
-0.04915270209312439,
-0.026353415101766586,
0.003268286120146513,
0.15320897102355957,
-0.047049712389707565,
0.05523373931646347,
0.05114753544330597,
-0.0015743292169645429,
-0.0008401550003327429,
0.13959112763404846,
-0.0748107060790062,
-0.046816136687994,
-0.010828646831214428,
0.08494758605957031,
-0.012867116369307041,
-0.011978582479059696,
-0.10135498642921448,
-0.01658589579164982,
0.1033540889620781,
0.041191451251506805,
-0.049661681056022644,
0.03080512210726738,
-0.000567030452657491,
-0.05442135035991669,
-0.025862758979201317,
-0.10873860120773315,
0.05317758768796921,
0.0027765880804508924,
-0.07390104979276657,
0.01605178788304329,
-0.010887045413255692,
-0.010990828275680542,
-0.0469011627137661,
0.11277155578136444,
-0.11126705259084702,
0.011431384831666946,
-0.08195369690656662,
-0.09763538837432861,
0.016071468591690063,
-0.09088432788848877,
-0.0027431428898125887,
-0.0677843987941742,
-0.13661064207553864,
-0.037492215633392334,
0.0395163856446743,
-0.04174220934510231,
-0.031768783926963806,
-0.032365091145038605,
-0.04660322517156601,
-0.0021366304717957973,
-0.00633727153763175,
0.1024107038974762,
-0.04147524759173393,
0.07578261941671371,
0.0032786400988698006,
0.03283348307013512,
0.012243606150150299,
0.042052313685417175,
-0.09051734954118729,
0.01987317204475403,
-0.11237460374832153,
0.045260608196258545,
-0.09472239017486572,
-0.005901258904486895,
-0.10871801525354385,
-0.12458579987287521,
0.005054968409240246,
-0.012595923617482185,
0.05998177453875542,
0.09221738576889038,
-0.16292279958724976,
-0.06269553303718567,
0.15646101534366608,
-0.06220313906669617,
-0.07019854336977005,
0.1179555356502533,
-0.055579423904418945,
0.011188282631337643,
0.06483344733715057,
0.182540163397789,
0.12863971292972565,
-0.15721331536769867,
-0.014999614097177982,
-0.004086455330252647,
0.08783091604709625,
0.004604021552950144,
0.04906763508915901,
0.005263046361505985,
0.0407073050737381,
0.007855160161852837,
-0.09863243252038956,
0.0021945296321064234,
-0.0992550477385521,
-0.08215811848640442,
-0.03843117132782936,
-0.0805370956659317,
0.070171058177948,
0.058724865317344666,
0.054314617067575455,
-0.05107734352350235,
-0.09173484891653061,
0.1484292596578598,
0.10491763055324554,
-0.08343362808227539,
0.00533662224188447,
-0.06569365411996841,
0.0559205561876297,
-0.0518890842795372,
-0.017400454729795456,
-0.1918143630027771,
-0.13674166798591614,
0.036880649626255035,
-0.07339518517255783,
0.05603545531630516,
0.07203590124845505,
0.06430090963840485,
0.07312050461769104,
-0.05300386622548103,
-0.0021982204634696245,
-0.09752678871154785,
-0.0024170696269720793,
-0.10594537109136581,
-0.18980687856674194,
-0.06346108019351959,
-0.01676693558692932,
0.1283496469259262,
-0.23142258822917938,
0.015245589427649975,
-0.030278615653514862,
0.12077899277210236,
0.002341712126508355,
-0.05153669789433479,
-0.037610895931720734,
0.07343332469463348,
-0.01567590795457363,
-0.06959281861782074,
0.05065134912729263,
-0.02129075862467289,
-0.062059272080659866,
-0.10106341540813446,
-0.1261368840932846,
0.07607811689376831,
0.07214665412902832,
0.014660889282822609,
-0.07735481858253479,
-0.030514439567923546,
-0.06865094602108002,
-0.04170112684369087,
-0.08875259011983871,
0.001043166616000235,
0.19775225222110748,
-0.005853098817169666,
0.11657889932394028,
-0.06829890608787537,
-0.06098224222660065,
0.012840470299124718,
0.0020865732803940773,
0.0042458800598979,
0.08669912815093994,
0.13881883025169373,
-0.11544843763113022,
0.10589344054460526,
0.07274938374757767,
-0.07825719565153122,
0.1560458242893219,
-0.04003112390637398,
-0.09090795367956161,
-0.030975013971328735,
-0.005408486817032099,
-0.022647380828857422,
0.152577742934227,
-0.14408868551254272,
-0.006924184039235115,
0.021973542869091034,
0.008840457536280155,
0.03946339711546898,
-0.17685523629188538,
-0.007052593398839235,
0.0232821237295866,
-0.02942730486392975,
-0.02459150180220604,
-0.025912616401910782,
0.033138155937194824,
0.10422439873218536,
0.013838812708854675,
-0.0199649166315794,
-0.005697478540241718,
0.0013295816024765372,
-0.09040093421936035,
0.2006857544183731,
-0.0832393541932106,
-0.1367180198431015,
-0.1003008484840393,
0.027394063770771027,
-0.08085947483778,
-0.02003464475274086,
0.013903920538723469,
-0.09616827219724655,
-0.03647523745894432,
-0.05609548091888428,
0.024200160056352615,
-0.03733973577618599,
-0.0007827534573152661,
0.02668377198278904,
-0.0031662925612181425,
0.06419511884450912,
-0.12861785292625427,
0.011188148520886898,
-0.05651901662349701,
-0.08990021049976349,
-0.007667290512472391,
0.05388380214571953,
0.1360607147216797,
0.1511487513780594,
-0.023487411439418793,
0.018497560173273087,
-0.033720746636390686,
0.24086569249629974,
-0.0643104761838913,
0.002526992466300726,
0.10858122259378433,
-0.0008857279899530113,
0.04423368722200394,
0.09900899231433868,
0.05720818042755127,
-0.08671591430902481,
0.008206583559513092,
0.08372431993484497,
-0.026353104040026665,
-0.22100567817687988,
-0.06721512973308563,
-0.04573751613497734,
-0.05656971037387848,
0.12215598672628403,
0.029870450496673584,
0.018738457933068275,
0.04890257120132446,
-0.0025303035508841276,
0.0767493024468422,
-0.034240614622831345,
0.07299356162548065,
0.1316107213497162,
0.02640235237777233,
0.1076449304819107,
-0.014524267055094242,
-0.07069237530231476,
0.0545618012547493,
0.016668619588017464,
0.27753570675849915,
-0.004458397626876831,
0.10893065482378006,
0.04637841880321503,
0.14204484224319458,
-0.019807644188404083,
0.049130331724882126,
-0.00044437553151510656,
-0.024928994476795197,
0.012078676372766495,
-0.055283322930336,
-0.015502803027629852,
0.011637079529464245,
0.019809069111943245,
0.02328646555542946,
-0.09379613399505615,
0.014223498292267323,
0.027523240074515343,
0.20582315325737,
0.020989328622817993,
-0.26135367155075073,
-0.09036892652511597,
0.00022021585027687252,
-0.026543384417891502,
-0.043590646237134933,
0.009401388466358185,
0.16526208817958832,
-0.11015978455543518,
0.02363225817680359,
-0.06745300441980362,
0.09426779299974442,
-0.03405284509062767,
0.011757497675716877,
0.051221564412117004,
0.12450307607650757,
-0.0014169532805681229,
0.09921541810035706,
-0.17273560166358948,
0.22588151693344116,
0.019224708899855614,
0.09520803391933441,
-0.08010773360729218,
0.01732662133872509,
-0.0033478226978331804,
0.07679958641529083,
0.09542295336723328,
0.01033795066177845,
0.0027459440752863884,
-0.17354510724544525,
-0.057448990643024445,
0.04769589751958847,
0.12796810269355774,
-0.026527173817157745,
0.09595856070518494,
-0.040878936648368835,
0.008040080778300762,
0.051376622170209885,
-0.06472277641296387,
-0.14370708167552948,
-0.1337447613477707,
0.028185779228806496,
-0.013764417730271816,
-0.08566982299089432,
-0.061130091547966,
-0.11282303929328918,
-0.055983543395996094,
0.18851496279239655,
-0.018872810527682304,
-0.026162777096033096,
-0.1344117373228073,
0.11654984951019287,
0.11665941029787064,
-0.08113787323236465,
-0.0004973100149072707,
0.02484080195426941,
0.11940159648656845,
0.00890879612416029,
-0.10072027891874313,
0.05058690160512924,
-0.07078026235103607,
-0.13400641083717346,
-0.06390318274497986,
0.12435571849346161,
0.06851375102996826,
0.053297389298677444,
0.0018791506299749017,
0.012641850858926773,
0.005215424578636885,
-0.11075885593891144,
0.010695049539208412,
0.04323355481028557,
0.05893001705408096,
0.036120470613241196,
-0.10582306981086731,
0.029742663726210594,
-0.0352194607257843,
0.022290799766778946,
0.13727477192878723,
0.17261303961277008,
-0.09259150922298431,
0.03934052586555481,
0.08745395392179489,
-0.08720460534095764,
-0.19317805767059326,
0.0679933950304985,
0.10767194628715515,
-0.0014389981515705585,
0.03111564740538597,
-0.20255112648010254,
0.15665404498577118,
0.12292886525392532,
-0.00009430837963009253,
0.07533136755228043,
-0.3392775356769562,
-0.1308862864971161,
0.07705748081207275,
0.12694115936756134,
0.04644518718123436,
-0.14653229713439941,
-0.03252242133021355,
-0.0034149603452533484,
-0.1348462849855423,
0.12948904931545258,
-0.07831677794456482,
0.10968673974275589,
-0.008889096789062023,
0.10660409182310104,
0.01839585229754448,
-0.050398699939250946,
0.12625059485435486,
0.04937633499503136,
0.06551333516836166,
-0.039086226373910904,
-0.037614598870277405,
0.10769999772310257,
-0.047671325504779816,
0.028002804145216942,
-0.01372736319899559,
0.04683566838502884,
-0.13981667160987854,
-0.03882378712296486,
-0.08037608861923218,
0.06645799428224564,
-0.05346514657139778,
-0.07044311612844467,
-0.05901961028575897,
0.057019174098968506,
0.05386275053024292,
-0.028645290061831474,
0.0710538923740387,
0.015962781384587288,
0.1184593141078949,
0.06165168434381485,
0.09182490408420563,
-0.03205494582653046,
-0.13217787444591522,
-0.011285338550806046,
-0.0006600287742912769,
0.07569685578346252,
-0.08021130412817001,
0.02055499143898487,
0.1567383110523224,
0.06981690973043442,
0.12959979474544525,
0.05375586822628975,
-0.034834425896406174,
-0.005559925921261311,
0.016227317973971367,
-0.1368044912815094,
-0.19190064072608948,
-0.0108585050329566,
-0.09152137488126755,
-0.14030449092388153,
0.04610290750861168,
0.09436267614364624,
-0.05337804555892944,
-0.016777008771896362,
-0.016072629019618034,
0.0013272605137899518,
-0.018250243738293648,
0.20594288408756256,
0.04705513268709183,
0.05861882120370865,
-0.08823111653327942,
0.07154097408056259,
0.07096294313669205,
-0.09415982663631439,
0.03146132454276085,
0.06464259326457977,
-0.08038199692964554,
-0.021307826042175293,
0.05623968690633774,
0.17768922448158264,
-0.0769123062491417,
-0.021333109587430954,
-0.10444730520248413,
-0.1021832749247551,
0.052133359014987946,
0.15447762608528137,
0.059277042746543884,
-0.041176725178956985,
-0.06537459790706635,
0.04540340229868889,
-0.15275150537490845,
0.09285499155521393,
0.029874475672841072,
0.07881855219602585,
-0.12388764321804047,
0.12442536652088165,
0.011415044777095318,
0.04648905247449875,
-0.025344595313072205,
0.01672639138996601,
-0.12437856197357178,
-0.01842067949473858,
-0.14339037239551544,
-0.026534074917435646,
-0.02353333681821823,
0.007354241795837879,
-0.01618935726583004,
-0.055881012231111526,
-0.05103229731321335,
0.04393621161580086,
-0.07544267922639847,
-0.03592296689748764,
0.023665858432650566,
0.04746861383318901,
-0.15811632573604584,
0.015442200005054474,
0.017787402495741844,
-0.08263655006885529,
0.07321325689554214,
0.05259246751666069,
0.026496028527617455,
0.03980905935168266,
-0.10529309511184692,
-0.028989898040890694,
0.029148798435926437,
0.05397528037428856,
0.09009154886007309,
-0.08152509480714798,
0.0026250071823596954,
-0.009812122210860252,
0.07155301421880722,
0.02384539134800434,
0.05230860784649849,
-0.10661089420318604,
0.0014073805650696158,
-0.06507281213998795,
-0.08239024132490158,
-0.05979141965508461,
0.038242798298597336,
0.079190194606781,
0.03475331515073776,
0.15861555933952332,
-0.07284825295209885,
0.062399208545684814,
-0.21576516330242157,
-0.0369584746658802,
0.022209811955690384,
-0.04405311867594719,
-0.03365738317370415,
-0.04466237127780914,
0.0714247077703476,
-0.04955756291747093,
0.13003423810005188,
0.00024255613971035928,
0.0966903567314148,
0.03162547945976257,
-0.008671928197145462,
-0.012784244492650032,
0.0030816004145890474,
0.1895562708377838,
0.05371074005961418,
-0.029190726578235626,
0.05776407569646835,
0.010252953507006168,
0.0614224411547184,
0.08435530215501785,
0.2268494814634323,
0.12334297597408295,
-0.04943384230136871,
0.03762770816683769,
0.07766035199165344,
-0.10024408251047134,
-0.1762067824602127,
0.0695793628692627,
-0.02829216606914997,
0.10960428416728973,
-0.029776334762573242,
0.1978502869606018,
0.1011655405163765,
-0.18354175984859467,
0.054290786385536194,
-0.045783720910549164,
-0.1260918825864792,
-0.10970587283372879,
-0.06257735937833786,
-0.07848825305700302,
-0.1164521872997284,
0.02194168046116829,
-0.14317113161087036,
0.018464913591742516,
0.05914923548698425,
0.02487368881702423,
0.00022359982540365309,
0.18229950964450836,
-0.013461535796523094,
0.012555183842778206,
0.04005473479628563,
0.027516396716237068,
-0.0028394130058586597,
-0.06583155691623688,
-0.036146122962236404,
0.03056950308382511,
-0.01186904776841402,
0.0778684988617897,
-0.06136588007211685,
-0.007081089075654745,
0.021182840690016747,
-0.01861601695418358,
-0.0468924306333065,
0.009726906195282936,
0.009404847398400307,
0.0313684344291687,
0.054296962916851044,
0.03091995231807232,
-0.008047446608543396,
-0.050856661051511765,
0.22996410727500916,
-0.05319256708025932,
-0.08758916705846786,
-0.12059076130390167,
0.23632138967514038,
0.02918066829442978,
0.0005218457663431764,
0.07021192461252213,
-0.09119750559329987,
0.0006375906523317099,
0.21988770365715027,
0.1604371964931488,
-0.0636284202337265,
-0.015329070389270782,
0.008239940740168095,
-0.008898471482098103,
-0.02473573014140129,
0.11904282122850418,
0.09089525043964386,
0.11078386753797531,
-0.051374856382608414,
-0.03745817393064499,
-0.02176491916179657,
-0.023680826649069786,
-0.0834459438920021,
0.07244499772787094,
0.07268588244915009,
0.005031130276620388,
-0.05237995460629463,
0.06638562679290771,
-0.016083501279354095,
-0.16564302146434784,
0.07348982244729996,
-0.16751186549663544,
-0.16770778596401215,
-0.028423430398106575,
0.10400690138339996,
-0.04339003562927246,
0.05763501301407814,
-0.023888645693659782,
-0.013787313364446163,
0.11173176765441895,
-0.007931682281196117,
-0.07467229664325714,
-0.10323810577392578,
0.10055510699748993,
-0.08573336899280548,
0.2149142175912857,
-0.017688976600766182,
0.07740425318479538,
0.11173797398805618,
0.03148217126727104,
-0.09407132863998413,
0.02823707088828087,
0.07525713741779327,
-0.1013101190328598,
0.005386603530496359,
0.14275166392326355,
-0.040475163608789444,
0.08265707641839981,
0.03651842102408409,
-0.13166511058807373,
-0.00995270162820816,
-0.026687055826187134,
-0.014080625958740711,
-0.07043496519327164,
0.011838269419968128,
-0.07559840381145477,
0.14298740029335022,
0.20812460780143738,
-0.016691027209162712,
0.0049669621512293816,
-0.08828898519277573,
0.029913825914263725,
0.06807530671358109,
0.12567202746868134,
-0.0432436540722847,
-0.22360259294509888,
0.011851204559206963,
-0.026399867609143257,
0.009221786633133888,
-0.2433549463748932,
-0.09184207767248154,
0.043387606739997864,
-0.03583141043782234,
-0.05099441856145859,
0.10025402903556824,
0.07456492632627487,
0.04706854373216629,
-0.042354751378297806,
-0.15373720228672028,
-0.05869060754776001,
0.13890624046325684,
-0.15635579824447632,
-0.044388093054294586
] |
null | null |
transformers
|
# German Electra Uncased
<img width="300px" src="https://raw.githubusercontent.com/German-NLP-Group/german-transformer-training/master/model_cards/german-electra-logo.png">
[¹]
## Version 2 Release
We released an improved version of this model. Version 1 was trained for 766,000 steps. For this new version we continued the training for an additional 734,000 steps. It therefore follows that version 2 was trained on a total of 1,500,000 steps. See "Evaluation of Version 2: GermEval18 Coarse" below for details.
## Model Info
This model is suitable for fine-tuning on many downstream tasks in German (Q&A, sentiment analysis, etc.).
It can be used as a drop-in replacement for **BERT** in most downstream tasks (**ELECTRA** is even implemented as an extended **BERT** class).
At the time of release (August 2020) this model was the best-performing publicly available German NLP model on various German evaluation benchmarks (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.
## Installation
This model has the special feature that it is **uncased** but does **not strip accents**.
We added this option to Transformers with [PR #6280](https://github.com/huggingface/transformers/pull/6280).
To use it you need Transformers version 3.1.0 or newer.
```bash
pip install transformers -U
```
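A minimal sketch of loading the checkpoint for fine-tuning on a downstream task (the classification head and `num_labels=2` are illustrative assumptions, e.g. for a binary GermEval18-style classifier):

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "german-nlp-group/electra-base-german-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# The published weights are a pretraining checkpoint; this attaches a fresh,
# randomly initialized classification head on top of the ELECTRA encoder.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)
```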
## Uncase and Umlauts ('Ö', 'Ä', 'Ü')
This model is uncased, which helps especially in domains where colloquial terms with incorrect capitalization are often used.
The special characters 'ö', 'ü', 'ä' are preserved through the `strip_accent=False` option, as this leads to improved precision.
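To make the uncased-but-accent-preserving behaviour concrete, here is a small sketch (the sample sentence is arbitrary):

```python
from transformers import AutoTokenizer

# Requires transformers >= 3.1.0 so that strip_accents=False from the
# tokenizer config is respected.
tokenizer = AutoTokenizer.from_pretrained("german-nlp-group/electra-base-german-uncased")

# The input is lowercased, but 'Ö', 'Ä', 'Ü' survive as 'ö', 'ä', 'ü'
# instead of being folded to plain vowels.
print(tokenizer.tokenize("Die BÄCKEREI ÖFFNET übermorgen."))
```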
## Creators
This model was trained and open sourced in conjunction with the [**German NLP Group**](https://github.com/German-NLP-Group) in equal parts by:
- [**Philip May**](https://May.la) - [Deutsche Telekom](https://www.telekom.de/)
- [**Philipp Reißel**](https://www.linkedin.com/in/philipp-reissel/) - [ambeRoad](https://amberoad.de/)
## Evaluation of Version 2: GermEval18 Coarse
We evaluated all language models on GermEval18 with the macro F1 score. For each model we performed an extensive automated hyperparameter search. With the best hyperparameters we fit each model multiple times on GermEval18, to cancel out random effects and obtain results of statistical relevance.

## Checkpoint evaluation
Since it is not guaranteed that the last checkpoint is the best, we evaluated the checkpoints on GermEval18. We found that the last checkpoint is indeed the best. The training was stable and did not overfit the text corpus.
## Pre-training details
### Data
- Cleaned Common Crawl Corpus 2019-09 German: [CC_net](https://github.com/facebookresearch/cc_net) (only head corpus, filtered for language_score > 0.98) - 62 GB
- German Wikipedia Article Pages Dump (20200701) - 5.5 GB
- German Wikipedia Talk Pages Dump (20200620) - 1.1 GB
- Subtitles - 823 MB
- News 2018 - 4.1 GB
The sentences were split with [SoMaJo](https://github.com/tsproisl/SoMaJo). We took the German Wikipedia Article Pages Dump 3x to oversample it. This approach was also used in a similar way in GPT-3 (Table 2.2).
More details can be found in [Preparing Datasets for German Electra on GitHub](https://github.com/German-NLP-Group/german-transformer-training)
### Electra Branch no_strip_accents
Because we do not want to strip accents in our training data, we made a change to Electra and used this repo: [Electra no_strip_accents](https://github.com/PhilipMay/electra/tree/no_strip_accents) (branch `no_strip_accents`). We then created the TF dataset with:
```bash
python build_pretraining_dataset.py --corpus-dir <corpus_dir> --vocab-file <dir>/vocab.txt --output-dir ./tf_data --max-seq-length 512 --num-processes 8 --do-lower-case --no-strip-accents
```
### The training
The training itself can be performed with the original Electra repo (no special changes needed).
We ran it with the following config:
<details>
<summary>The exact Training Config</summary>
<br/>debug False
<br/>disallow_correct False
<br/>disc_weight 50.0
<br/>do_eval False
<br/>do_lower_case True
<br/>do_train True
<br/>electra_objective True
<br/>embedding_size 768
<br/>eval_batch_size 128
<br/>gcp_project None
<br/>gen_weight 1.0
<br/>generator_hidden_size 0.33333
<br/>generator_layers 1.0
<br/>iterations_per_loop 200
<br/>keep_checkpoint_max 0
<br/>learning_rate 0.0002
<br/>lr_decay_power 1.0
<br/>mask_prob 0.15
<br/>max_predictions_per_seq 79
<br/>max_seq_length 512
<br/>model_dir gs://XXX
<br/>model_hparam_overrides {}
<br/>model_name 02_Electra_Checkpoints_32k_766k_Combined
<br/>model_size base
<br/>num_eval_steps 100
<br/>num_tpu_cores 8
<br/>num_train_steps 766000
<br/>num_warmup_steps 10000
<br/>pretrain_tfrecords gs://XXX
<br/>results_pkl gs://XXX
<br/>results_txt gs://XXX
<br/>save_checkpoints_steps 5000
<br/>temperature 1.0
<br/>tpu_job_name None
<br/>tpu_name electrav5
<br/>tpu_zone None
<br/>train_batch_size 256
<br/>uniform_generator False
<br/>untied_generator True
<br/>untied_generator_embeddings False
<br/>use_tpu True
<br/>vocab_file gs://XXX
<br/>vocab_size 32767
<br/>weight_decay_rate 0.01
</details>

Please note: *due to the GAN-like structure of Electra, the loss is not that meaningful.*
Training took about 7 days on a preemptible TPU v3-8. In total, the model went through approximately 10 epochs. For automatic recreation of preempted TPUs we used [tpunicorn](https://github.com/shawwn/tpunicorn). The total cost of training summed up to about $450 for one run. Data preprocessing and vocab creation needed approximately 500-1000 CPU hours. Servers were fully provided by [T-Systems on site services GmbH](https://www.t-systems-onsite.de/) and [ambeRoad](https://amberoad.de/).
Special thanks to [Stefan Schweter](https://github.com/stefan-it) for his feedback and for providing parts of the text corpus.
[¹]: Source for the picture [Pinterest](https://www.pinterest.cl/pin/371828512984142193/)
### Negative Results
We tried the following approaches, which we found had no positive influence:
- **Increased vocab size**: leads to more parameters and thus fewer examples/sec, while no visible performance gains were measured
- **Decreased batch size**: the original Electra was trained with a batch size of 16 per TPU core, whereas this model was trained with 32 per TPU core. We found that 32 leads to better results when you compare metrics over computation time
## License - The MIT License
Copyright 2020-2021 Philip May<br>
Copyright 2020-2021 Philipp Reissel
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
|
{"language": "de", "license": "mit", "tags": ["electra", "commoncrawl", "uncased", "umlaute", "umlauts", "german", "deutsch"], "thumbnail": "https://raw.githubusercontent.com/German-NLP-Group/german-transformer-training/master/model_cards/german-electra-logo.png"}
| null |
german-nlp-group/electra-base-german-uncased
|
[
"transformers",
"pytorch",
"electra",
"pretraining",
"commoncrawl",
"uncased",
"umlaute",
"umlauts",
"german",
"deutsch",
"de",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #electra #pretraining #commoncrawl #uncased #umlaute #umlauts #german #deutsch #de #license-mit #endpoints_compatible #region-us
|
# German Electra Uncased
<img width="300px" src="URL
[¹]
## Version 2 Release
We released an improved version of this model. Version 1 was trained for 766,000 steps. For this new version we continued the training for an additional 734,000 steps. It therefore follows that version 2 was trained on a total of 1,500,000 steps. See "Evaluation of Version 2: GermEval18 Coarse" below for details.
## Model Info
This model is suitable for fine-tuning on many downstream tasks in German (Q&A, sentiment analysis, etc.).
It can be used as a drop-in replacement for BERT in most downstream tasks (ELECTRA is even implemented as an extended BERT class).
At the time of release (August 2020) this model was the best-performing publicly available German NLP model on various German evaluation benchmarks (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.
## Installation
This model has the special feature that it is uncased but does not strip accents.
We added this option to Transformers with PR #6280.
To use it you need Transformers version 3.1.0 or newer.
## Uncase and Umlauts ('Ö', 'Ä', 'Ü')
This model is uncased, which helps especially in domains where colloquial terms with incorrect capitalization are often used.
The special characters 'ö', 'ü', 'ä' are preserved through the 'strip_accent=False' option, as this leads to improved precision.
## Creators
This model was trained and open sourced in conjunction with the German NLP Group in equal parts by:
- Philip May - Deutsche Telekom
- Philipp Reißel - ambeRoad
## Evaluation of Version 2: GermEval18 Coarse
We evaluated all language models on GermEval18 with the macro F1 score. For each model we performed an extensive automated hyperparameter search. With the best hyperparameters we fit each model multiple times on GermEval18, to cancel out random effects and obtain results of statistical relevance.
!GermEval18 Coarse Model Evaluation for Version 2
## Checkpoint evaluation
Since it is not guaranteed that the last checkpoint is the best, we evaluated the checkpoints on GermEval18. We found that the last checkpoint is indeed the best. The training was stable and did not overfit the text corpus.
## Pre-training details
### Data
- Cleaned Common Crawl Corpus 2019-09 German: CC_net (only head corpus, filtered for language_score > 0.98) - 62 GB
- German Wikipedia Article Pages Dump (20200701) - 5.5 GB
- German Wikipedia Talk Pages Dump (20200620) - 1.1 GB
- Subtitles - 823 MB
- News 2018 - 4.1 GB
The sentences were split with SoMaJo. We took the German Wikipedia Article Pages Dump 3x to oversample it. This approach was also used in a similar way in GPT-3 (Table 2.2).
More details can be found in Preparing Datasets for German Electra on GitHub
### Electra Branch no_strip_accents
Because we do not want to strip accents in our training data, we made a change to Electra and used this repo: Electra no_strip_accents (branch 'no_strip_accents'). We then created the TF dataset with:
### The training
The training itself can be performed with the original Electra repo (no special changes needed).
We ran it with the following config:
<details>
<summary>The exact Training Config</summary>
<br/>debug False
<br/>disallow_correct False
<br/>disc_weight 50.0
<br/>do_eval False
<br/>do_lower_case True
<br/>do_train True
<br/>electra_objective True
<br/>embedding_size 768
<br/>eval_batch_size 128
<br/>gcp_project None
<br/>gen_weight 1.0
<br/>generator_hidden_size 0.33333
<br/>generator_layers 1.0
<br/>iterations_per_loop 200
<br/>keep_checkpoint_max 0
<br/>learning_rate 0.0002
<br/>lr_decay_power 1.0
<br/>mask_prob 0.15
<br/>max_predictions_per_seq 79
<br/>max_seq_length 512
<br/>model_dir gs://XXX
<br/>model_hparam_overrides {}
<br/>model_name 02_Electra_Checkpoints_32k_766k_Combined
<br/>model_size base
<br/>num_eval_steps 100
<br/>num_tpu_cores 8
<br/>num_train_steps 766000
<br/>num_warmup_steps 10000
<br/>pretrain_tfrecords gs://XXX
<br/>results_pkl gs://XXX
<br/>results_txt gs://XXX
<br/>save_checkpoints_steps 5000
<br/>temperature 1.0
<br/>tpu_job_name None
<br/>tpu_name electrav5
<br/>tpu_zone None
<br/>train_batch_size 256
<br/>uniform_generator False
<br/>untied_generator True
<br/>untied_generator_embeddings False
<br/>use_tpu True
<br/>vocab_file gs://XXX
<br/>vocab_size 32767
<br/>weight_decay_rate 0.01
</details>
!Training Loss
Please note: *due to the GAN-like structure of Electra, the loss is not that meaningful.*
Training took about 7 days on a preemptible TPU v3-8. In total, the model went through approximately 10 epochs. For automatic recreation of preempted TPUs we used tpunicorn. The total cost of training summed up to about $450 for one run. Data preprocessing and vocab creation needed approximately 500-1000 CPU hours. Servers were fully provided by T-Systems on site services GmbH and ambeRoad.
Special thanks to Stefan Schweter for his feedback and for providing parts of the text corpus.
[¹]: Source for the picture Pinterest
### Negative Results
We tried the following approaches which we found had no positive influence:
- Increased vocab size: leads to more parameters and thus fewer examples/sec, while no visible performance gains were measured
- Decreased batch size: the original Electra was trained with a batch size of 16 per TPU core, whereas this model was trained with 32 per TPU core. We found that 32 leads to better results when you compare metrics over computation time
## License - The MIT License
Copyright 2020-2021 Philip May<br>
Copyright 2020-2021 Philipp Reissel
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
|
[
"# German Electra Uncased\n<img width=\"300px\" src=\"URL\n[¹]",
"## Version 2 Release\nWe released an improved version of this model. Version 1 was trained for 766,000 steps. For this new version we continued the training for an additional 734,000 steps. It therefore follows that version 2 was trained on a total of 1,500,000 steps. See \"Evaluation of Version 2: GermEval18 Coarse\" below for details.",
"## Model Info\nThis Model is suitable for training on many downstream tasks in German (Q&A, Sentiment Analysis, etc.).\n\nIt can be used as a drop-in replacement for BERT in most down-stream tasks (ELECTRA is even implemented as an extended BERT Class).\n\nAt the time of release (August 2020) this model is the best performing publicly available German NLP model on various German evaluation metrics (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.",
"## Installation\nThis model has the special feature that it is uncased but does not strip accents.\nThis possibility was added by us with PR #6280.\nTo use it you have to use Transformers version 3.1.0 or newer.",
"## Uncase and Umlauts ('Ö', 'Ä', 'Ü')\nThis model is uncased. This helps especially for domains where colloquial terms with uncorrect capitalization is often used.\n\nThe special characters 'ö', 'ü', 'ä' are included through the 'strip_accent=False' option, as this leads to an improved precision.",
"## Creators\nThis model was trained and open sourced in conjunction with the German NLP Group in equal parts by:\n- Philip May - Deutsche Telekom\n- Philipp Reißel - ambeRoad",
"## Evaluation of Version 2: GermEval18 Coarse\nWe evaluated all language models on GermEval18 with the F1 macro score. For each model we did an extensive automated hyperparameter search. With the best hyperparmeters we did fit the moodel multiple times on GermEval18. This is done to cancel random effects and get results of statistical relevance.\n\n!GermEval18 Coarse Model Evaluation for Version 2",
"## Checkpoint evaluation\nSince it it not guaranteed that the last checkpoint is the best, we evaluated the checkpoints on GermEval18. We found that the last checkpoint is indeed the best. The training was stable and did not overfit the text corpus.",
"## Pre-training details",
"### Data\n- Cleaned Common Crawl Corpus 2019-09 German: CC_net (Only head coprus and filtered for language_score > 0.98) - 62 GB\n- German Wikipedia Article Pages Dump (20200701) - 5.5 GB\n- German Wikipedia Talk Pages Dump (20200620) - 1.1 GB\n- Subtitles - 823 MB\n- News 2018 - 4.1 GB\n\nThe sentences were split with SojaMo. We took the German Wikipedia Article Pages Dump 3x to oversample. This approach was also used in a similar way in GPT-3 (Table 2.2).\n\nMore Details can be found here Preperaing Datasets for German Electra Github",
"### Electra Branch no_strip_accents\nBecause we do not want to stip accents in our training data we made a change to Electra and used this repo Electra no_strip_accents (branch 'no_strip_accents'). Then created the tf dataset with:",
"### The training\nThe training itself can be performed with the Original Electra Repo (No special case for this needed).\nWe run it with the following Config:\n\n<details>\n<summary>The exact Training Config</summary>\n<br/>debug False\n<br/>disallow_correct False\n<br/>disc_weight 50.0\n<br/>do_eval False\n<br/>do_lower_case True\n<br/>do_train True\n<br/>electra_objective True\n<br/>embedding_size 768\n<br/>eval_batch_size 128\n<br/>gcp_project None\n<br/>gen_weight 1.0\n<br/>generator_hidden_size 0.33333\n<br/>generator_layers 1.0\n<br/>iterations_per_loop 200\n<br/>keep_checkpoint_max 0\n<br/>learning_rate 0.0002\n<br/>lr_decay_power 1.0\n<br/>mask_prob 0.15\n<br/>max_predictions_per_seq 79\n<br/>max_seq_length 512\n<br/>model_dir gs://XXX\n<br/>model_hparam_overrides {}\n<br/>model_name 02_Electra_Checkpoints_32k_766k_Combined\n<br/>model_size base\n<br/>num_eval_steps 100\n<br/>num_tpu_cores 8\n<br/>num_train_steps 766000\n<br/>num_warmup_steps 10000\n<br/>pretrain_tfrecords gs://XXX\n<br/>results_pkl gs://XXX\n<br/>results_txt gs://XXX\n<br/>save_checkpoints_steps 5000\n<br/>temperature 1.0\n<br/>tpu_job_name None\n<br/>tpu_name electrav5\n<br/>tpu_zone None\n<br/>train_batch_size 256\n<br/>uniform_generator False\n<br/>untied_generator True\n<br/>untied_generator_embeddings False\n<br/>use_tpu True\n<br/>vocab_file gs://XXX\n<br/>vocab_size 32767\n<br/>weight_decay_rate 0.01\n </details>\n\n!Training Loss\n\nPlease Note: *Due to the GAN like strucutre of Electra the loss is not that meaningful*\n\nIt took about 7 Days on a preemtible TPU V3-8. In total, the Model went through approximately 10 Epochs. For an automatically recreation of a cancelled TPUs we used tpunicorn. The total cost of training summed up to about 450 $ for one run. The Data-pre processing and Vocab Creation needed approximately 500-1000 CPU hours. Servers were fully provided by T-Systems on site services GmbH, ambeRoad.\nSpecial thanks to Stefan Schweter for your feedback and providing parts of the text corpus.\n\n[¹]: Source for the picture Pinterest",
"### Negative Results\nWe tried the following approaches which we found had no positive influence:\n\n- Increased Vocab Size: Leads to more parameters and thus reduced examples/sec while no visible Performance gains were measured\n- Decreased Batch-Size: The original Electra was trained with a Batch Size per TPU Core of 16 whereas this Model was trained with 32 BS / TPU Core. We found out that 32 BS leads to better results when you compare metrics over computation time",
"## License - The MIT License\nCopyright 2020-2021 Philip May<br>\nCopyright 2020-2021 Philipp Reissel\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."
] |
[
"TAGS\n#transformers #pytorch #electra #pretraining #commoncrawl #uncased #umlaute #umlauts #german #deutsch #de #license-mit #endpoints_compatible #region-us \n",
"# German Electra Uncased\n<img width=\"300px\" src=\"URL\n[¹]",
"## Version 2 Release\nWe released an improved version of this model. Version 1 was trained for 766,000 steps. For this new version we continued the training for an additional 734,000 steps. It therefore follows that version 2 was trained on a total of 1,500,000 steps. See \"Evaluation of Version 2: GermEval18 Coarse\" below for details.",
"## Model Info\nThis Model is suitable for training on many downstream tasks in German (Q&A, Sentiment Analysis, etc.).\n\nIt can be used as a drop-in replacement for BERT in most down-stream tasks (ELECTRA is even implemented as an extended BERT Class).\n\nAt the time of release (August 2020) this model is the best performing publicly available German NLP model on various German evaluation metrics (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.",
"## Installation\nThis model has the special feature that it is uncased but does not strip accents.\nThis possibility was added by us with PR #6280.\nTo use it you have to use Transformers version 3.1.0 or newer.",
"## Uncase and Umlauts ('Ö', 'Ä', 'Ü')\nThis model is uncased. This helps especially for domains where colloquial terms with uncorrect capitalization is often used.\n\nThe special characters 'ö', 'ü', 'ä' are included through the 'strip_accent=False' option, as this leads to an improved precision.",
"## Creators\nThis model was trained and open sourced in conjunction with the German NLP Group in equal parts by:\n- Philip May - Deutsche Telekom\n- Philipp Reißel - ambeRoad",
"## Evaluation of Version 2: GermEval18 Coarse\nWe evaluated all language models on GermEval18 with the F1 macro score. For each model we did an extensive automated hyperparameter search. With the best hyperparmeters we did fit the moodel multiple times on GermEval18. This is done to cancel random effects and get results of statistical relevance.\n\n!GermEval18 Coarse Model Evaluation for Version 2",
"## Checkpoint evaluation\nSince it it not guaranteed that the last checkpoint is the best, we evaluated the checkpoints on GermEval18. We found that the last checkpoint is indeed the best. The training was stable and did not overfit the text corpus.",
"## Pre-training details",
"### Data\n- Cleaned Common Crawl Corpus 2019-09 German: CC_net (Only head coprus and filtered for language_score > 0.98) - 62 GB\n- German Wikipedia Article Pages Dump (20200701) - 5.5 GB\n- German Wikipedia Talk Pages Dump (20200620) - 1.1 GB\n- Subtitles - 823 MB\n- News 2018 - 4.1 GB\n\nThe sentences were split with SojaMo. We took the German Wikipedia Article Pages Dump 3x to oversample. This approach was also used in a similar way in GPT-3 (Table 2.2).\n\nMore Details can be found here Preperaing Datasets for German Electra Github",
"### Electra Branch no_strip_accents\nBecause we do not want to stip accents in our training data we made a change to Electra and used this repo Electra no_strip_accents (branch 'no_strip_accents'). Then created the tf dataset with:",
"### The training\nThe training itself can be performed with the Original Electra Repo (No special case for this needed).\nWe run it with the following Config:\n\n<details>\n<summary>The exact Training Config</summary>\n<br/>debug False\n<br/>disallow_correct False\n<br/>disc_weight 50.0\n<br/>do_eval False\n<br/>do_lower_case True\n<br/>do_train True\n<br/>electra_objective True\n<br/>embedding_size 768\n<br/>eval_batch_size 128\n<br/>gcp_project None\n<br/>gen_weight 1.0\n<br/>generator_hidden_size 0.33333\n<br/>generator_layers 1.0\n<br/>iterations_per_loop 200\n<br/>keep_checkpoint_max 0\n<br/>learning_rate 0.0002\n<br/>lr_decay_power 1.0\n<br/>mask_prob 0.15\n<br/>max_predictions_per_seq 79\n<br/>max_seq_length 512\n<br/>model_dir gs://XXX\n<br/>model_hparam_overrides {}\n<br/>model_name 02_Electra_Checkpoints_32k_766k_Combined\n<br/>model_size base\n<br/>num_eval_steps 100\n<br/>num_tpu_cores 8\n<br/>num_train_steps 766000\n<br/>num_warmup_steps 10000\n<br/>pretrain_tfrecords gs://XXX\n<br/>results_pkl gs://XXX\n<br/>results_txt gs://XXX\n<br/>save_checkpoints_steps 5000\n<br/>temperature 1.0\n<br/>tpu_job_name None\n<br/>tpu_name electrav5\n<br/>tpu_zone None\n<br/>train_batch_size 256\n<br/>uniform_generator False\n<br/>untied_generator True\n<br/>untied_generator_embeddings False\n<br/>use_tpu True\n<br/>vocab_file gs://XXX\n<br/>vocab_size 32767\n<br/>weight_decay_rate 0.01\n </details>\n\n!Training Loss\n\nPlease Note: *Due to the GAN like strucutre of Electra the loss is not that meaningful*\n\nIt took about 7 Days on a preemtible TPU V3-8. In total, the Model went through approximately 10 Epochs. For an automatically recreation of a cancelled TPUs we used tpunicorn. The total cost of training summed up to about 450 $ for one run. The Data-pre processing and Vocab Creation needed approximately 500-1000 CPU hours. Servers were fully provided by T-Systems on site services GmbH, ambeRoad.\nSpecial thanks to Stefan Schweter for your feedback and providing parts of the text corpus.\n\n[¹]: Source for the picture Pinterest",
"### Negative Results\nWe tried the following approaches which we found had no positive influence:\n\n- Increased Vocab Size: Leads to more parameters and thus reduced examples/sec while no visible Performance gains were measured\n- Decreased Batch-Size: The original Electra was trained with a Batch Size per TPU Core of 16 whereas this Model was trained with 32 BS / TPU Core. We found out that 32 BS leads to better results when you compare metrics over computation time",
"## License - The MIT License\nCopyright 2020-2021 Philip May<br>\nCopyright 2020-2021 Philipp Reissel\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."
] |
[
58,
21,
78,
136,
50,
88,
43,
100,
58,
5,
145,
68,
726,
114,
310
] |
[
"passage: TAGS\n#transformers #pytorch #electra #pretraining #commoncrawl #uncased #umlaute #umlauts #german #deutsch #de #license-mit #endpoints_compatible #region-us \n# German Electra Uncased\n<img width=\"300px\" src=\"URL\n[¹]## Version 2 Release\nWe released an improved version of this model. Version 1 was trained for 766,000 steps. For this new version we continued the training for an additional 734,000 steps. It therefore follows that version 2 was trained on a total of 1,500,000 steps. See \"Evaluation of Version 2: GermEval18 Coarse\" below for details.## Model Info\nThis Model is suitable for training on many downstream tasks in German (Q&A, Sentiment Analysis, etc.).\n\nIt can be used as a drop-in replacement for BERT in most down-stream tasks (ELECTRA is even implemented as an extended BERT Class).\n\nAt the time of release (August 2020) this model is the best performing publicly available German NLP model on various German evaluation metrics (CONLL03-DE, GermEval18 Coarse, GermEval18 Fine). For GermEval18 Coarse results see below. More will be published soon.## Installation\nThis model has the special feature that it is uncased but does not strip accents.\nThis possibility was added by us with PR #6280.\nTo use it you have to use Transformers version 3.1.0 or newer.## Uncase and Umlauts ('Ö', 'Ä', 'Ü')\nThis model is uncased. This helps especially for domains where colloquial terms with uncorrect capitalization is often used.\n\nThe special characters 'ö', 'ü', 'ä' are included through the 'strip_accent=False' option, as this leads to an improved precision.## Creators\nThis model was trained and open sourced in conjunction with the German NLP Group in equal parts by:\n- Philip May - Deutsche Telekom\n- Philipp Reißel - ambeRoad",
"passage: ## Evaluation of Version 2: GermEval18 Coarse\nWe evaluated all language models on GermEval18 with the F1 macro score. For each model we did an extensive automated hyperparameter search. With the best hyperparmeters we did fit the moodel multiple times on GermEval18. This is done to cancel random effects and get results of statistical relevance.\n\n!GermEval18 Coarse Model Evaluation for Version 2## Checkpoint evaluation\nSince it it not guaranteed that the last checkpoint is the best, we evaluated the checkpoints on GermEval18. We found that the last checkpoint is indeed the best. The training was stable and did not overfit the text corpus.## Pre-training details### Data\n- Cleaned Common Crawl Corpus 2019-09 German: CC_net (Only head coprus and filtered for language_score > 0.98) - 62 GB\n- German Wikipedia Article Pages Dump (20200701) - 5.5 GB\n- German Wikipedia Talk Pages Dump (20200620) - 1.1 GB\n- Subtitles - 823 MB\n- News 2018 - 4.1 GB\n\nThe sentences were split with SojaMo. We took the German Wikipedia Article Pages Dump 3x to oversample. This approach was also used in a similar way in GPT-3 (Table 2.2).\n\nMore Details can be found here Preperaing Datasets for German Electra Github### Electra Branch no_strip_accents\nBecause we do not want to stip accents in our training data we made a change to Electra and used this repo Electra no_strip_accents (branch 'no_strip_accents'). Then created the tf dataset with:"
] |
[
-0.08595561236143112,
0.07469285279512405,
-0.002903687534853816,
0.026860006153583527,
0.08950883150100708,
-0.014771746471524239,
0.10798262804746628,
0.06676065921783447,
0.02112850546836853,
0.09931674599647522,
-0.005492813885211945,
-0.07606561481952667,
0.03631618618965149,
0.0898762196302414,
0.07368408888578415,
-0.22344285249710083,
0.053313277661800385,
-0.09149042516946793,
0.03952084109187126,
0.050255682319402695,
0.11511009931564331,
-0.09990297257900238,
0.0665723979473114,
0.0001806090585887432,
-0.009363006800413132,
0.08243374526500702,
-0.06644771993160248,
-0.02369193732738495,
0.08203525096178055,
0.01214582845568657,
0.039598919451236725,
-0.0045494865626096725,
0.03144540637731552,
-0.1309434026479721,
0.0039701908826828,
0.06074938923120499,
0.0035197455435991287,
0.02301521599292755,
0.030972670763731003,
-0.0339960977435112,
0.13745222985744476,
-0.07263787090778351,
-0.013073917478322983,
0.03421700745820999,
-0.11213868856430054,
-0.03264786675572395,
-0.11902782320976257,
0.1347716748714447,
0.059789419174194336,
0.02983267977833748,
-0.035915911197662354,
0.026114994660019875,
0.0006559030152857304,
0.0016607230063527822,
0.1432298868894577,
-0.2713576853275299,
-0.040137410163879395,
0.07146501541137695,
0.030356597155332565,
0.08359155803918839,
-0.08716004341840744,
0.013299327343702316,
0.05606430396437645,
0.011268791742622852,
-0.06206151470541954,
-0.008411355316638947,
0.13246798515319824,
-0.029630491510033607,
-0.14645399153232574,
-0.05706503987312317,
0.1023716926574707,
0.0318966768682003,
-0.13350805640220642,
-0.09670880436897278,
-0.024678830057382584,
-0.00819282978773117,
0.020399417728185654,
-0.058040741831064224,
0.0278940312564373,
0.014141973108053207,
0.039560332894325256,
-0.07867730408906937,
-0.11119586229324341,
0.03929918631911278,
-0.023812148720026016,
0.29959771037101746,
0.017768658697605133,
0.007571037858724594,
0.08107538521289825,
0.09031486511230469,
-0.16306492686271667,
-0.07719730585813522,
-0.08313434571027756,
-0.035999689251184464,
-0.06778868287801743,
-0.03657364100217819,
-0.03959399461746216,
-0.12010572850704193,
0.005510643124580383,
0.16764667630195618,
-0.019827645272016525,
0.0239490307867527,
0.02479967474937439,
0.032656583935022354,
0.07912670075893402,
0.1235761046409607,
-0.03055259771645069,
-0.07532719522714615,
-0.00450444221496582,
-0.01434597559273243,
0.05947030708193779,
0.037132248282432556,
-0.042120348662137985,
0.006163676735013723,
0.022064782679080963,
0.020225176587700844,
-0.026030538603663445,
-0.000624433159828186,
-0.048020556569099426,
-0.025173425674438477,
0.15672016143798828,
-0.12514084577560425,
-0.010297233238816261,
-0.027201542630791664,
-0.03846012055873871,
0.056468524038791656,
0.009405622258782387,
-0.018809696659445763,
-0.07690202444791794,
0.10006333887577057,
-0.049986016005277634,
-0.037893496453762054,
-0.07793596386909485,
-0.10417458415031433,
0.019080182537436485,
-0.09735362231731415,
-0.04145825654268265,
-0.10423107445240021,
-0.11060851812362671,
-0.057352885603904724,
0.040639862418174744,
0.02666977234184742,
-0.004452275112271309,
-0.01615046337246895,
-0.020454270765185356,
-0.007265729364007711,
-0.014321882277727127,
-0.10741216689348221,
-0.035991959273815155,
-0.013048307970166206,
-0.03562449291348457,
0.031522348523139954,
-0.04939579591155052,
0.002986484207212925,
-0.1059403121471405,
-0.0009206330869346857,
-0.24130073189735413,
0.03909793868660927,
-0.11436110734939575,
-0.02767937071621418,
-0.09068645536899567,
-0.04264641925692558,
-0.027850046753883362,
0.040745727717876434,
0.05822787061333656,
0.10002367198467255,
-0.1271941363811493,
-0.041433434933423996,
0.10860106348991394,
-0.14787228405475616,
-0.003087778575718403,
0.1291218101978302,
-0.0007918477058410645,
-0.020348189398646355,
0.08253923803567886,
0.04845057800412178,
0.1137852668762207,
-0.15238627791404724,
-0.1249234601855278,
-0.01998697593808174,
0.009260859340429306,
0.10244942456483841,
0.04905040189623833,
-0.056596022099256516,
0.04465879499912262,
0.03554346412420273,
-0.08439384400844574,
-0.03398609533905983,
-0.021098900586366653,
-0.009890962392091751,
-0.023287199437618256,
-0.021091055124998093,
0.05748024582862854,
0.004293318837881088,
-0.06308180093765259,
-0.06680866330862045,
-0.11539120972156525,
0.10377003997564316,
0.10206690430641174,
-0.028119515627622604,
-0.00663406727835536,
-0.046442821621894836,
0.04741594195365906,
-0.05399366468191147,
-0.02634469047188759,
-0.13595972955226898,
-0.10427005589008331,
0.04204174131155014,
-0.08914786577224731,
0.10410749912261963,
0.13401161134243011,
0.062074076384305954,
0.05338037759065628,
-0.0622064545750618,
-0.03759681433439255,
-0.1229865700006485,
-0.03298023343086243,
0.017192240804433823,
-0.08309097588062286,
-0.007890603505074978,
-0.04762250930070877,
0.042201172560453415,
-0.0947214663028717,
-0.026420455425977707,
0.041579749435186386,
0.14199298620224,
0.0022435314022004604,
-0.0899759829044342,
-0.022085942327976227,
0.017672238871455193,
-0.055861227214336395,
-0.053471069782972336,
-0.005206424742937088,
0.04099017009139061,
0.011052479036152363,
0.039661720395088196,
-0.09212858974933624,
-0.1381070911884308,
0.049833185970783234,
0.10044659674167633,
-0.08550579845905304,
-0.005600287578999996,
-0.09401482343673706,
-0.00506682600826025,
-0.10209627449512482,
-0.08245711028575897,
0.25849103927612305,
0.008529947139322758,
0.05681629851460457,
-0.06415468454360962,
-0.04732053354382515,
0.03247787430882454,
-0.011093948036432266,
-0.05214191600680351,
0.057557106018066406,
0.00557342916727066,
-0.09073415398597717,
0.050609417259693146,
0.02274469844996929,
-0.010706139728426933,
0.16980528831481934,
0.008443426340818405,
-0.09521251171827316,
-0.011191315948963165,
0.08476655930280685,
0.019852224737405777,
0.1154189258813858,
-0.031091518700122833,
-0.023534750565886497,
0.012723172083497047,
0.02380632981657982,
0.054691314697265625,
-0.052224740386009216,
0.06795775890350342,
-0.013349778950214386,
-0.027771547436714172,
0.06457257270812988,
0.02245856449007988,
-0.04781968519091606,
0.06839818507432938,
0.022928090766072273,
0.01467902772128582,
-0.039024196565151215,
-0.006535657681524754,
-0.05425579845905304,
0.14056044816970825,
-0.08948849141597748,
-0.22669871151447296,
-0.1624770164489746,
0.04868499934673309,
-0.13888941705226898,
0.026678822934627533,
0.03121526725590229,
-0.07414790242910385,
-0.11124755442142487,
-0.05364230275154114,
0.06313236057758331,
-0.0024140095338225365,
-0.013468766584992409,
-0.070134237408638,
0.0027499007992446423,
0.07732873409986496,
-0.1296454519033432,
-0.03519715368747711,
-0.0067892055958509445,
-0.08992615342140198,
-0.01103256642818451,
0.07346828281879425,
0.0624222531914711,
0.04231049865484238,
-0.05307000130414963,
-0.03132753074169159,
0.024113252758979797,
0.12157934904098511,
-0.09892815351486206,
0.04004910960793495,
0.10261649638414383,
-0.049681760370731354,
0.05110631883144379,
0.02434084191918373,
0.003397295717149973,
-0.0758104994893074,
0.04659580811858177,
0.07150442153215408,
-0.052437201142311096,
-0.2218627631664276,
-0.11439289897680283,
-0.05672891438007355,
-0.017463181167840958,
0.07331405580043793,
0.06376732140779495,
-0.06195393204689026,
0.016524603590369225,
-0.10116153210401535,
0.003819776698946953,
-0.030188530683517456,
0.07456488162279129,
0.046220093965530396,
0.013528792187571526,
0.03361499309539795,
-0.04747376963496208,
-0.05366633087396622,
0.13296857476234436,
-0.03921873867511749,
0.14767363667488098,
0.006796377710998058,
0.032543133944272995,
0.07775354385375977,
-0.02198236994445324,
0.014708095230162144,
0.07582944631576538,
0.010807719081640244,
-0.02928834781050682,
-0.019883494824171066,
-0.06850619614124298,
-0.023872848600149155,
0.09553500264883041,
0.023626744747161865,
-0.0028853658586740494,
-0.04768064618110657,
0.01891578733921051,
0.07646584510803223,
0.1681859791278839,
0.07310298085212708,
-0.12136019766330719,
-0.12047085911035538,
-0.00612702127546072,
-0.07913365960121155,
-0.07303009927272797,
-0.00383273814804852,
0.09510786831378937,
-0.0806337296962738,
0.09200036525726318,
0.0035349950194358826,
0.05606580525636673,
-0.10890771448612213,
-0.00952136144042015,
-0.06420337408781052,
0.12879256904125214,
-0.02028561569750309,
0.07473689317703247,
-0.12115731835365295,
0.11234749108552933,
0.0037284353747963905,
0.10252775251865387,
-0.028200697153806686,
0.027731552720069885,
0.027341635897755623,
-0.043126173317432404,
0.11583150923252106,
0.045250438153743744,
-0.017905864864587784,
0.042856357991695404,
-0.11491172015666962,
0.030194271355867386,
0.12707364559173584,
-0.0834536924958229,
0.0919574648141861,
0.006762316916137934,
0.018050534650683403,
-0.054286837577819824,
-0.010105044580996037,
-0.17376413941383362,
-0.07786600291728973,
0.05117305368185043,
-0.07483276724815369,
0.01055285707116127,
-0.036783043295145035,
-0.051956743001937866,
-0.09207050502300262,
0.31268972158432007,
-0.05796920508146286,
-0.041988443583250046,
-0.10654814541339874,
0.07156850397586823,
0.11835138499736786,
-0.05186722055077553,
0.038700394332408905,
0.028216958045959473,
0.18027564883232117,
-0.04907665401697159,
-0.05323943868279457,
-0.005075731780380011,
-0.09991440176963806,
-0.125100240111351,
-0.01156368013471365,
0.09671525657176971,
0.10712234675884247,
0.017679762095212936,
0.018126750364899635,
0.04716979339718819,
0.027446914464235306,
-0.11906561255455017,
0.043790511786937714,
0.06339920312166214,
0.0008811289444565773,
0.09780523180961609,
-0.07169800996780396,
-0.017823029309511185,
-0.015278542414307594,
-0.024991584941744804,
0.046695780009031296,
0.23072679340839386,
-0.024753253906965256,
0.08012567460536957,
0.1419839859008789,
-0.0803956463932991,
-0.2333107888698578,
-0.015852423384785652,
0.10470838844776154,
0.0745122954249382,
-0.03082427568733692,
-0.17188146710395813,
0.13666144013404846,
0.10215001553297043,
-0.014167249202728271,
0.030765417963266373,
-0.1275288164615631,
-0.11087428033351898,
0.06880339980125427,
-0.03243902325630188,
0.11652319878339767,
-0.05743817985057831,
-0.03470804542303085,
-0.04030868411064148,
0.020801959559321404,
0.02192366123199463,
-0.05298047512769699,
0.07192311435937881,
0.020382212474942207,
0.00951533392071724,
0.06199388951063156,
-0.017236020416021347,
0.13755440711975098,
0.06229657679796219,
0.006135052535682917,
-0.04451431706547737,
0.06030331552028656,
0.050822485238313675,
-0.03793470934033394,
0.13802701234817505,
0.00848142709583044,
0.04307475686073303,
-0.11787382513284683,
-0.03191147744655609,
-0.03234660252928734,
0.12797680497169495,
-0.012137778103351593,
-0.06163977086544037,
-0.0673021450638771,
0.08019360899925232,
0.02343793213367462,
0.036515627056360245,
0.07416859269142151,
-0.02897951565682888,
0.006178785115480423,
0.06741553544998169,
0.13079440593719482,
0.019358262419700623,
-0.10078996419906616,
0.000029348768293857574,
-0.014687641523778439,
0.040278948843479156,
-0.08056948333978653,
0.03469568118453026,
0.1081504225730896,
0.031231839209794998,
0.1042255312204361,
-0.006568351294845343,
-0.1328132301568985,
0.018590761348605156,
0.11512023210525513,
-0.16869650781154633,
-0.1030079573392868,
-0.031364865601062775,
0.023678801953792572,
-0.04126827418804169,
0.04759863018989563,
0.2049683928489685,
-0.08673106878995895,
-0.006963728927075863,
-0.001856176182627678,
0.015390729531645775,
-0.041505225002765656,
0.09679290652275085,
0.006407718174159527,
0.033704593777656555,
-0.043047502636909485,
0.15243443846702576,
0.04412315785884857,
-0.11537715047597885,
0.013336645439267159,
0.0068171219900250435,
-0.07240979373455048,
-0.03139437362551689,
-0.04186388850212097,
0.008705474436283112,
-0.08689916133880615,
-0.05281594395637512,
-0.08584518730640411,
-0.007673334330320358,
0.00923958234488964,
0.015939831733703613,
0.05558036267757416,
0.006037143990397453,
0.001839539036154747,
0.05157475918531418,
-0.04654574394226074,
0.029481124132871628,
0.09822682291269302,
0.07066550850868225,
-0.043932221829891205,
0.094144806265831,
-0.025429029017686844,
-0.011919129639863968,
-0.021018140017986298,
-0.014093512669205666,
-0.059207331389188766,
-0.0552077442407608,
-0.09376615285873413,
-0.02531331777572632,
0.0008721756748855114,
-0.03934268653392792,
0.020140554755926132,
-0.03244384378194809,
-0.024381456896662712,
0.0439249686896801,
-0.07060623168945312,
-0.029473688453435898,
-0.012806031852960587,
0.07263720035552979,
-0.08103759586811066,
0.021063871681690216,
0.05803728848695755,
-0.07856631278991699,
0.09447550773620605,
0.03356737643480301,
0.020522691309452057,
0.041328731924295425,
-0.046009767800569534,
-0.004804752767086029,
0.04801950603723526,
0.043021854013204575,
0.005557895638048649,
-0.03953060507774353,
0.04352133721113205,
0.03007386438548565,
0.011413684114813805,
-0.01605120301246643,
-0.004031152464449406,
-0.07794763147830963,
0.020699791610240936,
0.038670409470796585,
-0.021515285596251488,
-0.07725328952074051,
0.028018604964017868,
0.05503012612462044,
0.12118975818157196,
0.12077115476131439,
-0.08799467980861664,
-0.0341251865029335,
-0.08582354336977005,
0.013820512220263481,
0.012988988310098648,
-0.03154943510890007,
0.007508700247853994,
0.006408079527318478,
0.030843660235404968,
0.00112257176078856,
0.13059410452842712,
0.00478958897292614,
0.04924274981021881,
0.05556001886725426,
-0.03855838626623154,
-0.101503387093544,
0.013971487991511822,
0.06971115618944168,
0.02045491896569729,
0.01154027134180069,
-0.12581636011600494,
-0.016619082540273666,
-0.004144613165408373,
-0.05381403863430023,
0.1314477026462555,
0.08185571432113647,
0.01029128022491932,
0.06373307853937149,
0.03300384059548378,
-0.05793038010597229,
-0.0980672687292099,
0.03193594887852669,
-0.043028246611356735,
0.11733707040548325,
-0.010374406352639198,
0.07315413653850555,
0.13942694664001465,
-0.12574051320552826,
0.08972062170505524,
0.001513608731329441,
-0.01312121469527483,
-0.053949013352394104,
-0.05773674696683884,
-0.037588439881801605,
-0.09032069146633148,
0.05652506649494171,
-0.13117694854736328,
0.053689099848270416,
0.03944728150963783,
0.049734704196453094,
-0.016873663291335106,
0.10446593910455704,
-0.12874989211559296,
-0.10244202613830566,
0.10954529047012329,
-0.01308516226708889,
0.06149034574627876,
0.05557984858751297,
-0.034874897450208664,
-0.015934433788061142,
0.059857744723558426,
0.07716426253318787,
0.04006313160061836,
0.08870893716812134,
-0.00970035046339035,
-0.010762657970190048,
-0.04584715515375137,
-0.010628089308738708,
-0.040183812379837036,
0.004381924401968718,
0.09108957648277283,
0.04607119411230087,
-0.02502090111374855,
-0.005001685582101345,
0.1501373052597046,
-0.03842545300722122,
-0.08553194999694824,
-0.13050079345703125,
0.1556014120578766,
0.07542534172534943,
0.07744529843330383,
-0.013039134442806244,
-0.1171640083193779,
-0.010483612306416035,
0.0703660324215889,
0.04199924319982529,
0.06970766931772232,
0.0016195607604458928,
0.002260626293718815,
-0.0025091846473515034,
0.029802944511175156,
0.0418388769030571,
-0.032140716910362244,
0.26950448751449585,
-0.009495972655713558,
0.029661353677511215,
-0.03109898790717125,
0.00961016584187746,
-0.02747281640768051,
0.10512091219425201,
-0.04703667014837265,
-0.017358146607875824,
-0.09631867706775665,
0.09286293387413025,
-0.004198204725980759,
-0.25592759251594543,
-0.04381842166185379,
-0.03947816789150238,
-0.12005649507045746,
-0.027655743062496185,
-0.014310749247670174,
0.05947306379675865,
0.10617363452911377,
0.001531657762825489,
-0.039246637374162674,
0.1369447112083435,
-0.021505186334252357,
-0.04607342183589935,
-0.05843012034893036,
0.037284836173057556,
0.0057195378467440605,
0.17401328682899475,
0.011267703026533127,
0.05302081257104874,
0.06792332231998444,
-0.012140672653913498,
-0.09262712299823761,
-0.027965443208813667,
-0.009210830554366112,
-0.13685867190361023,
0.0346224308013916,
0.138689786195755,
-0.05492733418941498,
-0.010577536188066006,
0.058487020432949066,
-0.015300355851650238,
0.009705622680485249,
0.10730038583278656,
-0.01204790361225605,
-0.07780135422945023,
0.12121523916721344,
-0.08179985731840134,
0.13773652911186218,
0.13989168405532837,
0.00036689918488264084,
-0.012426063418388367,
-0.07989732921123505,
0.08631502091884613,
0.02796735055744648,
0.11014322191476822,
0.02970055118203163,
-0.1859557032585144,
0.001222985447384417,
0.000454634428024292,
-0.007643251214176416,
-0.18377158045768738,
-0.04261075332760811,
0.06641294062137604,
0.02680695615708828,
-0.03292322903871536,
0.1044023260474205,
0.033128589391708374,
-0.015433150343596935,
-0.02457171492278576,
-0.007172204554080963,
-0.001197297591716051,
0.049598708748817444,
-0.05678121745586395,
-0.07914681732654572
] |
null | null |
transformers
|
# SlovakBERT (base-sized model)
SlovakBERT is a pretrained model on the Slovak language using a masked language modeling (MLM) objective. This model is case-sensitive: it makes a difference between slovensko and Slovensko.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
**IMPORTANT**: The model was not trained on the “ and ” (direct quote) characters, so before tokenizing the text it is advised to replace all “ and ” (direct quote marks) with a single " (double quote mark).
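For illustration, a minimal preprocessing step could look like the sketch below (this helper is an assumption added for demonstration, not part of the official model release):
```python
def normalize_quotes(text: str) -> str:
    # Replace the curly “ and ” quote characters with a plain double quote
    # before handing the text to the tokenizer.
    return text.replace("“", '"').replace("”", '"')

print(normalize_quotes("Povedal: “Ahoj, svet!”"))
# Povedal: "Ahoj, svet!"
```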
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
from transformers import pipeline
unmasker = pipeline('fill-mask', model='gerulata/slovakbert')
unmasker("Deti sa <mask> na ihrisku.")
[{'sequence': 'Deti sa hrali na ihrisku.',
'score': 0.6355380415916443,
'token': 5949,
'token_str': ' hrali'},
{'sequence': 'Deti sa hrajú na ihrisku.',
'score': 0.14731724560260773,
'token': 9081,
'token_str': ' hrajú'},
{'sequence': 'Deti sa zahrali na ihrisku.',
'score': 0.05016357824206352,
'token': 32553,
'token_str': ' zahrali'},
{'sequence': 'Deti sa stretli na ihrisku.',
'score': 0.041727423667907715,
'token': 5964,
'token_str': ' stretli'},
{'sequence': 'Deti sa učia na ihrisku.',
'score': 0.01886524073779583,
'token': 18099,
'token_str': ' učia'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import RobertaTokenizer, RobertaModel
tokenizer = RobertaTokenizer.from_pretrained('gerulata/slovakbert')
model = RobertaModel.from_pretrained('gerulata/slovakbert')
text = "Text ktorý sa má embedovať."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import RobertaTokenizer, TFRobertaModel
tokenizer = RobertaTokenizer.from_pretrained('gerulata/slovakbert')
model = TFRobertaModel.from_pretrained('gerulata/slovakbert')
text = "Text ktorý sa má embedovať."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
Or extract information from the model like this:
```python
from transformers import pipeline
unmasker = pipeline('fill-mask', model='gerulata/slovakbert')
unmasker("Slovenské národne povstanie sa uskutočnilo v roku <mask>.")
[{'sequence': 'Slovenske narodne povstanie sa uskutočnilo v roku 1944.',
'score': 0.7383289933204651,
'token': 16621,
'token_str': ' 1944'},...]
```
# Training data
The SlovakBERT model was pretrained on these datasets:
- Wikipedia (326MB of text),
- OpenSubtitles (415MB of text),
- Oscar (4.6GB of text),
- Gerulata WebCrawl (12.7GB of text),
- Gerulata Monitoring (214 MB of text),
- blbec.online (4.5GB of text)
The text was then processed with the following steps:
- URL and email addresses were replaced with special tokens ("url", "email").
- Elongated interpunction was reduced (e.g. -- to -).
- Markdown syntax was deleted.
- All text content in braces (e.g. {...}) was eliminated to reduce the amount of markup and programming language text.
We segmented the resulting corpus into sentences and removed duplicates to get 181.6M unique sentences. In total, the final corpus has 19.35GB of text.
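As a rough illustration of these cleaning and deduplication steps, the sketch below shows how they might be implemented; the exact rules and tools used for SlovakBERT are not specified in this card, so every pattern and helper name here is an assumption:
```python
import re

URL_RE = re.compile(r"(https?://\S+|www\.\S+)")
EMAIL_RE = re.compile(r"\S+@\S+\.\S+")

def clean_line(line: str) -> str:
    line = URL_RE.sub("url", line)                 # URLs -> special token "url"
    line = EMAIL_RE.sub("email", line)             # e-mail addresses -> "email"
    line = re.sub(r"([.!?,-])\1+", r"\1", line)    # reduce elongated punctuation, e.g. "--" -> "-"
    line = re.sub(r"\{[^{}]*\}", "", line)         # drop content in braces (markup / code)
    # (Markdown stripping and sentence segmentation are omitted for brevity.)
    return line.strip()

def deduplicate(sentences):
    """Keep only the first occurrence of each unique sentence."""
    seen = set()
    for s in sentences:
        if s and s not in seen:
            seen.add(s)
            yield s

corpus = ["Navštívte www.example.sk -- hneď!", "Navštívte www.example.sk -- hneď!"]
print(list(deduplicate(clean_line(s) for s in corpus)))
# ['Navštívte url - hneď!']
```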
# Pretraining
The model was trained in **fairseq** on 4 x Nvidia A100 GPUs for 300K steps with a batch size of 512 and a sequence length of 512. The optimizer used is Adam with a learning rate of 5e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and \\(\epsilon = 1e-6\\), a weight decay of 0.01, dropout rate 0.1, learning rate warmup for 10k steps and linear decay of the learning rate after. We used 16-bit float precision.
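The original training ran in fairseq; the snippet below is only a rough PyTorch/transformers equivalent that makes the stated hyperparameters concrete. The placeholder model, the AdamW choice (Adam with decoupled weight decay), and the scheduler helper are assumptions, not the authors' actual code:
```python
import torch
from transformers import get_linear_schedule_with_warmup

# Placeholder network standing in for the RoBERTa-style model trained in fairseq.
model = torch.nn.Linear(768, 768)

# "Adam with a weight decay of 0.01" is approximated here with AdamW.
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=5e-4,
    betas=(0.9, 0.98),
    eps=1e-6,
    weight_decay=0.01,
)

# 10k warmup steps, then linear decay over the rest of the 300k total steps.
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=10_000,
    num_training_steps=300_000,
)

for step in range(3):  # in reality: 300k steps with batch size 512 and sequence length 512
    optimizer.zero_grad()
    loss = model(torch.randn(8, 768)).pow(2).mean()  # dummy loss for the sketch
    loss.backward()
    optimizer.step()
    scheduler.step()
```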
## About us
<a href="https://www.gerulata.com/">
<img width="300px" src="https://www.gerulata.com/assets/images/Logo_Blue.svg">
</a>
Gerulata Technologies is a tech company on a mission to provide tools for fighting disinformation and hostile propaganda.
At Gerulata, we focus on providing state-of-the-art AI-powered tools that empower human analysts and provide them with the ability to make informed decisions.
Our tools allow for the monitoring and analysis of online activity, as well as the detection and tracking of disinformation and hostile propaganda campaigns. With our products, our clients are better equipped to identify and respond to threats in real-time.
### BibTeX entry and citation info
If you find our resource or paper useful, please consider including the following citation in your paper.
- https://arxiv.org/abs/2109.15254
```
@misc{pikuliak2021slovakbert,
title={SlovakBERT: Slovak Masked Language Model},
author={Matúš Pikuliak and Štefan Grivalský and Martin Konôpka and Miroslav Blšták and Martin Tamajka and Viktor Bachratý and Marián Šimko and Pavol Balážik and Michal Trnka and Filip Uhlárik},
year={2021},
eprint={2109.15254},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
{"language": "sk", "license": "mit", "tags": ["SlovakBERT"], "datasets": ["wikipedia", "opensubtitles", "oscar", "gerulatawebcrawl", "gerulatamonitoring", "blbec.online"]}
|
fill-mask
|
gerulata/slovakbert
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"roberta",
"fill-mask",
"SlovakBERT",
"sk",
"dataset:wikipedia",
"dataset:opensubtitles",
"dataset:oscar",
"dataset:gerulatawebcrawl",
"dataset:gerulatamonitoring",
"dataset:blbec.online",
"arxiv:2109.15254",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.15254"
] |
[
"sk"
] |
TAGS
#transformers #pytorch #tf #safetensors #roberta #fill-mask #SlovakBERT #sk #dataset-wikipedia #dataset-opensubtitles #dataset-oscar #dataset-gerulatawebcrawl #dataset-gerulatamonitoring #dataset-blbec.online #arxiv-2109.15254 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# SlovakBERT (base-sized model)
SlovakBERT pretrained model on Slovak language using a masked language modeling (MLM) objective. This model is case-sensitive: it makes a difference between slovensko and Slovensko.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
IMPORTANT: The model was not trained on the “ and ” (direct quote) character -> so before tokenizing the text, it is advised to replace all “ and ” (direct quote marks) with a single "(double quote marks).
### How to use
You can use this model directly with a pipeline for masked language modeling:
Here is how to use this model to get the features of a given text in PyTorch:
and in TensorFlow:
Or extract information from the model like this:
# Training data
The SlovakBERT model was pretrained on these datasets:
- Wikipedia (326MB of text),
- OpenSubtitles (415MB of text),
- Oscar (4.6GB of text),
- Gerulata WebCrawl (12.7GB of text) ,
- Gerulata Monitoring (214 MB of text),
- URL (4.5GB of text)
The text was then processed with the following steps:
- URL and email addresses were replaced with special tokens ("url", "email").
- Elongated interpunction was reduced (e.g. -- to -).
- Markdown syntax was deleted.
- All text content in braces f.g was eliminated to reduce the amount of markup and programming language text.
We segmented the resulting corpus into sentences and removed duplicates to get 181.6M unique sentences. In total, the final corpus has 19.35GB of text.
# Pretraining
The model was trained in fairseq on 4 x Nvidia A100 GPUs for 300K steps with a batch size of 512 and a sequence length of 512. The optimizer used is Adam with a learning rate of 5e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and \\(\epsilon = 1e-6\\), a weight decay of 0.01, dropout rate 0.1, learning rate warmup for 10k steps and linear decay of the learning rate after. We used 16-bit float precision.
## About us
<a href="URL
<img width="300px" src="URL
</a>
Gerulata Technologies is a tech company on a mission to provide tools for fighting disinformation and hostile propaganda.
At Gerulata, we focus on providing state-of-the-art AI-powered tools that empower human analysts and provide them with the ability to make informed decisions.
Our tools allow for the monitoring and analysis of online activity, as well as the detection and tracking of disinformation and hostile propaganda campaigns. With our products, our clients are better equipped to identify and respond to threats in real-time.
### BibTeX entry and citation info
If you find our resource or paper is useful, please consider including the following citation in your paper.
- URL
|
[
"# SlovakBERT (base-sized model)\nSlovakBERT pretrained model on Slovak language using a masked language modeling (MLM) objective. This model is case-sensitive: it makes a difference between slovensko and Slovensko.",
"## Intended uses & limitations\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nIMPORTANT: The model was not trained on the “ and ” (direct quote) character -> so before tokenizing the text, it is advised to replace all “ and ” (direct quote marks) with a single \"(double quote marks).",
"### How to use\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nHere is how to use this model to get the features of a given text in PyTorch:\n\nand in TensorFlow:\n\nOr extract information from the model like this:",
"# Training data\nThe SlovakBERT model was pretrained on these datasets:\n\n- Wikipedia (326MB of text),\n- OpenSubtitles (415MB of text),\n- Oscar (4.6GB of text),\n- Gerulata WebCrawl (12.7GB of text) ,\n- Gerulata Monitoring (214 MB of text),\n- URL (4.5GB of text)\n\nThe text was then processed with the following steps:\n- URL and email addresses were replaced with special tokens (\"url\", \"email\").\n- Elongated interpunction was reduced (e.g. -- to -).\n- Markdown syntax was deleted.\n- All text content in braces f.g was eliminated to reduce the amount of markup and programming language text.\n\nWe segmented the resulting corpus into sentences and removed duplicates to get 181.6M unique sentences. In total, the final corpus has 19.35GB of text.",
"# Pretraining\nThe model was trained in fairseq on 4 x Nvidia A100 GPUs for 300K steps with a batch size of 512 and a sequence length of 512. The optimizer used is Adam with a learning rate of 5e-4, \\\\(\\beta_{1} = 0.9\\\\), \\\\(\\beta_{2} = 0.98\\\\) and \\\\(\\epsilon = 1e-6\\\\), a weight decay of 0.01, dropout rate 0.1, learning rate warmup for 10k steps and linear decay of the learning rate after. We used 16-bit float precision.",
"## About us\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>\n\nGerulata Technologies is a tech company on a mission to provide tools for fighting disinformation and hostile propaganda.\n\nAt Gerulata, we focus on providing state-of-the-art AI-powered tools that empower human analysts and provide them with the ability to make informed decisions. \n\nOur tools allow for the monitoring and analysis of online activity, as well as the detection and tracking of disinformation and hostile propaganda campaigns. With our products, our clients are better equipped to identify and respond to threats in real-time.",
"### BibTeX entry and citation info\nIf you find our resource or paper is useful, please consider including the following citation in your paper.\n- URL"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #roberta #fill-mask #SlovakBERT #sk #dataset-wikipedia #dataset-opensubtitles #dataset-oscar #dataset-gerulatawebcrawl #dataset-gerulatamonitoring #dataset-blbec.online #arxiv-2109.15254 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# SlovakBERT (base-sized model)\nSlovakBERT pretrained model on Slovak language using a masked language modeling (MLM) objective. This model is case-sensitive: it makes a difference between slovensko and Slovensko.",
"## Intended uses & limitations\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nIMPORTANT: The model was not trained on the “ and ” (direct quote) character -> so before tokenizing the text, it is advised to replace all “ and ” (direct quote marks) with a single \"(double quote marks).",
"### How to use\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nHere is how to use this model to get the features of a given text in PyTorch:\n\nand in TensorFlow:\n\nOr extract information from the model like this:",
"# Training data\nThe SlovakBERT model was pretrained on these datasets:\n\n- Wikipedia (326MB of text),\n- OpenSubtitles (415MB of text),\n- Oscar (4.6GB of text),\n- Gerulata WebCrawl (12.7GB of text) ,\n- Gerulata Monitoring (214 MB of text),\n- URL (4.5GB of text)\n\nThe text was then processed with the following steps:\n- URL and email addresses were replaced with special tokens (\"url\", \"email\").\n- Elongated interpunction was reduced (e.g. -- to -).\n- Markdown syntax was deleted.\n- All text content in braces f.g was eliminated to reduce the amount of markup and programming language text.\n\nWe segmented the resulting corpus into sentences and removed duplicates to get 181.6M unique sentences. In total, the final corpus has 19.35GB of text.",
"# Pretraining\nThe model was trained in fairseq on 4 x Nvidia A100 GPUs for 300K steps with a batch size of 512 and a sequence length of 512. The optimizer used is Adam with a learning rate of 5e-4, \\\\(\\beta_{1} = 0.9\\\\), \\\\(\\beta_{2} = 0.98\\\\) and \\\\(\\epsilon = 1e-6\\\\), a weight decay of 0.01, dropout rate 0.1, learning rate warmup for 10k steps and linear decay of the learning rate after. We used 16-bit float precision.",
"## About us\n<a href=\"URL\n\t<img width=\"300px\" src=\"URL\n</a>\n\nGerulata Technologies is a tech company on a mission to provide tools for fighting disinformation and hostile propaganda.\n\nAt Gerulata, we focus on providing state-of-the-art AI-powered tools that empower human analysts and provide them with the ability to make informed decisions. \n\nOur tools allow for the monitoring and analysis of online activity, as well as the detection and tracking of disinformation and hostile propaganda campaigns. With our products, our clients are better equipped to identify and respond to threats in real-time.",
"### BibTeX entry and citation info\nIf you find our resource or paper is useful, please consider including the following citation in your paper.\n- URL"
] |
[
116,
51,
95,
58,
202,
137,
142,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #roberta #fill-mask #SlovakBERT #sk #dataset-wikipedia #dataset-opensubtitles #dataset-oscar #dataset-gerulatawebcrawl #dataset-gerulatamonitoring #dataset-blbec.online #arxiv-2109.15254 #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# SlovakBERT (base-sized model)\nSlovakBERT pretrained model on Slovak language using a masked language modeling (MLM) objective. This model is case-sensitive: it makes a difference between slovensko and Slovensko.## Intended uses & limitations\nYou can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.\nIMPORTANT: The model was not trained on the “ and ” (direct quote) character -> so before tokenizing the text, it is advised to replace all “ and ” (direct quote marks) with a single \"(double quote marks).### How to use\nYou can use this model directly with a pipeline for masked language modeling:\n\n\n\nHere is how to use this model to get the features of a given text in PyTorch:\n\nand in TensorFlow:\n\nOr extract information from the model like this:"
] |
[
-0.08059541136026382,
0.03810124471783638,
-0.006586214527487755,
0.05819258093833923,
0.09734372049570084,
-0.02424590103328228,
0.1656661033630371,
0.045161403715610504,
0.055447567254304886,
0.07215909659862518,
0.07268457859754562,
0.058423418551683426,
0.02449628710746765,
0.15844720602035522,
0.04232151806354523,
-0.288804829120636,
0.06261594593524933,
-0.11372952163219452,
0.07137145102024078,
0.10616446286439896,
0.14667728543281555,
-0.07984370738267899,
0.021960049867630005,
0.028870131820440292,
0.04287939518690109,
0.022132588550448418,
0.019669508561491966,
-0.04506106674671173,
0.06853210180997849,
0.02932654693722725,
0.07170386612415314,
-0.014110676944255829,
0.07448258250951767,
-0.12495197355747223,
0.027907121926546097,
0.05233050510287285,
-0.014721190556883812,
0.052051424980163574,
0.07752873003482819,
-0.07370712608098984,
0.1274581253528595,
-0.15979337692260742,
0.06779004633426666,
0.009217264130711555,
-0.09004047513008118,
-0.1480550616979599,
-0.043218038976192474,
0.09467199444770813,
0.0602395199239254,
0.09865579754114151,
-0.05513768270611763,
0.15521620213985443,
-0.0162961408495903,
0.12066499888896942,
0.0556827075779438,
-0.20104068517684937,
-0.03400851786136627,
0.01756775751709938,
-0.015224349685013294,
0.053403571248054504,
-0.03449108079075813,
0.05290887877345085,
-0.07176251709461212,
0.008220269344747066,
0.12181006371974945,
-0.024328643456101418,
-0.05739700794219971,
-0.028312966227531433,
-0.17020902037620544,
-0.06740278005599976,
0.0956411212682724,
-0.02301030606031418,
-0.038493987172842026,
-0.19416481256484985,
-0.07758231461048126,
0.0836726725101471,
-0.005229754373431206,
-0.00026939422241412103,
-0.02667984925210476,
-0.01400759443640709,
0.051551688462495804,
-0.1263924390077591,
-0.0817461833357811,
-0.016119787469506264,
-0.010244686156511307,
0.15780030190944672,
0.036945659667253494,
0.0671444907784462,
-0.04113489389419556,
0.07439682632684708,
-0.10939186066389084,
-0.1330682933330536,
0.012581114657223225,
-0.05713250860571861,
-0.1247798427939415,
0.00590341305360198,
-0.028523191809654236,
-0.14373621344566345,
0.010330329649150372,
0.14600490033626556,
0.011493873782455921,
0.008203500881791115,
0.00042600836604833603,
0.04406072571873665,
0.04492412507534027,
0.13803036510944366,
-0.17236369848251343,
0.051932841539382935,
0.045242246240377426,
-0.02304231747984886,
0.06813711673021317,
-0.058795079588890076,
-0.08794987946748734,
-0.06642533838748932,
-0.0035862219519913197,
0.10185319185256958,
0.06055862084031105,
0.08133687824010849,
0.018766403198242188,
-0.06101277470588684,
0.04916154965758324,
-0.1542958915233612,
-0.0033686922397464514,
0.012796238996088505,
-0.03767576813697815,
0.006008950527757406,
0.007991369813680649,
0.018716871738433838,
-0.14128521084785461,
0.08104328066110611,
-0.024285776540637016,
0.018782321363687515,
-0.09262256324291229,
-0.18507328629493713,
0.031381234526634216,
0.0021222590003162622,
-0.044638317078351974,
-0.1048147901892662,
-0.1688883751630783,
-0.007228083908557892,
0.04063841328024864,
-0.03143196180462837,
0.06126277893781662,
-0.01694590598344803,
-0.018690595403313637,
0.018051620572805405,
0.016142982989549637,
-0.018179360777139664,
-0.023428110405802727,
0.013229207135736942,
-0.10619670152664185,
0.04334428906440735,
0.04303080961108208,
-0.014262174256145954,
-0.09770353138446808,
-0.041337937116622925,
-0.27920374274253845,
0.14357484877109528,
-0.06868948042392731,
-0.03953021019697189,
-0.05960310995578766,
-0.029360167682170868,
-0.012364785186946392,
-0.003932672087103128,
0.06567490100860596,
0.16582287847995758,
-0.1960768699645996,
0.022897178307175636,
0.22136247158050537,
-0.13692139089107513,
0.02789011411368847,
0.15305224061012268,
-0.0577569305896759,
-0.0031049689278006554,
0.10757209360599518,
0.134440615773201,
0.1600179523229599,
-0.19904981553554535,
0.05366640165448189,
0.053967393934726715,
-0.06997069716453552,
0.08776798099279404,
0.12182088196277618,
-0.06331804394721985,
0.023099251091480255,
0.018715795129537582,
-0.15397144854068756,
0.02231683023273945,
-0.026979470625519753,
-0.035389985889196396,
0.0058503649197518826,
0.011819478124380112,
0.10137724876403809,
-0.029801027849316597,
-0.03333992511034012,
-0.026564713567495346,
-0.1187615916132927,
0.00008465698920190334,
0.08887200802564621,
-0.038681428879499435,
0.028754569590091705,
-0.07386309653520584,
0.08973447978496552,
-0.043367113918066025,
-0.009190409444272518,
-0.1925686001777649,
-0.11741647869348526,
0.01439583022147417,
-0.06969955563545227,
0.06806281208992004,
-0.03036130592226982,
0.03964444249868393,
0.08641093969345093,
-0.02516753599047661,
0.0026341378688812256,
-0.009734940715134144,
-0.011364500969648361,
-0.0329907163977623,
-0.139298215508461,
-0.04073174670338631,
-0.0429074652493,
0.09087476879358292,
-0.012824064120650291,
0.03136714547872543,
-0.020929614081978798,
0.08349248021841049,
0.0023676517885178328,
-0.01985444314777851,
0.007062112912535667,
0.004104048945009708,
0.04686754196882248,
-0.04330965876579285,
0.015393959358334541,
0.021449580788612366,
-0.06614011526107788,
0.1397167444229126,
-0.15432994067668915,
-0.07465603947639465,
0.07322525978088379,
0.003230948466807604,
-0.059238556772470474,
-0.009659336879849434,
-0.01794772781431675,
0.03557261824607849,
-0.05741642042994499,
-0.065296970307827,
0.1330052763223648,
0.07216927409172058,
0.08896992355585098,
-0.10907997190952301,
-0.05050235241651535,
-0.008036538027226925,
-0.09915745258331299,
-0.025404730811715126,
0.08897143602371216,
-0.0034096234012395144,
-0.13789963722229004,
0.05016738548874855,
-0.043617766350507736,
0.0271840151399374,
0.24995940923690796,
0.02322845719754696,
-0.10388058423995972,
-0.06729188561439514,
0.04010118544101715,
-0.019915571436285973,
0.07025738060474396,
-0.010885700583457947,
0.033774103969335556,
0.07904370874166489,
-0.018549315631389618,
0.029133634641766548,
-0.055056266486644745,
0.019718749448657036,
0.03068087436258793,
-0.07803121209144592,
-0.0991864949464798,
0.06287892907857895,
0.015447033569216728,
0.11477809399366379,
-0.0018783446867018938,
0.06593119353055954,
-0.024438751861453056,
-0.05354855954647064,
-0.13644880056381226,
0.11736750602722168,
-0.15301640331745148,
-0.29789307713508606,
-0.19790491461753845,
0.047159481793642044,
-0.0798734501004219,
0.016480427235364914,
0.013923929072916508,
-0.0355236753821373,
-0.08656639605760574,
-0.05945542827248573,
0.07408735156059265,
0.02721647545695305,
-0.08893454819917679,
-0.08798723667860031,
0.001902637304738164,
-0.007037597708404064,
-0.10380852222442627,
-0.03414776176214218,
-0.032241083681583405,
-0.015927910804748535,
0.030489236116409302,
0.053545739501714706,
0.08639921993017197,
0.08500871807336807,
-0.020304517820477486,
0.02873324230313301,
-0.034363631159067154,
0.20003542304039001,
-0.07411699742078781,
0.10274267941713333,
0.21039505302906036,
-0.05085424706339836,
0.07483205199241638,
0.11911548674106598,
0.027973858639597893,
0.014429881237447262,
0.024851128458976746,
0.028959186747670174,
-0.06181292608380318,
-0.19758549332618713,
-0.15454824268817902,
-0.07848544418811798,
-0.09330252557992935,
0.0032704847399145365,
0.0012129596434533596,
0.028422394767403603,
0.01844177395105362,
-0.11832882463932037,
-0.11195885390043259,
0.11375173181295395,
0.08039604872465134,
0.12253981828689575,
-0.026148363947868347,
0.09996140748262405,
-0.03144579753279686,
-0.04832626134157181,
0.08249946683645248,
-0.08207142353057861,
0.14683055877685547,
-0.05276087671518326,
0.09239047020673752,
0.06637835502624512,
0.09555291384458542,
0.00765996053814888,
0.1167696863412857,
0.003269738517701626,
0.060151249170303345,
-0.001234313938766718,
-0.1350451111793518,
-0.009513523429632187,
0.005799334961920977,
0.007667378056794405,
-0.03513874113559723,
-0.056788668036460876,
0.10858647525310516,
0.09638507664203644,
0.19902770221233368,
-0.009509887546300888,
-0.1465878188610077,
-0.06024258956313133,
-0.01238277554512024,
0.018611032515764236,
-0.0682271346449852,
-0.059676364064216614,
0.13397885859012604,
-0.1604991853237152,
0.06517565995454788,
-0.009319842793047428,
0.07184876501560211,
0.008975332602858543,
0.0222625769674778,
-0.04431901499629021,
0.13199231028556824,
-0.030895298346877098,
0.13573801517486572,
-0.1990133672952652,
0.19554752111434937,
0.03603315353393555,
0.08588845282793045,
-0.11671704053878784,
0.0187237411737442,
0.03899259865283966,
0.03876391425728798,
0.1895553022623062,
0.04885024204850197,
-0.007382184732705355,
-0.06797314435243607,
-0.12454339116811752,
-0.014779745601117611,
0.0867745578289032,
-0.046190667897462845,
0.04399830475449562,
-0.028244145214557648,
-0.03722327575087547,
-0.024567654356360435,
0.003978235647082329,
-0.09499279409646988,
-0.12372511625289917,
0.05050342530012131,
-0.0002490127517376095,
-0.13788539171218872,
-0.043965894728899,
-0.057181112468242645,
0.005655347369611263,
0.04788948968052864,
-0.02354135550558567,
-0.13883906602859497,
-0.15069065988063812,
0.018866492435336113,
0.15965688228607178,
-0.1345447301864624,
0.03779999539256096,
-0.03631243482232094,
0.18840153515338898,
-0.057577792555093765,
-0.10281006246805191,
0.016125814989209175,
-0.09232620894908905,
-0.07254450768232346,
0.021687284111976624,
0.15959642827510834,
0.09558272361755371,
-0.0036806161515414715,
0.012229586951434612,
0.0545077845454216,
0.008238105103373528,
-0.13270732760429382,
-0.053735826164484024,
0.11918681114912033,
-0.003940561320632696,
0.12415173649787903,
-0.1147456020116806,
-0.1400318145751953,
-0.09283499419689178,
0.047145672142505646,
0.16494044661521912,
0.12159623950719833,
-0.09318189322948456,
0.1545848399400711,
0.2050151526927948,
-0.07990800589323044,
-0.24982808530330658,
-0.03584602475166321,
0.07771310955286026,
0.04743780940771103,
0.040574055165052414,
-0.1358329951763153,
0.027908695861697197,
-0.01423538476228714,
0.0051782806403934956,
-0.06619609147310257,
-0.2312329113483429,
-0.15420660376548767,
0.13308623433113098,
-0.0024429578334093094,
-0.016591280698776245,
-0.006308765150606632,
-0.04059759899973869,
-0.018863296136260033,
-0.008059816434979439,
0.04522676765918732,
-0.1235414445400238,
0.04071170091629028,
0.09107957035303116,
0.04608432576060295,
0.07972227782011032,
-0.053221091628074646,
0.09993108361959457,
0.014820844866335392,
0.04971925541758537,
-0.0864696279168129,
0.007732441183179617,
0.13003358244895935,
-0.04903806373476982,
0.1824886053800583,
0.052240028977394104,
0.06253375858068466,
-0.10526324808597565,
-0.0539042167365551,
-0.0825311616063118,
0.09208019822835922,
-0.04965982213616371,
-0.07423567771911621,
-0.021940110251307487,
0.06679099053144455,
0.09456494450569153,
-0.0365091897547245,
-0.10217631608247757,
-0.08484740555286407,
-0.029041361063718796,
0.09877979755401611,
0.09306182712316513,
0.01670823059976101,
-0.07032216340303421,
0.02267238311469555,
-0.015132802538573742,
0.04308578372001648,
0.05185545235872269,
0.05866782367229462,
0.051171738654375076,
0.013397013768553734,
0.07995917648077011,
-0.021832112222909927,
-0.10883201658725739,
-0.07627762109041214,
-0.0007472544093616307,
-0.18187397718429565,
-0.10432402044534683,
-0.049771662801504135,
-0.01029118336737156,
-0.01233588345348835,
-0.09905228763818741,
0.15374833345413208,
-0.017553523182868958,
-0.007142192218452692,
-0.03640138730406761,
0.0244279932230711,
-0.00263784546405077,
0.07767923176288605,
0.0749916359782219,
0.05732961371541023,
-0.07810699194669724,
0.04745937138795853,
0.04383133724331856,
0.030692942440509796,
0.05994619429111481,
0.058275140821933746,
-0.12347371876239777,
-0.07466848194599152,
-0.03600822389125824,
0.16115042567253113,
-0.027648698538541794,
-0.06729073822498322,
-0.017782246693968773,
-0.04896865785121918,
0.01037014089524746,
0.13494230806827545,
0.02800661511719227,
-0.03251216933131218,
-0.06761137396097183,
-0.030911069363355637,
-0.0677936002612114,
0.13201318681240082,
0.09837193787097931,
-0.07880041003227234,
-0.03334669768810272,
0.20221450924873352,
0.012127933092415333,
0.013614051043987274,
-0.045086584985256195,
-0.033304933458566666,
-0.04964343458414078,
0.007709507364779711,
-0.013823688961565495,
0.021929962560534477,
-0.02318708412349224,
-0.0035475697368383408,
-0.021323032677173615,
-0.04359040409326553,
0.025904593989253044,
0.040573809295892715,
-0.04394742101430893,
-0.014950752258300781,
-0.04043849930167198,
-0.00461324630305171,
-0.17007824778556824,
-0.04330640286207199,
0.01582960970699787,
-0.04025460407137871,
0.09870094060897827,
0.09930972009897232,
-0.03385261073708534,
0.04663071036338806,
-0.16543766856193542,
-0.0005533488001674414,
-0.04297315329313278,
0.05467195436358452,
0.01626605913043022,
-0.11264953762292862,
0.04675713926553726,
-0.0011367074912413955,
-0.015615587122738361,
0.002223028801381588,
0.0461505651473999,
-0.08436015993356705,
0.07059045135974884,
0.09744765609502792,
-0.03144824132323265,
-0.06344939768314362,
0.12967626750469208,
-0.003750324947759509,
0.05715453252196312,
0.10166046023368835,
-0.07902586460113525,
0.10192953795194626,
-0.11745120584964752,
0.0032922110985964537,
-0.003516922704875469,
0.024944715201854706,
-0.04441746324300766,
-0.05336763337254524,
0.07603225857019424,
-0.0509696789085865,
0.1267746537923813,
0.1396355926990509,
0.06835967302322388,
0.03572724759578705,
0.09209338575601578,
0.02625296451151371,
0.008786357939243317,
-0.004053747747093439,
-0.04846802353858948,
0.007821324281394482,
-0.0396839901804924,
-0.011130684986710548,
-0.016735658049583435,
-0.08161560446023941,
0.12411093711853027,
0.11816449463367462,
0.19154837727546692,
-0.018812449648976326,
0.1307239532470703,
0.0007347033824771643,
-0.0684603676199913,
-0.1719072312116623,
-0.01022987999022007,
0.002424754435196519,
-0.06761830300092697,
0.06097297742962837,
0.13205429911613464,
-0.18798208236694336,
0.14741341769695282,
0.06648004800081253,
-0.08597397059202194,
-0.14525949954986572,
-0.26716533303260803,
-0.01872164011001587,
0.04348340630531311,
-0.009869467467069626,
-0.13022582232952118,
0.040288399904966354,
-0.013480138964951038,
0.03347693383693695,
-0.033896494656801224,
0.13941234350204468,
-0.12697996199131012,
-0.08420394361019135,
0.10592938959598541,
0.02428426779806614,
0.09026411175727844,
0.052868571132421494,
-0.015928391367197037,
0.047451019287109375,
0.07449928671121597,
0.03324000537395477,
0.031149236485362053,
0.07745889574289322,
-0.007216159719973803,
-0.08518572151660919,
-0.060203369706869125,
-0.005976493936032057,
-0.025057468563318253,
0.056164782494306564,
0.17660272121429443,
0.06784895062446594,
-0.05630448833107948,
-0.05316867306828499,
0.21854712069034576,
-0.010683462023735046,
-0.19906888902187347,
-0.16101127862930298,
0.12473088502883911,
0.011976349167525768,
0.02890586294233799,
-0.007827438414096832,
-0.09791719168424606,
-0.009690008126199245,
0.19250202178955078,
0.2632492482662201,
-0.007152682635933161,
0.02043803595006466,
-0.04244085028767586,
0.014226890169084072,
0.06538593769073486,
0.13193731009960175,
0.0038074487820267677,
0.26397112011909485,
-0.04463070258498192,
0.1519574671983719,
-0.07033167034387589,
-0.01007002405822277,
-0.10297334939241409,
0.08917348086833954,
-0.021406592801213264,
-0.04731663689017296,
-0.023212451487779617,
0.1284061223268509,
-0.07056102156639099,
-0.15080507099628448,
-0.08067453652620316,
-0.02556862309575081,
-0.08146180957555771,
-0.016251899302005768,
0.016432665288448334,
0.06532527506351471,
0.07862955331802368,
-0.013329826295375824,
-0.010661343112587929,
0.12422021478414536,
0.05445946007966995,
-0.017076030373573303,
-0.10060760378837585,
0.07462741434574127,
-0.03119281493127346,
0.13918355107307434,
0.02403857372701168,
0.11745389550924301,
0.108676016330719,
-0.004748095292598009,
-0.0644744262099266,
0.08653824031352997,
0.009725014679133892,
-0.02181374840438366,
0.035512275993824005,
0.09389873594045639,
-0.016861924901604652,
-0.03243738412857056,
0.024957865476608276,
-0.09998960793018341,
0.04477723315358162,
-0.05339866131544113,
-0.013200603425502777,
-0.05035431683063507,
0.1752002090215683,
-0.10668835788965225,
0.07051892578601837,
0.17343416810035706,
-0.021808486431837082,
0.020264767110347748,
-0.09479854255914688,
0.0078991474583745,
-0.010569654405117035,
-0.04045476019382477,
0.007442878559231758,
-0.08940534293651581,
0.029727470129728317,
-0.02718343213200569,
0.033182110637426376,
-0.21690933406352997,
-0.03461021929979324,
-0.017063118517398834,
-0.038390446454286575,
0.007770803291350603,
0.08061080425977707,
0.017873430624604225,
0.022515857592225075,
0.014961742796003819,
0.01596059650182724,
0.02660326100885868,
0.07666446268558502,
-0.11266695708036423,
-0.005201475229114294
] |
null | null |
transformers
|
{"language": ["ar"], "widget": [{"text": "\u0623\u064a\u0646 \u064a\u0639\u064a\u0634 \u0645\u062d\u0645\u062f \u061f", "context": "\u0627\u0633\u0645\u064a \u0645\u062d\u0645\u062f \u0648\u0623\u0646\u0627 \u0623\u0639\u064a\u0634 \u0641\u064a \u0633\u0648\u0631\u064a\u0627"}, {"text": "\u0645\u0627 \u0627\u0644\u0639\u062f\u062f \u0627\u0644\u0630\u0631\u064a \u0644\u0644\u0647\u064a\u062f\u0631\u0648\u062c\u064a\u0646 \u061f", "context": "\u0627\u0644\u0647\u064a\u062f\u0631\u0648\u062c\u064a\u0646 \u0647\u0648 \u0639\u0646\u0635\u0631 \u0643\u064a\u0645\u064a\u0627\u0626\u064a \u0639\u062f\u062f\u0647 \u0627\u0644\u0630\u0631\u064a 1 \u060c \u0648\u0647\u0648 \u063a\u0627\u0632 \u0639\u062f\u064a\u0645 \u0627\u0644\u0631\u0627\u0626\u062d\u0629 \u0648\u0627\u0644\u0644\u0648\u0646 \u0648\u0647\u0648 \u0633\u0631\u064a\u0639 \u0627\u0644\u0627\u0634\u062a\u0639\u0627\u0644"}, {"text": "\u0645\u0627 \u062e\u0648\u0627\u0635 \u0627\u0644\u0647\u064a\u062f\u0631\u0648\u062c\u064a\u0646 \u061f", "context": "\u0627\u0644\u0647\u064a\u062f\u0631\u0648\u062c\u064a\u0646 \u0647\u0648 \u0639\u0646\u0635\u0631 \u0643\u064a\u0645\u064a\u0627\u0626\u064a \u0639\u062f\u062f\u0647 \u0627\u0644\u0630\u0631\u064a 1 \u060c \u0648\u0647\u0648 \u063a\u0627\u0632 \u0639\u062f\u064a\u0645 \u0627\u0644\u0631\u0627\u0626\u062d\u0629 \u0648\u0627\u0644\u0644\u0648\u0646 \u0648\u0647\u0648 \u0633\u0631\u064a\u0639 \u0627\u0644\u0627\u0634\u062a\u0639\u0627\u0644"}]}
|
question-answering
|
gfdgdfgdg/arap_qa_bert
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"ar",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ar"
] |
TAGS
#transformers #pytorch #bert #question-answering #ar #endpoints_compatible #region-us
|
[] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #ar #endpoints_compatible #region-us \n"
] |
[
31
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #ar #endpoints_compatible #region-us \n"
] |
[
-0.030265754088759422,
0.010732460767030716,
-0.010878984816372395,
-0.01969131827354431,
0.0892331451177597,
0.032318416982889175,
0.012980960309505463,
0.08407673984766006,
0.1387159675359726,
0.013008029200136662,
0.1812669038772583,
0.20441022515296936,
-0.06402835994958878,
-0.08011873066425323,
-0.10048899054527283,
-0.1932181864976883,
0.029427656903862953,
0.10281810164451599,
-0.028551265597343445,
0.13183332979679108,
0.0389116145670414,
-0.12724198400974274,
0.04960298910737038,
-0.028499040752649307,
-0.062056902796030045,
0.061686426401138306,
0.005677097011357546,
-0.056691691279411316,
0.13800804316997528,
0.02670894004404545,
0.15701253712177277,
0.030516501516103745,
-0.12119986861944199,
-0.18748502433300018,
0.042833056300878525,
-0.027449192479252815,
-0.04960258677601814,
0.033378105610609055,
0.04434696584939957,
-0.10011923313140869,
-0.021879475563764572,
0.08011641353368759,
-0.00028929157997481525,
0.06914238631725311,
-0.19499856233596802,
-0.16757184267044067,
-0.047578115016222,
0.012724582105875015,
0.06665462255477905,
0.09286129474639893,
-0.027939865365624428,
0.16851480305194855,
-0.1722615361213684,
0.08321268856525421,
0.17974163591861725,
-0.3041142523288727,
-0.010806231759488583,
0.07720574736595154,
0.08940701931715012,
0.058668408542871475,
-0.02991115301847458,
0.06976067274808884,
0.04453708603978157,
0.0258489940315485,
-0.10262240469455719,
-0.1082506775856018,
-0.008059707470238209,
0.10159242153167725,
-0.09026801586151123,
-0.10005219280719757,
0.2247323840856552,
0.0084203677251935,
0.04348478838801384,
0.054726287722587585,
-0.08744226396083832,
0.010493670590221882,
0.020544784143567085,
-0.04128469154238701,
-0.033998239785432816,
0.05069558694958687,
0.021028617396950722,
-0.0256302859634161,
-0.11751638352870941,
0.03720642626285553,
-0.24307237565517426,
0.23985505104064941,
0.028229495510458946,
0.08392640203237534,
-0.23795446753501892,
0.059965867549180984,
-0.06380350142717361,
-0.07125363498926163,
0.008054880425333977,
-0.07462143898010254,
0.012861871160566807,
-0.001266319421119988,
-0.06310902535915375,
0.06942304968833923,
0.046488206833601,
0.192921981215477,
-0.003347711404785514,
0.03481476753950119,
0.0008074123761616647,
0.09339588135480881,
0.045605480670928955,
0.11364837735891342,
-0.013303167186677456,
-0.019460508599877357,
-0.034760136157274246,
-0.13471150398254395,
-0.0321790985763073,
-0.025169333443045616,
-0.0936557874083519,
-0.06840261816978455,
-0.006717304699122906,
0.12153033912181854,
0.07511895149946213,
0.0005968245095573366,
-0.07402099668979645,
0.00801662728190422,
-0.023770062252879143,
-0.018389031291007996,
-0.020558642223477364,
-0.0033211736008524895,
0.025318915024399757,
0.19010931253433228,
-0.0671735629439354,
0.029874242842197418,
-0.024445287883281708,
0.09927012771368027,
-0.07421550899744034,
-0.02471090294420719,
-0.015379722230136395,
-0.0050741517916321754,
0.06770367175340652,
-0.11090958118438721,
0.08569681644439697,
-0.11784841120243073,
-0.02240363508462906,
0.011676348745822906,
0.03997310996055603,
-0.013029403053224087,
0.02146671526134014,
0.001328363548964262,
-0.030731718987226486,
-0.04105048254132271,
-0.05054766312241554,
-0.037550780922174454,
-0.05831539258360863,
0.11694476008415222,
0.036061134189367294,
0.038852185010910034,
-0.0684204250574112,
0.05764057859778404,
-0.07521387189626694,
0.05522884801030159,
-0.08510951697826385,
-0.056194230914115906,
0.0028085452504456043,
0.13906797766685486,
-0.020038018003106117,
-0.0832495242357254,
-0.12110785394906998,
0.029999716207385063,
-0.051538191735744476,
0.18802857398986816,
0.003924766089767218,
-0.06971246004104614,
0.18785996735095978,
-0.04053725674748421,
-0.22029899060726166,
0.08567017316818237,
0.008354786783456802,
0.0005871721659786999,
0.07001455128192902,
0.17452120780944824,
-0.029814448207616806,
-0.10119838267564774,
0.05573189631104469,
0.12360470741987228,
-0.12179794162511826,
-0.07209634780883789,
0.051772087812423706,
-0.058425575494766235,
-0.10922392457723618,
0.03244556486606598,
0.020402152091264725,
0.04946776106953621,
-0.09689421206712723,
-0.03142759948968887,
-0.0068218884989619255,
0.00690420949831605,
0.08172112703323364,
0.07183089852333069,
0.04813087359070778,
-0.07750872522592545,
0.0037955744192004204,
-0.044070176780223846,
-0.029598873108625412,
0.0642772763967514,
0.030829934403300285,
-0.0721491351723671,
0.13091625273227692,
-0.1329406201839447,
0.006097256205976009,
-0.20129512250423431,
-0.08477997779846191,
-0.038295671343803406,
0.11289667338132858,
-0.012009034864604473,
0.24311073124408722,
0.09922986477613449,
-0.14922268688678741,
-0.03667789697647095,
-0.038272079080343246,
0.10901355743408203,
0.005331917200237513,
-0.03828008845448494,
-0.04990466684103012,
0.05190521478652954,
-0.06672319024801254,
-0.0844746083021164,
-0.03996526077389717,
-0.02694571390748024,
0.09659332782030106,
0.1119917705655098,
-0.01383917685598135,
0.05886911600828171,
-0.01905723288655281,
0.043708667159080505,
0.006494789384305477,
0.047644320875406265,
0.10189919918775558,
-0.03992561623454094,
-0.08954176306724548,
0.11731910705566406,
-0.07435615360736847,
0.31246182322502136,
0.1785772293806076,
-0.33505767583847046,
0.015226609073579311,
-0.04747354984283447,
-0.05273459479212761,
0.019254783168435097,
0.08647263050079346,
-0.0036065008025616407,
0.1307036429643631,
0.043047063052654266,
0.08051610738039017,
-0.03981323540210724,
-0.0700676366686821,
-0.01668151654303074,
-0.06182865425944328,
-0.04725900664925575,
0.11798643320798874,
0.059151582419872284,
-0.1825719028711319,
0.15313096344470978,
0.32999593019485474,
0.06057567521929741,
0.060813236981630325,
-0.05443698167800903,
-0.04604943096637726,
0.003255011746659875,
0.03838532045483589,
-0.05943135544657707,
0.05712651461362839,
-0.25977805256843567,
0.00199604663066566,
0.079053595662117,
0.003387246746569872,
0.07156749814748764,
-0.14018185436725616,
-0.08185525238513947,
0.010541915893554688,
0.028614411130547523,
-0.08474498242139816,
0.11002561450004578,
0.040432024747133255,
0.09784051030874252,
0.03605939447879791,
-0.02555099129676819,
0.10211538523435593,
-0.0012189013650640845,
-0.06049465388059616,
0.14874647557735443,
-0.09618823230266571,
-0.23242782056331635,
-0.04348939657211304,
-0.10533711314201355,
0.022405356168746948,
-0.0008505629375576973,
0.07280859351158142,
-0.09476308524608612,
-0.014932564459741116,
0.11795134097337723,
0.043607406318187714,
-0.1988878697156906,
0.006222560536116362,
-0.04245009645819664,
0.039354581385850906,
-0.08971885591745377,
-0.043918512761592865,
-0.06771079450845718,
-0.07232888042926788,
-0.05938318744301796,
0.1161995604634285,
-0.11918485164642334,
0.0884602963924408,
0.11298263072967529,
0.05316503718495369,
0.06366011500358582,
-0.013537326827645302,
0.24501754343509674,
-0.13081596791744232,
-0.040086545050144196,
0.16313444077968597,
-0.05059009790420532,
0.1016722172498703,
0.14952105283737183,
0.02085714600980282,
-0.08078570663928986,
-0.003992443438619375,
-0.021557290107011795,
-0.0643337145447731,
-0.24765609204769135,
-0.05081895738840103,
-0.11993961036205292,
0.0471675731241703,
-0.013001360930502415,
0.030360974371433258,
0.08781139552593231,
0.07060801982879639,
0.026467688381671906,
-0.14536064863204956,
-0.05239599943161011,
0.049909621477127075,
0.2786836624145508,
-0.07072225958108902,
0.08587388694286346,
-0.0438765250146389,
-0.10832285135984421,
0.05285220965743065,
0.0957045927643776,
0.11414915323257446,
0.1329757422208786,
-0.021830666810274124,
0.08565659075975418,
0.14968682825565338,
0.13031035661697388,
0.07662323862314224,
-0.01995965652167797,
-0.06898587197065353,
-0.03205852955579758,
0.009874308481812477,
-0.06099778041243553,
0.03136274591088295,
0.16889850795269012,
-0.12477268278598785,
-0.02496335841715336,
-0.21889051795005798,
0.07434073835611343,
0.02787007763981819,
0.06832938641309738,
-0.07119375467300415,
0.021200718358159065,
0.07554805278778076,
-0.020424995571374893,
-0.04775257036089897,
0.09369093924760818,
0.011561849154531956,
-0.14056237041950226,
0.013665963895618916,
-0.03898501768708229,
0.13399246335029602,
0.036430858075618744,
0.09336203336715698,
-0.07486281543970108,
-0.16501735150814056,
0.05680273100733757,
0.09690701961517334,
-0.30149340629577637,
0.31059515476226807,
0.010359536856412888,
-0.10792414844036102,
-0.050884634256362915,
-0.05949290841817856,
-0.04284629970788956,
0.1563260555267334,
0.17583787441253662,
0.026117892935872078,
-0.02630811743438244,
-0.09380687773227692,
0.0894705280661583,
0.05128825828433037,
0.14872227609157562,
-0.023881735280156136,
-0.02145054005086422,
-0.01351048331707716,
0.022435387596488,
-0.028839649632573128,
0.08170489221811295,
0.0747084692120552,
-0.10587116330862045,
0.048416491597890854,
-0.03761989623308182,
0.028032803907990456,
0.0028034464921802282,
0.002952303271740675,
-0.06911856681108475,
0.08715836703777313,
-0.059216439723968506,
-0.04882529005408287,
-0.0803910493850708,
-0.1335350126028061,
0.1364559680223465,
-0.09649715572595596,
0.018383320420980453,
-0.0817805603146553,
-0.08808505535125732,
-0.06784391403198242,
-0.1107705757021904,
0.1353137493133545,
-0.09360960125923157,
-0.0017965197330340743,
-0.03612210601568222,
0.2080349326133728,
-0.06337655335664749,
0.030915113165974617,
-0.003014738205820322,
0.04298429563641548,
-0.1528601497411728,
-0.09310269355773926,
0.03213275969028473,
-0.10572486370801926,
0.08215171843767166,
0.04411076009273529,
-0.009832284413278103,
0.1230672150850296,
-0.010973644442856312,
0.013838304206728935,
0.20028087496757507,
0.2247980833053589,
-0.030844518914818764,
0.07532121241092682,
0.18322503566741943,
-0.012447801418602467,
-0.2534584701061249,
-0.05671033263206482,
-0.15976691246032715,
-0.06693962216377258,
-0.014061039313673973,
-0.08541005104780197,
0.12311645597219467,
0.015095772221684456,
-0.04032308608293533,
0.07873523235321045,
-0.23544128239154816,
-0.024537639692425728,
0.16659867763519287,
-0.014834189787507057,
0.5124616622924805,
-0.11620271950960159,
-0.08623047918081284,
0.040959589183330536,
-0.2624102830886841,
0.07377736270427704,
0.0574566088616848,
0.04131927713751793,
-0.031763602048158646,
0.11859012395143509,
0.03444729372859001,
-0.08403995633125305,
0.15117354691028595,
0.014555267058312893,
0.005763213150203228,
-0.06881403923034668,
-0.14858004450798035,
0.018048277124762535,
0.01751030422747135,
-0.034176334738731384,
0.05365443229675293,
0.03415665403008461,
-0.17050142586231232,
-0.01851031184196472,
-0.14233016967773438,
0.05533198267221451,
0.019037511199712753,
-0.05058813467621803,
-0.035286203026771545,
-0.028022490441799164,
-0.01725861057639122,
-0.003228020155802369,
0.26492834091186523,
-0.0802217572927475,
0.1827199012041092,
-0.033101264387369156,
0.142618790268898,
-0.19127565622329712,
-0.1142696812748909,
-0.056411053985357285,
-0.048513513058423996,
0.07785539329051971,
-0.04622156545519829,
0.0506584532558918,
0.20002079010009766,
-0.014251184649765491,
0.025091378018260002,
0.10018302500247955,
0.026631217449903488,
-0.0319981649518013,
0.08859943598508835,
-0.2098599076271057,
-0.14504097402095795,
-0.02262922376394272,
-0.03181775286793709,
0.0582452230155468,
0.06958203762769699,
0.06298734992742538,
0.10702449083328247,
-0.022869767621159554,
0.01834976114332676,
-0.04628573730587959,
-0.06269356608390808,
0.007382103241980076,
0.08452063798904419,
0.027878178283572197,
-0.09490314871072769,
0.04618149995803833,
-0.03250468522310257,
-0.26947301626205444,
-0.04107532277703285,
0.08344358950853348,
-0.09972304105758667,
-0.10146597772836685,
-0.12019510567188263,
0.04567532613873482,
-0.1723606139421463,
-0.019924147054553032,
-0.021154038608074188,
-0.10181645303964615,
0.06532518565654755,
0.22689121961593628,
0.09274356812238693,
0.0689229965209961,
0.0004695336683653295,
-0.03965720906853676,
0.059291329234838486,
-0.029259633272886276,
-0.03530724346637726,
-0.01613655872642994,
-0.04773028939962387,
-0.07621629536151886,
-0.014741151593625546,
0.1933479905128479,
-0.07654083520174026,
-0.0811837688088417,
-0.1704702228307724,
0.10004814714193344,
-0.16692093014717102,
-0.11141669750213623,
-0.11019419878721237,
-0.07831767201423645,
0.006717555224895477,
-0.14663997292518616,
-0.036034610122442245,
-0.04661248251795769,
-0.13421940803527832,
0.08983024954795837,
0.07421892136335373,
0.020829932764172554,
-0.065399169921875,
-0.0458947978913784,
0.18170221149921417,
-0.021341336891055107,
0.08964542299509048,
0.1595243513584137,
-0.10885878652334213,
0.1033012866973877,
-0.11332205682992935,
-0.14760173857212067,
0.06454158574342728,
0.01300297025591135,
0.06275387108325958,
0.04252802953124046,
-0.010916190221905708,
0.06779312342405319,
0.04814781993627548,
0.08210750669240952,
-0.04528769850730896,
-0.10657082498073578,
0.02162722498178482,
0.0434713177382946,
-0.19633866846561432,
-0.03430144861340523,
-0.11527635157108307,
0.1037411317229271,
0.0106779420748353,
0.08625813573598862,
0.02620532549917698,
0.12127649784088135,
-0.04906841740012169,
0.02246888540685177,
0.008049334399402142,
-0.15457327663898468,
0.04367813840508461,
-0.07016537338495255,
0.014598114416003227,
-0.020548447966575623,
0.24391618371009827,
-0.1158909797668457,
0.07687786966562271,
0.058750227093696594,
0.06640176475048065,
0.01483873836696148,
-0.010553347878158092,
0.19246207177639008,
0.09140567481517792,
-0.06447454541921616,
-0.08982618153095245,
0.08748381584882736,
-0.06168447062373161,
-0.07038632035255432,
0.13330231606960297,
0.15470999479293823,
0.09257658571004868,
0.062301941215991974,
-0.01946127414703369,
0.04995448514819145,
-0.04456133395433426,
-0.21884062886238098,
0.021623777225613594,
0.007748940959572792,
0.0031051281839609146,
0.1434488296508789,
0.16489888727664948,
-0.027313347905874252,
0.06134350597858429,
-0.06184956058859825,
-0.0040591005235910416,
-0.1437203586101532,
-0.06324648857116699,
-0.04883081093430519,
-0.07922633737325668,
0.051256198436021805,
-0.0900496169924736,
-0.007307039108127356,
0.1351541131734848,
0.0708327442407608,
-0.049850042909383774,
0.12652824819087982,
0.05687190964818001,
-0.07399445027112961,
0.0010629907483235002,
0.007077677641063929,
0.09778353571891785,
0.017508378252387047,
0.03194469213485718,
-0.12061523646116257,
-0.08668981492519379,
-0.05092676728963852,
0.034665994346141815,
-0.12251299619674683,
-0.06595161557197571,
-0.1435909867286682,
-0.09698748588562012,
-0.07396063208580017,
0.1164068877696991,
-0.04321637004613876,
0.15064182877540588,
-0.0428534671664238,
0.054437894374132156,
0.014613623730838299,
0.2298712283372879,
-0.08081936836242676,
-0.02636583335697651,
-0.02387198992073536,
0.20214638113975525,
0.019979367032647133,
0.10043425112962723,
-0.00577952666208148,
0.036350950598716736,
-0.04365318641066551,
0.3350553512573242,
0.20362575352191925,
-0.07507914304733276,
0.04003889486193657,
0.0717930942773819,
0.05460251495242119,
0.10835328698158264,
0.0038374466821551323,
0.10475356131792068,
0.2864741086959839,
-0.11600275337696075,
-0.02457473613321781,
-0.025194454938173294,
0.021124206483364105,
-0.014410230331122875,
0.06044585257768631,
0.06162530928850174,
-0.06241542100906372,
-0.07224934548139572,
0.11333917826414108,
-0.13848650455474854,
0.09355495870113373,
0.024487515911459923,
-0.20427294075489044,
-0.05166297405958176,
-0.03373146057128906,
0.16232433915138245,
-0.008052556775510311,
0.12163666635751724,
-0.029620882123708725,
-0.13144369423389435,
0.0346735417842865,
0.05105208605527878,
-0.22049202024936676,
-0.07538262754678726,
0.1501069813966751,
0.036995936185121536,
-0.010494294576346874,
0.0047204759903252125,
0.02235400676727295,
0.07767398655414581,
0.033771127462387085,
-0.027106307446956635,
0.0066222320310771465,
0.09390122443437576,
-0.10556098818778992,
-0.13232000172138214,
-0.022669216617941856,
0.07356417179107666,
-0.0794840157032013,
0.08360085636377335,
-0.17585806548595428,
0.056181181222200394,
-0.004916847217828035,
-0.03512437641620636,
-0.04042528569698334,
0.07331156730651855,
-0.06618745625019073,
0.014749105088412762,
0.06511851400136948,
0.009534141048789024,
-0.03747636079788208,
-0.03270456939935684,
-0.006050091236829758,
0.0430016852915287,
-0.06620367616415024,
-0.13295622169971466,
0.013161790557205677,
-0.06198456510901451,
0.09554710984230042,
-0.048341210931539536,
-0.08726024627685547,
-0.034863412380218506,
0.01272062212228775,
0.06732165813446045,
-0.071968674659729,
0.0029681145679205656,
0.028408054262399673,
0.05282159149646759,
0.022289037704467773,
-0.10096997767686844,
0.042800936847925186,
0.061436355113983154,
-0.11402799189090729,
-0.04492253065109253
] |
||
null | null |
transformers
|
# Family Guy (Peter) DialoGPT Model
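
The accompanying tags describe this as a GPT-2 based conversational checkpoint. Below is a minimal, hedged sketch of a multi-turn chat loop; only the model id is taken from this card, and it assumes the usual DialoGPT convention of EOS-separated turns — prompts and generation settings are illustrative.

```python
# Hedged usage sketch: assumes the checkpoint follows the standard DialoGPT
# chat format (turns separated by the tokenizer's EOS token). Only the model
# id comes from this card; prompts and generation settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gfdream/dialogpt-small-familyguy"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

history_ids = None
for user_text in ["Hey Peter, what's up?", "What do you think of Quahog?"]:
    new_ids = tokenizer.encode(user_text + tokenizer.eos_token, return_tensors="pt")
    # Append the new user turn to the running conversation.
    bot_input_ids = new_ids if history_ids is None else torch.cat([history_ids, new_ids], dim=-1)
    history_ids = model.generate(
        bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id
    )
    reply = tokenizer.decode(
        history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True
    )
    print("Bot:", reply)
```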
|
{"tags": ["conversational"]}
|
text-generation
|
gfdream/dialogpt-small-familyguy
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Family Guy (Peter) DialoGPT Model
|
[
"# Family Guy (Peter) DialoGPT Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Family Guy (Peter) DialoGPT Model"
] |
[
51,
11
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Family Guy (Peter) DialoGPT Model"
] |
[
-0.04568241909146309,
0.07128068059682846,
-0.006813106592744589,
0.02643396519124508,
0.14928379654884338,
0.011868867091834545,
0.18032290041446686,
0.11895003914833069,
-0.0066024139523506165,
-0.051358092576265335,
0.12811177968978882,
0.11248081177473068,
0.011692131869494915,
0.0660632848739624,
-0.06125461682677269,
-0.3329894542694092,
0.02110934630036354,
0.040993936359882355,
0.07302292436361313,
0.12429198622703552,
0.09556686878204346,
-0.025390395894646645,
0.08594679087400436,
0.016628975048661232,
-0.11253298074007034,
0.007848263718187809,
0.03832469508051872,
-0.0588536337018013,
0.14825396239757538,
0.05663406848907471,
0.04524945467710495,
0.06840270012617111,
-0.08160147070884705,
-0.09542624652385712,
0.03792042285203934,
-0.0075119598768651485,
-0.019114123657345772,
0.032106682658195496,
0.061321921646595,
-0.0402284637093544,
0.1646377444267273,
0.12034070491790771,
-0.01274141389876604,
0.05330919101834297,
-0.13959749042987823,
-0.010968983173370361,
-0.0026278160512447357,
0.06178499013185501,
0.03158025071024895,
0.09870732575654984,
-0.03739625960588455,
0.09427470713853836,
-0.03560812026262283,
0.10828626155853271,
0.12307155132293701,
-0.29711902141571045,
-0.04735712334513664,
0.14622271060943604,
0.07535107433795929,
0.06878192722797394,
-0.03239232301712036,
0.054440222680568695,
-0.02418561838567257,
0.018462367355823517,
-0.010026355274021626,
-0.09892996400594711,
-0.0021431983914226294,
0.013373373076319695,
-0.08214796334505081,
-0.0034656880889087915,
0.3055632710456848,
-0.015924111008644104,
0.06947623193264008,
-0.1252264529466629,
-0.081929512321949,
0.029604502022266388,
-0.038045015186071396,
-0.029293356463313103,
-0.08833248913288116,
0.055971287190914154,
-0.04107125848531723,
-0.1602044552564621,
-0.13733193278312683,
-0.014939903281629086,
-0.1313846856355667,
0.1288081407546997,
0.015942225232720375,
0.023381052538752556,
-0.21469323337078094,
0.0643974244594574,
-0.007613647263497114,
-0.09597019106149673,
0.033531803637742996,
-0.12175610661506653,
0.07289239764213562,
0.01325462106615305,
-0.007079008035361767,
-0.018987856805324554,
0.10094892978668213,
0.07404790073633194,
-0.009516260586678982,
0.03550117462873459,
-0.04696119204163551,
0.04473862051963806,
0.10374480485916138,
0.09310968220233917,
0.02142959088087082,
-0.10510145872831345,
0.024613363668322563,
-0.06035633757710457,
-0.042016685009002686,
-0.021362334489822388,
-0.1937330812215805,
0.010125335305929184,
0.056333042681217194,
0.04743250459432602,
0.06963606178760529,
0.15722613036632538,
0.021864265203475952,
-0.0550459660589695,
0.04212075099349022,
0.005849251989275217,
-0.021566784009337425,
0.01927463337779045,
-0.024757126346230507,
0.15432798862457275,
-0.08565818518400192,
0.0258162971585989,
-0.11589077860116959,
0.04841206222772598,
-0.007059062831103802,
-0.05463903024792671,
-0.005546971689909697,
-0.04138420894742012,
-0.006850733421742916,
-0.07391106337308884,
0.009221414104104042,
-0.13529494404792786,
-0.1323157548904419,
-0.042918331921100616,
-0.03777977451682091,
-0.03179891034960747,
-0.12571364641189575,
-0.11215784400701523,
-0.015189851634204388,
0.007180531043559313,
-0.06760305166244507,
-0.04263610765337944,
-0.032199110835790634,
0.062290992587804794,
-0.00008627580245956779,
0.05508219823241234,
-0.06679905205965042,
0.075970858335495,
-0.07390332967042923,
-0.0285191610455513,
-0.10617103427648544,
0.1151367798447609,
0.03948310762643814,
-0.009089023806154728,
0.01967182569205761,
0.008967559784650803,
-0.09982326626777649,
0.08486450463533401,
-0.07732079923152924,
0.2332923710346222,
-0.032535966485738754,
-0.11155488342046738,
0.2568138837814331,
-0.03361502289772034,
-0.122763492166996,
0.09157317876815796,
0.013610190711915493,
0.15593256056308746,
0.14760953187942505,
0.17090381681919098,
-0.037693772464990616,
0.02440481074154377,
0.07452176511287689,
0.12246181070804596,
-0.12759089469909668,
0.042205650359392166,
0.03515153005719185,
-0.015264422632753849,
-0.05409688130021095,
0.025555042549967766,
0.03319159895181656,
0.020736655220389366,
-0.046424731612205505,
-0.005689752288162708,
0.02813495323061943,
0.006824085023254156,
0.0660351812839508,
-0.04099167510867119,
0.09957035630941391,
-0.01921195164322853,
-0.02477431669831276,
0.02190077304840088,
0.00030851628980599344,
-0.031976137310266495,
0.016485022380948067,
-0.08979526907205582,
0.08617597818374634,
0.035845477133989334,
0.06859473884105682,
-0.12903612852096558,
-0.08129563182592392,
-0.049504805356264114,
0.2414541095495224,
0.004637350793927908,
0.08699628710746765,
0.07700935751199722,
-0.05701446905732155,
-0.04337095841765404,
-0.0049639479257166386,
0.18227604031562805,
0.007909785024821758,
-0.046491801738739014,
-0.10331512242555618,
0.071206234395504,
-0.024246254935860634,
0.12905067205429077,
-0.007513077929615974,
0.016498636454343796,
0.006739082746207714,
0.09537026286125183,
-0.003613489679992199,
0.04318523034453392,
0.005855530966073275,
-0.032385073602199554,
-0.02715635858476162,
-0.02754676714539528,
0.1309245526790619,
0.020329726859927177,
-0.03626583144068718,
0.23013146221637726,
-0.1493167281150818,
0.11805067211389542,
0.17000873386859894,
-0.25411468744277954,
0.0034833319950848818,
-0.12066271156072617,
-0.01655472256243229,
0.006272046826779842,
0.004744839388877153,
-0.06711757928133011,
0.22346988320350647,
0.02026444487273693,
0.15603701770305634,
-0.022577036172151566,
-0.0591680109500885,
-0.0014700685860589147,
-0.0031210784800350666,
0.003885905956849456,
0.07820097357034683,
0.05941187962889671,
-0.1843324452638626,
0.18787044286727905,
0.05252690985798836,
0.14413677155971527,
0.12503717839717865,
0.030394116416573524,
0.026773298159241676,
0.03921189904212952,
0.04572540521621704,
-0.07605739682912827,
-0.0449545755982399,
-0.298798143863678,
-0.03031616099178791,
0.047926899045705795,
0.025598006322979927,
0.09724307805299759,
-0.08573754876852036,
-0.04096454009413719,
0.00728962616994977,
-0.006225539371371269,
0.02265842817723751,
0.09586380422115326,
0.02364206127822399,
0.13118253648281097,
0.012172043323516846,
-0.050731562077999115,
0.08989502489566803,
-0.0038078906945884228,
-0.09075714647769928,
0.14540082216262817,
-0.12564347684383392,
-0.36607688665390015,
-0.05471867322921753,
-0.21597307920455933,
-0.10074432939291,
0.03470230847597122,
0.08786141872406006,
-0.08374461531639099,
0.02756822295486927,
0.001248796354047954,
0.1319686323404312,
-0.11804929375648499,
-0.04418554902076721,
-0.11960363388061523,
-0.0074213421903550625,
-0.15487775206565857,
-0.06502296775579453,
-0.05152492970228195,
-0.020515557378530502,
-0.06262077391147614,
0.16421794891357422,
-0.1438201665878296,
-0.04616941884160042,
0.15045931935310364,
0.06769891083240509,
0.04598670080304146,
-0.06382480263710022,
0.20673318207263947,
-0.07898607850074768,
0.04017949849367142,
0.20356032252311707,
-0.06755922734737396,
0.034657783806324005,
0.12760674953460693,
-0.00784143153578043,
-0.03714874014258385,
-0.0016361991874873638,
-0.018086479976773262,
-0.06651477515697479,
-0.1858845353126526,
-0.10852321237325668,
-0.1155439242720604,
0.14379489421844482,
0.02487352304160595,
0.07032874971628189,
0.1454961895942688,
0.05406079441308975,
-0.07041411101818085,
-0.026425771415233612,
0.068032406270504,
0.09348857402801514,
0.33077073097229004,
-0.13364246487617493,
0.10399708896875381,
0.010237005539238453,
-0.16451767086982727,
0.07318684458732605,
0.08328574895858765,
0.007769986521452665,
0.09470118582248688,
0.06323472410440445,
-0.010014249943196774,
0.02126879245042801,
0.07145637273788452,
0.07114461064338684,
-0.011608528904616833,
-0.0324813649058342,
-0.048183027654886246,
-0.06548638641834259,
-0.006001563277095556,
0.04757522791624069,
0.028076373040676117,
-0.19716498255729675,
-0.016175778582692146,
-0.07900100201368332,
0.03961117938160896,
0.11817455291748047,
0.06892511248588562,
-0.11850398778915405,
-0.01876681111752987,
0.04139894247055054,
0.0037639879155904055,
-0.14254151284694672,
0.06806641072034836,
0.08362583070993423,
-0.13418172299861908,
0.04237022250890732,
0.011473354883491993,
0.11654452234506607,
-0.07376348972320557,
0.0862775668501854,
-0.12145931273698807,
-0.06307044625282288,
-0.011010999791324139,
0.07074051350355148,
-0.28445982933044434,
0.13066887855529785,
0.007658238522708416,
-0.07483909279108047,
-0.09725124388933182,
-0.034506142139434814,
0.004641855135560036,
0.10886816680431366,
0.08475523442029953,
0.004518753848969936,
-0.024495543912053108,
-0.021372659131884575,
-0.02033262327313423,
0.03027799353003502,
0.08096153289079666,
-0.060335129499435425,
-0.014824442565441132,
-0.0411258190870285,
-0.0037531491834670305,
-0.011784324422478676,
-0.07324215024709702,
0.000874541699886322,
-0.13653841614723206,
0.06026044860482216,
0.0586770623922348,
0.0724010169506073,
0.019301626831293106,
-0.014358638785779476,
-0.09548383951187134,
0.2493322193622589,
-0.031638819724321365,
-0.1297009289264679,
-0.09617186337709427,
0.00303706806153059,
-0.02832738496363163,
-0.026582567021250725,
-0.03299374878406525,
-0.03043762408196926,
0.06686577200889587,
-0.06293074786663055,
-0.15658056735992432,
0.11377441883087158,
-0.06130264699459076,
-0.07123830914497375,
-0.04019678384065628,
0.28166019916534424,
-0.0032263651955872774,
0.0340765081346035,
0.013651642017066479,
-0.025258364155888557,
-0.07745428383350372,
-0.04873654246330261,
0.014514094218611717,
0.05993873253464699,
-0.01877259463071823,
0.0034677390940487385,
-0.013478604145348072,
-0.00201582838781178,
-0.07969941198825836,
-0.025199174880981445,
0.3593950867652893,
0.12268884479999542,
-0.027077769860625267,
0.11137715727090836,
0.09674451500177383,
-0.04542447626590729,
-0.2241845279932022,
-0.08366023749113083,
-0.09511856734752655,
-0.06389313191175461,
-0.02989656664431095,
-0.20473088324069977,
0.07486551254987717,
-0.029849737882614136,
0.008225379511713982,
0.11578071117401123,
-0.27621325850486755,
-0.06104455515742302,
0.13281282782554626,
-0.028735311701893806,
0.38852357864379883,
-0.06762492656707764,
-0.07757965475320816,
-0.0342140793800354,
-0.13420066237449646,
0.15801909565925598,
-0.021190520375967026,
0.09043892472982407,
-0.0037430047523230314,
0.1998313069343567,
0.05228489637374878,
0.02843249775469303,
0.02482479065656662,
-0.002936151111498475,
-0.11819259822368622,
-0.07510039210319519,
-0.14630888402462006,
0.013978456147015095,
0.011970101855695248,
-0.01669669710099697,
-0.07310786098241806,
0.004078243859112263,
-0.15783581137657166,
-0.06840484589338303,
-0.09236074984073639,
0.038110487163066864,
0.0033405409194529057,
-0.08332239836454391,
0.013717873953282833,
-0.035287752747535706,
-0.02468314953148365,
0.004911189898848534,
0.05467446893453598,
-0.1631103903055191,
0.13121379911899567,
0.08453860133886337,
0.13154301047325134,
-0.10655669122934341,
-0.05916270613670349,
-0.08168588578701019,
-0.06129181757569313,
0.048283424228429794,
-0.07350043207406998,
0.01753799431025982,
0.10574937611818314,
-0.03499109670519829,
0.10514356195926666,
0.08629204332828522,
-0.020805813372135162,
-0.009389268234372139,
0.05566616356372833,
-0.19699180126190186,
-0.06207571551203728,
-0.11287859827280045,
-0.0020245974883437157,
0.046720921993255615,
0.05955228582024574,
0.20710791647434235,
0.009292356669902802,
-0.0435279905796051,
0.008647297509014606,
0.006211298517882824,
-0.02770519256591797,
0.06266981363296509,
-0.05266141518950462,
0.009996828623116016,
-0.13415288925170898,
0.03021223470568657,
-0.029111823067069054,
-0.11157765984535217,
0.0004921008949168026,
0.1737080067396164,
-0.09376511722803116,
-0.11220841109752655,
-0.037217963486909866,
0.08285726606845856,
-0.06250970810651779,
-0.00243506976403296,
-0.010147497057914734,
-0.1697997748851776,
0.07395805418491364,
0.03258882462978363,
0.01261994894593954,
0.07675550878047943,
-0.06267207115888596,
0.005815907847136259,
-0.009756513871252537,
-0.0032381676137447357,
-0.034458428621292114,
-0.0077727059833705425,
-0.06755335628986359,
-0.0007080707582645118,
-0.01978767290711403,
0.1078522726893425,
-0.08256278187036514,
-0.13458453118801117,
-0.1651090681552887,
0.019122648984193802,
-0.06229422986507416,
-0.05957866087555885,
-0.0927964448928833,
-0.039588212966918945,
0.004338713362812996,
-0.012655277736485004,
-0.04281305521726608,
-0.0480627715587616,
-0.1093241274356842,
0.03132299706339836,
-0.018494172021746635,
0.023533422499895096,
-0.05016437545418739,
0.04537111520767212,
0.006609226576983929,
-0.031378306448459625,
0.17972144484519958,
0.13157150149345398,
-0.11301665008068085,
0.08040721714496613,
-0.10629932582378387,
-0.09723147749900818,
0.0723191648721695,
0.01696585863828659,
0.06557149440050125,
0.0685267522931099,
0.03448520228266716,
0.053533922880887985,
0.026578396558761597,
0.06084837391972542,
0.11769381910562515,
-0.08307404071092606,
0.03918959200382233,
-0.06643007695674896,
-0.10264945030212402,
-0.04061007499694824,
-0.0760655477643013,
0.019911136478185654,
0.027217477560043335,
0.134402334690094,
-0.04682677984237671,
0.10068463534116745,
-0.1059931069612503,
0.05684969201683998,
0.032625891268253326,
-0.18841783702373505,
0.056159403175115585,
-0.06950455904006958,
0.060006722807884216,
0.000987678300589323,
0.21895559132099152,
0.040600210428237915,
0.022666143253445625,
0.03710576891899109,
0.07068520039319992,
0.03600240871310234,
-0.02058577910065651,
0.17230874300003052,
0.1208457499742508,
-0.07654649764299393,
-0.1268928050994873,
0.09409017860889435,
-0.0026196956168860197,
0.029086310416460037,
0.16900068521499634,
-0.04603046178817749,
-0.005313795525580645,
0.04804443195462227,
0.09104093164205551,
0.04822062328457832,
-0.11009079217910767,
-0.16510643064975739,
-0.003144441870972514,
0.06172431632876396,
-0.089575856924057,
0.17919082939624786,
0.12148972600698471,
0.0026340417098253965,
0.04664146900177002,
-0.054324615746736526,
-0.023227766156196594,
-0.15472181141376495,
-0.2055574655532837,
-0.062334299087524414,
-0.14859288930892944,
-0.03047705441713333,
-0.1203370913863182,
0.004558430518954992,
-0.018898993730545044,
0.08228039741516113,
-0.07814856618642807,
0.030067896470427513,
0.05051160603761673,
-0.14114612340927124,
0.09025101363658905,
-0.007891073822975159,
0.0788787230849266,
-0.057171422988176346,
-0.016977448016405106,
-0.03423185274004936,
0.078585684299469,
0.03551948815584183,
0.013171409256756306,
-0.015726227313280106,
-0.025639845058321953,
-0.15487195551395416,
-0.08271637558937073,
-0.056325674057006836,
0.06768203526735306,
-0.02496480382978916,
0.13989678025245667,
-0.0009532095864415169,
-0.024050438776612282,
0.029726406559348106,
0.2065732479095459,
-0.036570075899362564,
-0.0915936604142189,
-0.044179003685712814,
0.21854016184806824,
-0.035210054367780685,
0.045180365443229675,
-0.02177785336971283,
0.027932971715927124,
-0.09362096339464188,
0.3342740833759308,
0.3995961546897888,
-0.04277603328227997,
0.0006326051079668105,
-0.015691205859184265,
0.04498426988720894,
0.09539860486984253,
0.1577805131673813,
0.07891067862510681,
0.2724004089832306,
-0.011473314836621284,
0.033670369535684586,
-0.0007881458732299507,
-0.004554320592433214,
0.009181192144751549,
0.02333390712738037,
0.1509162187576294,
-0.06347271054983139,
-0.020302575081586838,
0.11864937841892242,
-0.23282307386398315,
0.07781732082366943,
-0.1634245067834854,
-0.1465187668800354,
-0.07056880742311478,
-0.0026944628916680813,
0.12076111882925034,
0.056526102125644684,
0.09027507156133652,
0.018917618319392204,
-0.03223288431763649,
0.040956947952508926,
0.00037560812779702246,
-0.20102453231811523,
0.0037184949032962322,
0.09275401383638382,
-0.16350044310092926,
-0.04025238752365112,
-0.037946637719869614,
0.07643736153841019,
0.07727792114019394,
0.05529318377375603,
-0.02089175581932068,
0.04069335386157036,
-0.018187548965215683,
-0.10326635092496872,
0.03260327875614166,
0.05258208140730858,
0.009844050742685795,
-0.06683570146560669,
0.08634356409311295,
-0.1034168004989624,
0.06374965608119965,
0.04720254987478256,
-0.01597464084625244,
-0.040641169995069504,
0.09491080045700073,
-0.06184399500489235,
0.045183099806308746,
0.09200790524482727,
-0.02251311205327511,
-0.03767947107553482,
-0.06334494799375534,
-0.029574209824204445,
-0.01481111440807581,
-0.00316880876198411,
-0.0802488848567009,
-0.1716025173664093,
-0.11649645864963531,
0.05180656537413597,
-0.022560160607099533,
-0.18045249581336975,
0.01610768586397171,
-0.12828007340431213,
0.048580560833215714,
-0.09808717668056488,
0.0731147825717926,
0.031803030520677567,
0.05440966784954071,
0.004146524704992771,
-0.023666972294449806,
0.022669052705168724,
0.11111140996217728,
-0.11568079143762589,
-0.05699663981795311
] |
null | null |
transformers
|
# Harry Potter DialoGPT Model
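
A short, hedged single-turn generation sketch follows; only the model id comes from this card, while the prompt, length limit, and EOS-terminated prompt format are illustrative assumptions.

```python
# Hypothetical single-turn sketch; assumes the usual DialoGPT EOS-terminated
# prompt format. Prompt and generation arguments are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gfdream/dialogpt-small-harrypotter")
model = AutoModelForCausalLM.from_pretrained("gfdream/dialogpt-small-harrypotter")

prompt = "What house are you in?"  # illustrative prompt
input_ids = tok.encode(prompt + tok.eos_token, return_tensors="pt")
output_ids = model.generate(input_ids, max_length=200, pad_token_id=tok.eos_token_id)
print(tok.decode(output_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True))
```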
|
{"tags": ["conversational"]}
|
text-generation
|
gfdream/dialogpt-small-harrypotter
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Harry Potter DialoGPT Model
|
[
"# Harry Potter DialoGPT Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Harry Potter DialoGPT Model"
] |
[
51,
8
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT Model"
] |
[
-0.0009023238671943545,
0.07815738022327423,
-0.006546166725456715,
0.07792752981185913,
0.10655936598777771,
0.048972971737384796,
0.17639793455600739,
0.12185695022344589,
0.016568755730986595,
-0.04774167761206627,
0.11647630482912064,
0.2130284160375595,
-0.002118367003276944,
0.024608047679066658,
-0.05022026598453522,
-0.3065771162509918,
0.0474756620824337,
0.014356585219502449,
-0.07174845039844513,
0.11724270135164261,
0.09064973145723343,
-0.046179238706827164,
0.08330509811639786,
-0.009135239757597446,
-0.13198648393154144,
-0.039482954889535904,
0.019292812794446945,
-0.11745545268058777,
0.1662212759256363,
0.05298272892832756,
0.02469746209681034,
-0.008447164669632912,
-0.06598151475191116,
-0.15036040544509888,
0.037190426141023636,
-0.027472136542201042,
-0.01080626156181097,
0.05462246760725975,
0.023526115342974663,
-0.07521048933267593,
0.170567125082016,
0.17678891122341156,
0.0833497866988182,
0.0349111407995224,
-0.14917024970054626,
-0.045548245310783386,
0.008950977586209774,
0.05421316996216774,
-0.017893504351377487,
0.09349167346954346,
-0.019903047010302544,
0.11801653355360031,
-0.04491448402404785,
0.09210366010665894,
0.15255063772201538,
-0.4016275703907013,
-0.027563704177737236,
0.08920855820178986,
0.05989706888794899,
0.12076901644468307,
-0.10560955852270126,
0.03972794860601425,
-0.0039703017100691795,
0.01236654631793499,
-0.014540530741214752,
-0.08304883539676666,
-0.07308239489793777,
0.032504837960004807,
-0.1272556483745575,
0.008525865152478218,
0.23756256699562073,
-0.10643257945775986,
0.037069112062454224,
-0.09791990369558334,
-0.07414398342370987,
0.048336777836084366,
-0.053761593997478485,
-0.081727035343647,
-0.054839808493852615,
0.06347949057817459,
0.004366500303149223,
-0.06301609426736832,
-0.08326146006584167,
-0.0006536149303428829,
-0.12781435251235962,
0.17595994472503662,
0.061243366450071335,
0.041611745953559875,
-0.21322020888328552,
0.08940251916646957,
0.04477722570300102,
-0.04711297154426575,
0.007116159424185753,
-0.11796226352453232,
0.04023287072777748,
0.005483259446918964,
-0.03256071358919144,
-0.021854614838957787,
0.0393419973552227,
0.13909944891929626,
-0.01777748204767704,
0.03252175822854042,
0.006831915583461523,
0.05811219662427902,
0.08162496984004974,
0.02222144603729248,
0.019291909411549568,
-0.0818009302020073,
0.019385190680623055,
-0.08128736168146133,
-0.0030400939285755157,
-0.048940129578113556,
-0.17071883380413055,
-0.07477642595767975,
0.052610911428928375,
0.020047198981046677,
0.03746970370411873,
0.08054786175489426,
-0.0017944995779544115,
-0.05560554191470146,
0.03284840285778046,
0.01671096310019493,
-0.020622212439775467,
-0.010361049324274063,
-0.02412462793290615,
0.19123271107673645,
0.019619356840848923,
0.014111656695604324,
-0.12379156798124313,
0.10023640841245651,
-0.08179095387458801,
0.0037731381598860025,
0.02743307314813137,
-0.04204464703798294,
-0.004716555587947369,
0.02917117439210415,
0.023101668804883957,
-0.1252521574497223,
-0.1099385917186737,
-0.0030569476075470448,
-0.012054097838699818,
-0.036421261727809906,
-0.10490952432155609,
-0.08483029156923294,
-0.012153145857155323,
0.0449371263384819,
-0.013397793285548687,
0.007936403155326843,
-0.05143149942159653,
0.0985720232129097,
-0.0514979362487793,
0.09873400628566742,
-0.08342572301626205,
0.06359215080738068,
-0.09124887734651566,
-0.061886150389909744,
-0.11452563107013702,
0.05216052383184433,
0.012905281968414783,
0.066250741481781,
0.016998225823044777,
-0.044836658984422684,
-0.014836243353784084,
0.05253177136182785,
-0.07656687498092651,
0.1940697431564331,
-0.041674621403217316,
-0.12459053844213486,
0.24146439135074615,
-0.09138800948858261,
-0.1802034229040146,
0.12973085045814514,
-0.022254703566432,
0.08523941785097122,
0.12802475690841675,
0.20380465686321259,
-0.00019822151807602495,
-0.01302915159612894,
0.07281201332807541,
0.07031642645597458,
-0.09803894907236099,
0.06239739805459976,
0.029653839766979218,
-0.008071083575487137,
-0.08906278014183044,
0.05762826278805733,
0.046033453196287155,
-0.010650773532688618,
-0.035073768347501755,
-0.001896020956337452,
-0.012895751744508743,
-0.022185025736689568,
0.14126582443714142,
-0.02006692811846733,
0.1300428807735443,
-0.06926563382148743,
-0.03515486419200897,
-0.009500149637460709,
0.03533667325973511,
-0.04091939330101013,
0.08151165395975113,
-0.0436173714697361,
0.10586477071046829,
0.09034156054258347,
0.053724925965070724,
-0.13120363652706146,
0.00466286763548851,
-0.015246815048158169,
0.17014820873737335,
0.08964069187641144,
0.05222717300057411,
0.06265474855899811,
-0.0020888058934360743,
-0.06708643585443497,
0.045407816767692566,
0.13778303563594818,
-0.037020038813352585,
-0.12218865007162094,
-0.1755627691745758,
0.051157694309949875,
-0.045444171875715256,
0.10855234414339066,
-0.10010123997926712,
0.022670533508062363,
-0.055906031280756,
0.07772238552570343,
-0.024998966604471207,
0.020512236282229424,
-0.0013405600329861045,
-0.021700702607631683,
-0.08356887847185135,
-0.002377772703766823,
0.08597290515899658,
-0.02048647589981556,
-0.06707409024238586,
0.16556480526924133,
-0.16400809586048126,
0.1631954461336136,
0.2116095870733261,
-0.28542569279670715,
-0.005696662236005068,
-0.15163889527320862,
-0.0208092350512743,
0.019645055755972862,
0.07834604382514954,
0.026225795969367027,
0.2044338881969452,
-0.012928472831845284,
0.16565458476543427,
-0.05699567869305611,
-0.07730039209127426,
-0.06881127506494522,
-0.048101142048835754,
0.013522743247449398,
0.09095205366611481,
0.04542696103453636,
-0.11962861567735672,
0.13119758665561676,
0.1054433062672615,
0.06484298408031464,
0.12711186707019806,
0.1030748188495636,
-0.008113685995340347,
0.07252490520477295,
-0.03624548763036728,
-0.03462279960513115,
-0.09254947304725647,
-0.30446043610572815,
-0.04840317741036415,
0.0939924493432045,
0.007963384501636028,
0.09285714477300644,
-0.0919896736741066,
-0.03311870992183685,
0.006042704917490482,
0.009473444893956184,
0.028337622061371803,
0.09653715789318085,
0.013490920886397362,
0.15320514142513275,
-0.008011690340936184,
-0.03430786728858948,
0.05891305208206177,
0.017982570454478264,
-0.09147711098194122,
0.17280617356300354,
-0.17050009965896606,
-0.27190929651260376,
-0.06990014761686325,
-0.21745692193508148,
-0.013139115646481514,
0.05258983001112938,
0.0786920040845871,
-0.11818131804466248,
-0.018352627754211426,
-0.006239492911845446,
0.05685517191886902,
-0.2425733357667923,
0.0004911290016025305,
-0.1354890614748001,
0.0501418262720108,
-0.1974833607673645,
-0.09718500077724457,
-0.02271542325615883,
-0.013450481928884983,
-0.0464281290769577,
0.13365240395069122,
-0.1448695808649063,
-0.011572926305234432,
0.2329535037279129,
0.032479673624038696,
0.027794739231467247,
-0.05020907148718834,
0.19788463413715363,
-0.0958966314792633,
-0.023973820731043816,
0.11024576425552368,
-0.05038975924253464,
0.04834126681089401,
0.06649978458881378,
-0.012981836684048176,
-0.08557141572237015,
0.023789849132299423,
-0.068336620926857,
-0.03150583803653717,
-0.27926525473594666,
-0.0930178239941597,
-0.09319330751895905,
0.11305391043424606,
0.04079577326774597,
0.06421639025211334,
0.16545771062374115,
0.05191578343510628,
-0.024325082078576088,
-0.03006586618721485,
0.11609793454408646,
0.12905290722846985,
0.2277202159166336,
-0.06067761778831482,
0.10221996158361435,
0.009445492178201675,
-0.08203992247581482,
0.06062209978699684,
0.056782789528369904,
0.06324724853038788,
0.02584579586982727,
0.03694582358002663,
-0.030939655378460884,
0.1121687963604927,
0.12571842968463898,
0.05258069559931755,
0.0481170229613781,
0.0002127334737451747,
-0.0561506561934948,
-0.008168719708919525,
-0.05726633965969086,
0.06774696707725525,
0.061340972781181335,
-0.12918008863925934,
-0.08061543852090836,
0.0011613310780376196,
0.06660808622837067,
-0.016230419278144836,
0.06823775917291641,
-0.13560809195041656,
-0.03582429885864258,
0.0790911465883255,
-0.07693151384592056,
-0.14156894385814667,
0.11972879618406296,
-0.026570770889520645,
-0.19904157519340515,
0.05265914276242256,
0.007704653777182102,
0.0908159390091896,
-0.06360849738121033,
0.05343840271234512,
-0.13023801147937775,
-0.12935101985931396,
-0.018437571823596954,
0.07945099472999573,
-0.3450873792171478,
0.13536721467971802,
-0.013286802917718887,
-0.02876877970993519,
-0.06474969536066055,
-0.02640824392437935,
0.013905409723520279,
0.12719078361988068,
0.08667250722646713,
0.0008821099763736129,
0.0991629809141159,
0.03823768347501755,
0.04188435152173042,
-0.002011700300499797,
0.10950417071580887,
0.0050011589191854,
0.004797275178134441,
-0.04982118681073189,
0.007274609990417957,
-0.05164213851094246,
-0.07472953200340271,
0.08393982797861099,
-0.20678792893886566,
0.09087453782558441,
-0.03378438204526901,
0.08427679538726807,
0.04304937273263931,
-0.018965769559144974,
-0.1001204177737236,
0.19745583832263947,
-0.012206900864839554,
-0.11405988782644272,
-0.07517550885677338,
-0.02810264565050602,
0.09103139489889145,
-0.013817726634442806,
0.012886416167020798,
-0.045470476150512695,
0.032183047384023666,
-0.1263762265443802,
-0.1597503274679184,
0.08734500408172607,
-0.04441224783658981,
-0.10894393920898438,
-0.025462759658694267,
0.20382575690746307,
-0.007266622502356768,
0.08242089301347733,
0.01605331338942051,
0.010653935372829437,
-0.18066231906414032,
-0.04018142446875572,
0.02645772136747837,
-0.0016437612939625978,
0.005979063920676708,
0.047698814421892166,
0.019091911613941193,
0.06207629665732384,
-0.1069745197892189,
-0.013920160941779613,
0.3158324360847473,
0.15978319942951202,
-0.00912671908736229,
0.14943915605545044,
0.1093616932630539,
-0.08669080585241318,
-0.17238758504390717,
-0.1171615794301033,
-0.1210922971367836,
-0.08425768464803696,
-0.10681738704442978,
-0.1525043100118637,
0.09535340964794159,
-0.03392014652490616,
0.03498011827468872,
0.14615866541862488,
-0.280263751745224,
-0.10949636250734329,
0.13820378482341766,
0.010744688101112843,
0.3510635495185852,
-0.12303631007671356,
-0.044944874942302704,
-0.06214528530836105,
-0.16933435201644897,
0.08021392673254013,
-0.031203703954815865,
0.11581093072891235,
-0.0744495838880539,
0.19395925104618073,
0.01719796098768711,
0.014287159778177738,
0.0916559100151062,
0.05038322135806084,
-0.05808406323194504,
-0.07368700206279755,
-0.10248131304979324,
0.010812131687998772,
0.03546109423041344,
0.010252019390463829,
-0.008802837692201138,
0.0211968794465065,
-0.11341743916273117,
-0.050869911909103394,
-0.06302189081907272,
0.0072614275850355625,
-0.01001308299601078,
-0.042155615985393524,
-0.05533592775464058,
-0.022557416930794716,
-0.020093943923711777,
0.02266426384449005,
0.14185629785060883,
-0.07527699321508408,
0.18586260080337524,
0.02357078716158867,
0.1586609035730362,
-0.11956068128347397,
-0.06724818795919418,
-0.029193658381700516,
-0.05280323326587677,
0.06468886137008667,
-0.08884575963020325,
-0.027708567678928375,
0.1332162618637085,
-0.01903904788196087,
0.04655366763472557,
0.12936700880527496,
0.02046884410083294,
0.015383756719529629,
0.034968774765729904,
-0.2578005790710449,
-0.07463036477565765,
-0.03505445644259453,
-0.012416874058544636,
0.05272092670202255,
0.05525677278637886,
0.19735674560070038,
-0.03551921248435974,
-0.08521962910890579,
0.020131373777985573,
0.02735883742570877,
-0.02776256389915943,
0.10749414563179016,
0.019579345360398293,
-0.004837906453758478,
-0.16151933372020721,
0.08257976174354553,
-0.005964108742773533,
-0.08297000825405121,
0.028665626421570778,
0.2024049311876297,
-0.12141239643096924,
-0.10309756547212601,
-0.06804922968149185,
0.07315051555633545,
-0.09220825880765915,
0.016043387353420258,
-0.005091092549264431,
-0.1521538347005844,
0.06916408240795135,
0.07598215341567993,
0.04075418785214424,
0.06513199955224991,
-0.11743064224720001,
-0.015730571001768112,
-0.04170290008187294,
-0.002195435343310237,
0.03521120920777321,
0.01863143965601921,
-0.057492829859256744,
0.15846455097198486,
-0.0676199421286583,
0.08538917452096939,
-0.0744810476899147,
-0.1058846190571785,
-0.1395980566740036,
0.04660497233271599,
-0.08038312196731567,
-0.07247276604175568,
-0.12832807004451752,
-0.052204377949237823,
-0.0067099276930093765,
-0.03388519585132599,
0.006552806124091148,
-0.06627799570560455,
-0.10922821611166,
0.01822470687329769,
-0.00743203004822135,
-0.009385870769619942,
-0.06096754968166351,
0.026706209406256676,
0.06246216222643852,
-0.039788868278265,
0.15730851888656616,
0.22509248554706573,
-0.13591648638248444,
0.11564400047063828,
-0.09797432273626328,
-0.105463907122612,
0.046008042991161346,
0.009427277371287346,
0.03594303876161575,
0.0503489226102829,
-0.03594081476330757,
0.0044484552927315235,
0.03905477747321129,
0.08074651658535004,
0.08456914126873016,
-0.06776505708694458,
0.020801106467843056,
-0.05122765153646469,
-0.14904099702835083,
-0.016655439510941505,
-0.0464773029088974,
0.06876829266548157,
-0.006725262850522995,
0.11020535975694656,
-0.0515950471162796,
0.07739507406949997,
-0.07558431476354599,
0.050614211708307266,
0.021146971732378006,
-0.14688286185264587,
-0.006612539757043123,
-0.07093682140111923,
0.042144812643527985,
-0.008834975771605968,
0.20241086184978485,
-0.03228091076016426,
0.010342049412429333,
0.033811055123806,
0.06203942745923996,
-0.01957780309021473,
0.009357001632452011,
0.2014283686876297,
0.12640917301177979,
-0.08496357500553131,
-0.02679651789367199,
0.06793134659528732,
0.07248228788375854,
0.07093550264835358,
0.10807815194129944,
-0.015352966263890266,
0.028434239327907562,
0.07829629629850388,
-0.060215238481760025,
0.07576877623796463,
-0.08603982627391815,
-0.11668483167886734,
0.05793621391057968,
0.012955795042216778,
-0.055695828050374985,
0.20305177569389343,
0.19142870604991913,
-0.026278704404830933,
0.018410727381706238,
-0.0029499190859496593,
-0.10117456316947937,
-0.15619947016239166,
-0.05423750728368759,
-0.07170962542295456,
-0.1319410353899002,
-0.004549739416688681,
-0.16646917164325714,
0.022016216069459915,
-0.01132756657898426,
0.09506805986166,
-0.06855440139770508,
-0.01345991250127554,
0.1364889293909073,
-0.1055467277765274,
0.0847758799791336,
-0.024517204612493515,
0.07877567410469055,
-0.03746940940618515,
-0.018209461122751236,
-0.10342709720134735,
0.007514837197959423,
0.01131442841142416,
0.06840907037258148,
-0.10897937417030334,
0.02432350255548954,
-0.12208317965269089,
-0.08617185056209564,
-0.026142612099647522,
0.09279687702655792,
-0.0403008833527565,
0.15116846561431885,
0.02645145356655121,
-0.06710928678512573,
-0.004313822835683823,
0.2646709978580475,
-0.08046227693557739,
-0.08319197595119476,
-0.030799202620983124,
0.2152107208967209,
0.04053696244955063,
0.06396269053220749,
0.019140036776661873,
0.038027774542570114,
-0.07184682041406631,
0.2957373559474945,
0.34401440620422363,
-0.1318037211894989,
-0.007773484103381634,
0.04225075617432594,
0.04406323283910751,
0.14687567949295044,
0.07998795062303543,
0.11360671371221542,
0.2849363386631012,
-0.09197647124528885,
0.016657205298542976,
-0.04230864346027374,
-0.01424806285649538,
-0.06908884644508362,
0.045314885675907135,
0.08216670155525208,
-0.09241747111082077,
-0.022950593382120132,
0.08125471323728561,
-0.29741767048835754,
0.10791494697332382,
-0.15600289404392242,
-0.14948409795761108,
-0.05027429759502411,
-0.008771711029112339,
0.014683255925774574,
0.019041186198592186,
0.09663030505180359,
0.025651484727859497,
-0.07275258749723434,
0.07816889137029648,
0.024486342445015907,
-0.23020237684249878,
-0.01345184724777937,
0.1456068754196167,
-0.06789913028478622,
-0.025938833132386208,
-0.021313713863492012,
0.051610056310892105,
0.05763651058077812,
0.09027529507875443,
-0.03809558227658272,
-0.0746568813920021,
-0.007141788024455309,
-0.022818787023425102,
0.01914946548640728,
0.0597183033823967,
0.06841408461332321,
-0.0920223817229271,
0.1167774423956871,
-0.07350476831197739,
0.0650370642542839,
0.037623800337314606,
-0.022277191281318665,
0.0018526542698964477,
0.013183658011257648,
-0.06512464582920074,
0.05533479526638985,
0.1295643299818039,
-0.025459708645939827,
-0.002524374984204769,
-0.028180841356515884,
-0.0767761766910553,
-0.024015206843614578,
-0.04643676429986954,
-0.09101243317127228,
-0.18130090832710266,
-0.12738600373268127,
0.041754670441150665,
-0.03240608796477318,
-0.2046082615852356,
0.0060346988029778,
-0.1128578633069992,
0.03700976446270943,
-0.14154092967510223,
0.10004086047410965,
0.07216610759496689,
0.004716616589576006,
0.006774604320526123,
0.0675399899482727,
0.045677728950977325,
0.14796748757362366,
-0.16543124616146088,
-0.04919974133372307
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-herblabels
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4823
- Rouge1: 3.0759
- Rouge2: 1.0495
- Rougel: 3.0758
- Rougelsum: 3.0431
- Gen Len: 18.9716
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
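As a rough illustration only, these settings might be expressed with `Seq2SeqTrainingArguments` as in the sketch below; the output directory, evaluation strategy, and `predict_with_generate` flag are assumptions not stated in this card, and the dataset/Trainer wiring is omitted.
```python
# Hedged sketch: one way the hyperparameters listed above could map onto
# transformers' Seq2SeqTrainingArguments. Adam betas/epsilon and the linear
# scheduler are library defaults matching the list; flagged values are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-herblabels",  # assumed output name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                    # "mixed_precision_training: Native AMP"
    evaluation_strategy="epoch",  # assumption; not stated in the card
    predict_with_generate=True,   # assumption; needed for ROUGE-style eval
)
```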
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log | 1.0 | 264 | 1.6010 | 2.4276 | 0.5658 | 2.3546 | 2.3099 | 18.9091 |
| 2.5052 | 2.0 | 528 | 1.0237 | 2.9016 | 0.3395 | 2.8221 | 2.783 | 18.9673 |
| 2.5052 | 3.0 | 792 | 0.7793 | 2.962 | 0.3091 | 2.9375 | 2.8786 | 18.9588 |
| 1.1552 | 4.0 | 1056 | 0.6530 | 2.98 | 0.4375 | 2.9584 | 2.8711 | 18.9588 |
| 1.1552 | 5.0 | 1320 | 0.5863 | 3.0023 | 0.5882 | 2.987 | 2.9155 | 18.9588 |
| 0.8659 | 6.0 | 1584 | 0.5428 | 3.0576 | 0.8019 | 3.0494 | 2.9989 | 18.9716 |
| 0.8659 | 7.0 | 1848 | 0.5145 | 3.0808 | 0.9476 | 3.0719 | 3.0237 | 18.9716 |
| 0.747 | 8.0 | 2112 | 0.4962 | 3.0748 | 1.0032 | 3.0683 | 3.0359 | 18.9716 |
| 0.747 | 9.0 | 2376 | 0.4856 | 3.0702 | 1.0196 | 3.0665 | 3.0328 | 18.9716 |
| 0.6987 | 10.0 | 2640 | 0.4823 | 3.0759 | 1.0495 | 3.0758 | 3.0431 | 18.9716 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0
- Datasets 1.16.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "model-index": [{"name": "t5-small-herblabels", "results": []}]}
|
text2text-generation
|
ggosline/t5-small-herblabels
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
t5-small-herblabels
===================
This model is a fine-tuned version of t5-small on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4823
* Rouge1: 3.0759
* Rouge2: 1.0495
* Rougel: 3.0758
* Rougelsum: 3.0431
* Gen Len: 18.9716
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 10
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.5
* Pytorch 1.10.0
* Datasets 1.16.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
63,
113,
4,
30
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.5\n* Pytorch 1.10.0\n* Datasets 1.16.1\n* Tokenizers 0.10.3"
] |
[
-0.068316750228405,
0.036866139620542526,
-0.0023741440381854773,
0.10314052551984787,
0.15732356905937195,
0.01711984910070896,
0.12500165402889252,
0.1333978772163391,
-0.12591764330863953,
0.012641599401831627,
0.12266108393669128,
0.1498635709285736,
0.02379315346479416,
0.12083284556865692,
-0.051797375082969666,
-0.28156164288520813,
0.008380892686545849,
0.03186379373073578,
-0.06026096269488335,
0.14316794276237488,
0.1046648919582367,
-0.10579856485128403,
0.090302973985672,
-0.0018888229969888926,
-0.17933706939220428,
0.03278373181819916,
0.009834537282586098,
-0.06278075277805328,
0.15121443569660187,
0.04684649407863617,
0.09638538956642151,
0.005005626939237118,
0.07766216993331909,
-0.2173743098974228,
0.007767168805003166,
0.0653935894370079,
0.005415632389485836,
0.07981478422880173,
0.07680195569992065,
0.003149508498609066,
0.17182405292987823,
-0.0729377418756485,
0.0477585606276989,
0.030970631167292595,
-0.1269911378622055,
-0.2170386165380478,
-0.07846983522176743,
0.023129824548959732,
0.0846480056643486,
0.12257606536149979,
-0.018642812967300415,
0.11476688832044601,
-0.08554435521364212,
0.10574869811534882,
0.23365841805934906,
-0.28379756212234497,
-0.058942507952451706,
-0.0034703656565397978,
0.035407960414886475,
0.08807476609945297,
-0.08335114270448685,
-0.024912798777222633,
0.047421980649232864,
0.05255784094333649,
0.1268702745437622,
-0.03227611258625984,
-0.12170634418725967,
0.013370285741984844,
-0.14066579937934875,
-0.04238172620534897,
0.16744354367256165,
0.04345584288239479,
-0.026006972417235374,
-0.044607751071453094,
-0.0725088119506836,
-0.15214857459068298,
-0.030052995309233665,
-0.011055952869355679,
0.0465635247528553,
-0.008711609058082104,
-0.0660848468542099,
-0.017056304961442947,
-0.10351221263408661,
-0.06295539438724518,
-0.06721502542495728,
0.1127055287361145,
0.040705494582653046,
-0.0037027124781161547,
-0.042786773294210434,
0.10371556133031845,
-0.01016214955598116,
-0.12647539377212524,
0.009630464017391205,
0.03032574988901615,
0.0007271835347637534,
-0.03838340565562248,
-0.07658299803733826,
-0.06186530739068985,
0.011617383919656277,
0.1447540521621704,
-0.05855102092027664,
0.055843330919742584,
0.0018141636392101645,
0.033848464488983154,
-0.09879909455776215,
0.17506720125675201,
-0.03918711468577385,
-0.027735887095332146,
0.006895946338772774,
0.034907035529613495,
0.02254570834338665,
-0.0146830715239048,
-0.11199118196964264,
-0.0036511486396193504,
0.11316092312335968,
0.02583734504878521,
-0.06208706647157669,
0.07487967610359192,
-0.04060745984315872,
-0.019947540014982224,
-0.020989863201975822,
-0.0985463410615921,
0.02220267616212368,
-0.013649904169142246,
-0.07904116809368134,
0.004683977458626032,
0.029943076893687248,
0.020378775894641876,
-0.049237512052059174,
0.10127885639667511,
-0.07969756424427032,
0.03046436607837677,
-0.09512832015752792,
-0.11750296503305435,
0.023964667692780495,
-0.03861052542924881,
0.01872732676565647,
-0.10576947033405304,
-0.19547690451145172,
-0.01639525778591633,
0.05423520505428314,
-0.03227287530899048,
-0.0595494769513607,
-0.06360136717557907,
-0.07278469204902649,
0.025281371548771858,
-0.03334656357765198,
0.15414272248744965,
-0.06999005377292633,
0.10102871060371399,
0.03658011555671692,
0.05076555907726288,
-0.053993239998817444,
0.06594225019216537,
-0.10742467641830444,
0.007080479990690947,
-0.15538200736045837,
0.060148611664772034,
-0.023040488362312317,
0.07142344862222672,
-0.10583318024873734,
-0.10820576548576355,
0.0038107018917798996,
0.00909456517547369,
0.08310435712337494,
0.10019732266664505,
-0.1570938378572464,
-0.07976414263248444,
0.1713338941335678,
-0.06862694770097733,
-0.13502712547779083,
0.12238100916147232,
-0.055447064340114594,
0.05759327858686447,
0.07378512620925903,
0.1721983253955841,
0.049052994698286057,
-0.06720253825187683,
0.03217598795890808,
-0.009405745193362236,
0.05421306565403938,
-0.035595618188381195,
0.07096225768327713,
-0.0005649121012538671,
-0.011635320261120796,
0.024413693696260452,
-0.010198093950748444,
0.07526212185621262,
-0.09069357067346573,
-0.08089101314544678,
-0.05234038084745407,
-0.0775722861289978,
0.014580721966922283,
0.049343954771757126,
0.07613932341337204,
-0.11165710538625717,
-0.09266914427280426,
0.06577353924512863,
0.08003823459148407,
-0.07674482464790344,
0.04893646016716957,
-0.0547000952064991,
0.04292643815279007,
-0.01719985529780388,
0.00040246229036711156,
-0.18517981469631195,
-0.017147723585367203,
0.009867270477116108,
0.0018371858168393373,
0.03936924412846565,
-0.0035376008599996567,
0.06602221727371216,
0.051397114992141724,
-0.06297373026609421,
-0.027047306299209595,
-0.026486419141292572,
-0.005074294749647379,
-0.12048648297786713,
-0.19349487125873566,
-0.017314152792096138,
-0.0033647511154413223,
0.13823185861110687,
-0.21445639431476593,
0.033938102424144745,
-0.021923137828707695,
0.08132350444793701,
0.008440306410193443,
0.0042942906729876995,
-0.05075521022081375,
0.08184785395860672,
-0.05475978925824165,
-0.049287475645542145,
0.07003067433834076,
0.007641898933798075,
-0.10096537321805954,
-0.011816218495368958,
-0.13372136652469635,
0.1440088450908661,
0.13655820488929749,
-0.13903185725212097,
-0.077353835105896,
-0.0029588432516902685,
-0.05135149881243706,
-0.040294915437698364,
-0.03809163346886635,
0.007420740090310574,
0.17948824167251587,
-0.010194484144449234,
0.16448624432086945,
-0.07554460316896439,
-0.04282403737306595,
0.015498630702495575,
-0.030884992331266403,
0.036809686571359634,
0.12743952870368958,
0.10065554827451706,
-0.0908644050359726,
0.13875330984592438,
0.16757945716381073,
-0.08286155015230179,
0.12620878219604492,
-0.043161530047655106,
-0.08959498256444931,
-0.006792307365685701,
-0.01597452536225319,
-0.006731878034770489,
0.04616299644112587,
-0.14333383738994598,
0.0057508512400090694,
0.02057884819805622,
0.041700270026922226,
0.03048502467572689,
-0.21388858556747437,
-0.024239467456936836,
0.03716770559549332,
-0.05614430457353592,
-0.024152720347046852,
-0.014681614935398102,
0.008483919315040112,
0.11161094158887863,
0.009395388886332512,
-0.08165767043828964,
0.03860536962747574,
0.0023225529585033655,
-0.09323320537805557,
0.20786625146865845,
-0.09987387806177139,
-0.177507683634758,
-0.12724216282367706,
-0.0869174599647522,
-0.04942665994167328,
0.005800409242510796,
0.08140908181667328,
-0.08998867124319077,
-0.035632047802209854,
-0.08714983612298965,
0.05361071228981018,
-0.026478897780179977,
0.020485734567046165,
-0.00931477453559637,
0.005467129871249199,
0.07148782163858414,
-0.1068321093916893,
-0.010550742037594318,
-0.028594978153705597,
-0.06952670961618423,
0.04909682646393776,
0.01371026411652565,
0.10764691233634949,
0.16021870076656342,
-0.010702939704060555,
0.010088269598782063,
-0.038711413741111755,
0.22551923990249634,
-0.0664982944726944,
-0.02170829474925995,
0.15463389456272125,
0.006073412951081991,
0.050446417182683945,
0.11986939609050751,
0.04899908974766731,
-0.09485725313425064,
0.03240671381354332,
0.03648306429386139,
-0.027474531903862953,
-0.24047426879405975,
-0.04461692273616791,
-0.05867767333984375,
-0.0011098815593868494,
0.10082673281431198,
0.027390575036406517,
0.05665978416800499,
0.04924740269780159,
0.026050029322504997,
0.07746230065822601,
-0.008228834718465805,
0.07430441677570343,
0.1577109396457672,
0.043075282126665115,
0.13313554227352142,
-0.04669973999261856,
-0.06352947652339935,
0.053436361253261566,
-0.022719237953424454,
0.21555668115615845,
0.011074520647525787,
0.13983386754989624,
0.04556947946548462,
0.14611370861530304,
-0.005476788152009249,
0.08137420564889908,
0.011759309098124504,
-0.030006015673279762,
-0.01917077600955963,
-0.04732236638665199,
-0.0360223725438118,
0.03698455169796944,
-0.07986430078744888,
0.03275256231427193,
-0.1303204745054245,
-0.01997787319123745,
0.053116753697395325,
0.26607730984687805,
0.04714173823595047,
-0.32252994179725647,
-0.08779674768447876,
0.01117012556642294,
-0.07186419516801834,
-0.030356787145137787,
0.040847521275281906,
0.07305844873189926,
-0.09125890582799911,
0.05374128744006157,
-0.06312165409326553,
0.11815904080867767,
-0.025520769879221916,
0.056115929037332535,
0.06458058208227158,
0.09541480988264084,
0.0059763588942587376,
0.08937517553567886,
-0.3341118097305298,
0.273664653301239,
-0.002059556543827057,
0.05819923058152199,
-0.07319056987762451,
0.008068498224020004,
0.03131259232759476,
0.06438291072845459,
0.0473862960934639,
-0.015865620225667953,
-0.07743662595748901,
-0.1667374074459076,
-0.04434594884514809,
0.036269959062337875,
0.1093597337603569,
-0.02137850970029831,
0.10802803933620453,
-0.0531647652387619,
0.011460083536803722,
0.08188074827194214,
-0.012651425786316395,
-0.08912403136491776,
-0.09804235398769379,
-0.00008229886589106172,
0.038700103759765625,
0.021456072106957436,
-0.06571077555418015,
-0.0960598737001419,
-0.0919710174202919,
0.1602945327758789,
-0.00317521789111197,
-0.02770010009407997,
-0.11307530105113983,
0.08602918684482574,
0.08251000940799713,
-0.0873035341501236,
0.035519685596227646,
0.013751784339547157,
0.06414133310317993,
0.04660523310303688,
-0.08562692254781723,
0.12735971808433533,
-0.05776212364435196,
-0.16125915944576263,
-0.048476021736860275,
0.11126150190830231,
0.013526054099202156,
0.06218378618359566,
-0.013566842302680016,
0.013206427916884422,
-0.050451453775167465,
-0.08511343598365784,
-0.0025194110348820686,
-0.013302044942975044,
0.05148681625723839,
0.04128497466444969,
-0.06243640556931496,
0.0253794826567173,
-0.06823963671922684,
-0.06534317880868912,
0.20025265216827393,
0.23577773571014404,
-0.07808200269937515,
0.02140640839934349,
0.03552298992872238,
-0.07630413770675659,
-0.18455879390239716,
0.019420742988586426,
0.05136707425117493,
-0.004348257556557655,
0.03566844388842583,
-0.19599425792694092,
0.09696584939956665,
0.11131404340267181,
-0.000749334052670747,
0.0903192088007927,
-0.34434571862220764,
-0.13848331570625305,
0.10997503250837326,
0.12451346963644028,
0.1082644909620285,
-0.15503425896167755,
-0.020216111093759537,
-0.05611126497387886,
-0.11465075612068176,
0.12161717563867569,
-0.10112239420413971,
0.1280665099620819,
-0.023819230496883392,
0.10466403514146805,
0.005752596538513899,
-0.04308310151100159,
0.11633948236703873,
0.0022965925745666027,
0.0909128189086914,
-0.07054311037063599,
0.026317235082387924,
0.02992827445268631,
-0.03968048095703125,
0.028131308034062386,
-0.10599690675735474,
0.031078996136784554,
-0.09278024733066559,
-0.026227504014968872,
-0.06989551335573196,
0.029126297682523727,
-0.03018386848270893,
-0.06478054821491241,
-0.03552638366818428,
0.004873595200479031,
0.06745726615190506,
-0.017035964876413345,
0.14920450747013092,
-0.009661974385380745,
0.14607232809066772,
0.10453540831804276,
0.08204793930053711,
-0.07185349613428116,
-0.03644445538520813,
-0.013581020757555962,
-0.019546067342162132,
0.04936903342604637,
-0.16253416240215302,
0.03704000264406204,
0.1437760442495346,
0.0012874468229711056,
0.15832872688770294,
0.08158743381500244,
-0.025418447330594063,
0.006464207544922829,
0.06567545235157013,
-0.15474767982959747,
-0.09919749200344086,
-0.025584664195775986,
-0.030459990724921227,
-0.10145127028226852,
0.02880363166332245,
0.12071569263935089,
-0.07731739431619644,
-0.00457289582118392,
-0.00466422364115715,
0.016519008204340935,
-0.07103114575147629,
0.18885476887226105,
0.028432408347725868,
0.03745521232485771,
-0.1000598594546318,
0.09011165052652359,
0.053153086453676224,
-0.0949028730392456,
0.019422877579927444,
0.120579294860363,
-0.07216095179319382,
-0.04699169471859932,
0.07859079539775848,
0.16298529505729675,
-0.09694436192512512,
-0.06200016662478447,
-0.13076600432395935,
-0.1347338855266571,
0.09569250047206879,
0.14829637110233307,
0.08834630995988846,
0.019988354295492172,
-0.049096424132585526,
0.007069761864840984,
-0.11363082379102707,
0.07610374689102173,
0.05014042556285858,
0.061561308801174164,
-0.12678302824497223,
0.1645205318927765,
0.010190208442509174,
0.0339370034635067,
-0.02313281036913395,
0.018940340727567673,
-0.09660719335079193,
0.017805349081754684,
-0.15983060002326965,
-0.030873343348503113,
-0.0215214304625988,
0.004096013028174639,
0.0011668180814012885,
-0.04462883621454239,
-0.06410996615886688,
0.010450375266373158,
-0.11772267520427704,
-0.038320064544677734,
0.008581223897635937,
0.061565857380628586,
-0.11213947087526321,
-0.036194104701280594,
0.02679331786930561,
-0.06921803951263428,
0.07866770029067993,
0.051685016602277756,
-0.0007509013521485031,
0.07202283293008804,
-0.1627136766910553,
0.008575732819736004,
0.06014764681458473,
0.020757367834448814,
0.04654374718666077,
-0.08864358812570572,
-0.007287897169589996,
0.011488242074847221,
0.06057080999016762,
0.017396723851561546,
0.0535585917532444,
-0.13332447409629822,
-0.01919073611497879,
-0.02738765999674797,
-0.08717304468154907,
-0.06617236882448196,
0.03235457092523575,
0.05311610922217369,
0.021383948624134064,
0.19159749150276184,
-0.09121115505695343,
0.04094282537698746,
-0.21207036077976227,
0.01269689854234457,
-0.010683136992156506,
-0.1179363951086998,
-0.11079097539186478,
-0.0833214819431305,
0.0646381825208664,
-0.049526147544384,
0.13835695385932922,
0.013938539661467075,
0.06606125086545944,
0.0392594188451767,
-0.026813587173819542,
0.014149357564747334,
0.01851864717900753,
0.24294514954090118,
0.0436602346599102,
-0.03769408166408539,
0.04317165166139603,
0.045611195266246796,
0.11182313412427902,
0.13243086636066437,
0.20265541970729828,
0.15106981992721558,
-0.004482790362089872,
0.1037033423781395,
0.022146547213196754,
-0.0647464171051979,
-0.13838274776935577,
0.0370110459625721,
-0.02339814230799675,
0.12412501871585846,
-0.03279644995927811,
0.22508494555950165,
0.08305475115776062,
-0.15965776145458221,
0.04369501397013664,
-0.06297244131565094,
-0.07636372745037079,
-0.11856593936681747,
-0.037846438586711884,
-0.09124957025051117,
-0.17627376317977905,
-0.011859985068440437,
-0.12314631789922714,
0.052735067903995514,
0.09377046674489975,
0.02014187164604664,
-0.024943307042121887,
0.13014143705368042,
0.019813893362879753,
-0.012680722400546074,
0.06503768265247345,
-0.010475666262209415,
-0.012401197105646133,
-0.09394138306379318,
-0.07807081192731857,
0.0013531779404729605,
-0.01499014999717474,
0.03318779543042183,
-0.03274443745613098,
-0.06292393803596497,
0.04029087722301483,
-0.04703700914978981,
-0.09434250742197037,
0.015373723581433296,
0.028347419574856758,
0.059499386698007584,
0.06756792962551117,
0.012361211702227592,
-0.006211829837411642,
-0.0036378372460603714,
0.26039794087409973,
-0.09298641234636307,
-0.10012805461883545,
-0.09866759181022644,
0.2829018235206604,
0.04436508193612099,
-0.006435185205191374,
0.025082211941480637,
-0.063088059425354,
-0.016117123886942863,
0.25059816241264343,
0.18613237142562866,
-0.08334466069936752,
-0.01813662424683571,
-0.0021689317654818296,
-0.004873001482337713,
-0.017317188903689384,
0.11898332834243774,
0.1444884091615677,
0.03782745078206062,
-0.08692510426044464,
-0.020117919892072678,
-0.04387323930859566,
-0.008751405403017998,
-0.047108352184295654,
0.07702042907476425,
0.03232971951365471,
-0.009958634153008461,
-0.018635371699929237,
0.07245351374149323,
-0.07212771475315094,
-0.06719949841499329,
0.00533032463863492,
-0.1896340548992157,
-0.15188008546829224,
-0.01604846492409706,
0.10683780163526535,
0.012235104106366634,
0.059257932007312775,
-0.023661550134420395,
0.0052063483744859695,
0.07294463366270065,
-0.01678588055074215,
-0.08447858691215515,
-0.08891741186380386,
0.09599345922470093,
-0.1209564059972763,
0.1755521148443222,
-0.03431534394621849,
0.046576038002967834,
0.12998707592487335,
0.06450233608484268,
-0.07641046494245529,
0.08845254778862,
0.0422452948987484,
-0.05578634887933731,
0.04683919623494148,
0.09512516111135483,
-0.04010392725467682,
0.06347641348838806,
0.045892540365457535,
-0.13775339722633362,
0.02199295163154602,
-0.049498945474624634,
-0.05838437378406525,
-0.030103838071227074,
-0.054119061678647995,
-0.05798845365643501,
0.12465329468250275,
0.22525480389595032,
-0.04534515365958214,
0.012444237247109413,
-0.08544706553220749,
0.0004443636571522802,
0.04243725538253784,
0.034183092415332794,
-0.05125967785716057,
-0.22695904970169067,
-0.0030376070644706488,
0.08125648647546768,
-0.009769286960363388,
-0.2559731602668762,
-0.08034367859363556,
-0.002208450110629201,
-0.06215199455618858,
-0.11707119643688202,
0.11131292581558228,
0.09212030470371246,
0.03748684003949165,
-0.046463411301374435,
-0.10093389451503754,
-0.06480686366558075,
0.17370504140853882,
-0.1351722627878189,
-0.0873090922832489
] |
null | null |
adapter-transformers
|
# Adapter `ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR` for ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR
An [adapter](https://adapterhub.ml) for the `ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR` model that was trained on the [other](https://adapterhub.ml/explore/other/) dataset.
This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
## Usage
First, install `adapter-transformers`:
```
pip install -U adapter-transformers
```
_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_
Now, the adapter can be loaded and activated like this:
```python
from transformers import AutoModelWithHeads

# Load the base model with support for adapters and prediction heads.
model = AutoModelWithHeads.from_pretrained("ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR")
# Download the adapter from the Hugging Face Hub and activate it for inference.
adapter_name = model.load_adapter("ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR", source="hf", set_active=True)
```
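A minimal inference sketch (not part of the original card), reusing the `model` loaded above and assuming the adapter ships a token-classification head for tagging chemical and disease mentions; the example sentence and label handling are illustrative:
```python
from transformers import AutoTokenizer
import torch

# Reuse the `model` object created above; load the matching tokenizer.
tokenizer = AutoTokenizer.from_pretrained("ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR")

# Tokenize an example sentence and run it through the model with the adapter active.
sentence = "Cisplatin treatment can cause nephrotoxicity."  # hypothetical example input
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# If the loaded head is a token-classification head, `outputs.logits` has shape
# (batch, sequence_length, num_labels); take the highest-scoring label per token.
predicted_label_ids = outputs.logits.argmax(dim=-1)
print(predicted_label_ids)
```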
## Architecture & Training
<!-- Add some description here -->
## Evaluation results
<!-- Add some description here -->
## Citation
<!-- Add some description here -->
|
{"tags": ["adapter-transformers", "adapterhub:other", "xlm-roberta"], "datasets": ["ghadeermobasher/BC5CDR-Chemical-Disease"]}
| null |
ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR
|
[
"adapter-transformers",
"pytorch",
"xlm-roberta",
"adapterhub:other",
"dataset:ghadeermobasher/BC5CDR-Chemical-Disease",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#adapter-transformers #pytorch #xlm-roberta #adapterhub-other #dataset-ghadeermobasher/BC5CDR-Chemical-Disease #region-us
|
# Adapter 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' for ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR
An adapter for the 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' model that was trained on the other dataset.
This adapter was created for usage with the adapter-transformers library.
## Usage
First, install 'adapter-transformers':
_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More_
Now, the adapter can be loaded and activated like this:
## Architecture & Training
## Evaluation results
|
[
"# Adapter 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' for ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR\n\nAn adapter for the 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' model that was trained on the other dataset.\n\nThis adapter was created for usage with the adapter-transformers library.",
"## Usage\n\nFirst, install 'adapter-transformers':\n\n\n_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More_\n\nNow, the adapter can be loaded and activated like this:",
"## Architecture & Training",
"## Evaluation results"
] |
[
"TAGS\n#adapter-transformers #pytorch #xlm-roberta #adapterhub-other #dataset-ghadeermobasher/BC5CDR-Chemical-Disease #region-us \n",
"# Adapter 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' for ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR\n\nAn adapter for the 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' model that was trained on the other dataset.\n\nThis adapter was created for usage with the adapter-transformers library.",
"## Usage\n\nFirst, install 'adapter-transformers':\n\n\n_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More_\n\nNow, the adapter can be loaded and activated like this:",
"## Architecture & Training",
"## Evaluation results"
] |
[
49,
161,
57,
5,
4
] |
[
"passage: TAGS\n#adapter-transformers #pytorch #xlm-roberta #adapterhub-other #dataset-ghadeermobasher/BC5CDR-Chemical-Disease #region-us \n# Adapter 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' for ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR\n\nAn adapter for the 'ghadeermobasher/BC5CDR-Chemical-Disease-balanced-SapBERT-UMLS-2020AB-all-lang-from-XLMR' model that was trained on the other dataset.\n\nThis adapter was created for usage with the adapter-transformers library.## Usage\n\nFirst, install 'adapter-transformers':\n\n\n_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. More_\n\nNow, the adapter can be loaded and activated like this:## Architecture & Training## Evaluation results"
] |
[
-0.05709964409470558,
-0.0017240591114386916,
-0.006760421674698591,
0.008356505073606968,
0.10799406468868256,
0.043266769498586655,
0.14967617392539978,
0.061734121292829514,
0.13831986486911774,
0.01907431147992611,
-0.01386991236358881,
0.09408372640609741,
0.009312759153544903,
0.07551690191030502,
0.001164184883236885,
-0.07480562478303909,
-0.014049272052943707,
0.04428000748157501,
-0.08170022815465927,
0.08024639636278152,
0.12844596803188324,
-0.06314460188150406,
0.06478525698184967,
0.06786873191595078,
-0.12304917722940445,
0.03076131083071232,
0.023263080045580864,
-0.06066260486841202,
0.09096033126115799,
0.03185920789837837,
0.1590920239686966,
0.06510787457227707,
0.01164168119430542,
-0.1911822259426117,
0.005900914315134287,
0.06958823651075363,
0.0046449098736047745,
0.07008875161409378,
0.022471128031611443,
-0.0779334008693695,
0.05908868461847305,
-0.04715314134955406,
0.06553570926189423,
0.057786740362644196,
-0.0823666900396347,
-0.2350168377161026,
-0.0596514455974102,
0.09018561244010925,
0.03110966458916664,
0.044943876564502716,
0.024941645562648773,
-0.005534284748136997,
-0.02648947387933731,
0.046386536210775375,
0.17862944304943085,
-0.22071579098701477,
0.011354767717421055,
0.04285541921854019,
0.08656762540340424,
0.05685196816921234,
-0.054267141968011856,
-0.009114481508731842,
-0.05626348778605461,
0.04284479096531868,
0.1142992451786995,
-0.03658808395266533,
0.14463545382022858,
0.034029364585876465,
-0.15113656222820282,
-0.01721028983592987,
0.1854104846715927,
-0.045566484332084656,
-0.0442182682454586,
-0.10983221232891083,
-0.05355263501405716,
0.013491871766746044,
0.021457994356751442,
-0.08698607236146927,
-0.012050758115947247,
-0.0008625355549156666,
0.032733023166656494,
-0.09486038982868195,
-0.1048964262008667,
-0.043901924043893814,
-0.13203692436218262,
0.25478821992874146,
-0.02068820223212242,
0.010879998095333576,
-0.01467041578143835,
0.13741448521614075,
0.06712709367275238,
-0.11380945891141891,
-0.07877431809902191,
-0.031375471502542496,
-0.1126226857304573,
-0.053357142955064774,
-0.016649283468723297,
-0.10410774499177933,
0.017373280599713326,
0.17163611948490143,
0.009276271797716618,
0.04057725518941879,
-0.023578789085149765,
0.024306900799274445,
-0.0023447342682629824,
0.2038232535123825,
-0.02737458422780037,
0.03726521506905556,
-0.015576192177832127,
0.05824155732989311,
-0.026537783443927765,
-0.0139476228505373,
-0.040410663932561874,
-0.07505372166633606,
0.012085657566785812,
0.04942755028605461,
0.017028074711561203,
0.064653679728508,
-0.07214336097240448,
-0.0781976729631424,
0.13436689972877502,
-0.1468387246131897,
0.002865258837118745,
0.027141695842146873,
-0.0025996046606451273,
0.04701186716556549,
0.1520286500453949,
-0.01540757529437542,
-0.006377785000950098,
0.09257762134075165,
-0.05854155123233795,
-0.008753887377679348,
-0.05709126964211464,
-0.07974333316087723,
0.016612648963928223,
-0.009681458584964275,
-0.04977015033364296,
-0.12172457575798035,
-0.09319902211427689,
0.04127223417162895,
0.08129684627056122,
0.00028678489616140723,
0.07217130064964294,
0.061358045786619186,
0.04459797963500023,
0.044347893446683884,
-0.02299201488494873,
-0.025649404153227806,
-0.015678538009524345,
0.05539458617568016,
0.07391729950904846,
0.04578321427106857,
-0.012092492543160915,
0.06342588365077972,
-0.052202243357896805,
0.06737751513719559,
-0.20962685346603394,
0.10195686668157578,
-0.10686783492565155,
-0.07348531484603882,
-0.11968554556369781,
0.013918986544013023,
0.001707615563645959,
0.019920561462640762,
0.017466500401496887,
0.04325185343623161,
-0.09510835260152817,
-0.08266168087720871,
0.07016650587320328,
-0.1234365701675415,
-0.0891324058175087,
0.01090590562671423,
-0.00642053596675396,
0.10970287770032883,
0.02470785565674305,
0.05189013481140137,
0.11615575104951859,
-0.14032712578773499,
-0.10742680728435516,
0.060236625373363495,
-0.05679216980934143,
0.02939257211983204,
0.04378844425082207,
0.010932082310318947,
-0.018348461017012596,
0.024221451953053474,
-0.15506674349308014,
0.08535376936197281,
-0.0011579323327168822,
-0.042226750403642654,
-0.05697983130812645,
-0.016576720401644707,
0.10252827405929565,
-0.021961933001875877,
-0.03760688379406929,
0.06231432780623436,
-0.13588044047355652,
0.20437021553516388,
0.09740708768367767,
-0.062365252524614334,
0.011478505097329617,
-0.11740295588970184,
0.006312580779194832,
-0.10686513036489487,
0.0037303632125258446,
-0.1819395273923874,
-0.07199093699455261,
0.03160508722066879,
-0.12101995199918747,
-0.0009192457655444741,
0.04045594483613968,
0.09882789105176926,
0.022964641451835632,
-0.007751759607344866,
-0.056590527296066284,
-0.035769958049058914,
0.023029210045933723,
-0.00855492427945137,
-0.1256348192691803,
-0.06223846971988678,
-0.045260220766067505,
0.1655627340078354,
-0.11613096296787262,
0.034355856478214264,
0.05685390532016754,
0.12163355201482773,
0.00717266695573926,
-0.026466574519872665,
-0.04607119783759117,
-0.02092740125954151,
-0.02495906502008438,
-0.022973613813519478,
0.009319830685853958,
0.0038650655187666416,
-0.11111853271722794,
0.12293390184640884,
-0.15055148303508759,
-0.03375210240483284,
0.07749418169260025,
0.06724816560745239,
-0.044215645641088486,
-0.06743904203176498,
-0.02587064728140831,
-0.061129841953516006,
0.049850404262542725,
-0.0664065033197403,
0.16718553006649017,
0.046820610761642456,
0.09369127452373505,
-0.01656935177743435,
-0.077617347240448,
-0.003730570897459984,
-0.025530381128191948,
-0.033740464597940445,
0.0216528307646513,
-0.035268962383270264,
-0.09687391668558121,
0.03625786677002907,
0.2206438034772873,
-0.043537043035030365,
0.1018221452832222,
-0.021670421585440636,
-0.05947065353393555,
-0.10196211189031601,
-0.005652726162225008,
0.021708814427256584,
0.09656113386154175,
-0.06989774107933044,
0.030865831300616264,
0.043969277292490005,
-0.014505036175251007,
0.026172848418354988,
-0.0952637791633606,
0.06969095766544342,
0.05002271756529808,
-0.04822589457035065,
0.010157238692045212,
0.02684757299721241,
0.026822451502084732,
0.04950118437409401,
-0.01214676070958376,
0.10150459408760071,
0.050711341202259064,
-0.007188811432570219,
-0.06642104685306549,
0.15481165051460266,
-0.10913941264152527,
-0.2302769422531128,
-0.17858366668224335,
-0.1267516016960144,
-0.029290461912751198,
0.005004344508051872,
0.00516625726595521,
-0.08545161783695221,
-0.06542248278856277,
0.006065279711037874,
0.14006073772907257,
-0.03779356926679611,
-0.0005765205714851618,
0.03148951381444931,
-0.019538873806595802,
0.1030297800898552,
-0.14522866904735565,
-0.023896241560578346,
0.03515845909714699,
-0.07647915184497833,
0.025629229843616486,
0.08420481532812119,
0.030449729412794113,
0.12681615352630615,
-0.013573513366281986,
0.01709633506834507,
0.02303382195532322,
0.07214632630348206,
-0.06393051147460938,
0.025027912110090256,
0.1836332529783249,
-0.10670789331197739,
0.04350137710571289,
0.05976986885070801,
0.050211649388074875,
0.012067115865647793,
-0.0007435676525346935,
0.03109160251915455,
-0.0169590525329113,
-0.22325895726680756,
-0.03829333558678627,
-0.0139057207852602,
-0.06476624310016632,
0.0780143067240715,
0.05244678631424904,
0.04552707448601723,
0.039960622787475586,
-0.02268536388874054,
-0.04279790818691254,
0.013502098619937897,
0.05344688519835472,
0.20220769941806793,
-0.014500278048217297,
0.10928518325090408,
-0.046565305441617966,
-0.012083534151315689,
0.041770823299884796,
0.03520786389708519,
0.14506028592586517,
-0.008807497099041939,
0.14113575220108032,
0.08485502004623413,
-0.026485268026590347,
0.05664629861712456,
0.07219752669334412,
-0.01723446510732174,
-0.016607919707894325,
0.04806051775813103,
-0.07827520370483398,
-0.02806648053228855,
0.03268294036388397,
0.011981348507106304,
-0.008647921495139599,
-0.0034056641161441803,
0.0057151117362082005,
0.06036325916647911,
0.1161174327135086,
-0.05469143018126488,
-0.17302517592906952,
-0.08486633002758026,
-0.004362167790532112,
-0.010480117052793503,
-0.04681940749287605,
-0.06017689034342766,
-0.014531156048178673,
-0.051087167114019394,
0.036149606108665466,
-0.04707486927509308,
0.030603554099798203,
-0.11743368953466415,
-0.012367632240056992,
0.12510815262794495,
0.12190711498260498,
0.025470511987805367,
0.05345860868692398,
-0.3130275011062622,
0.059545159339904785,
0.0534854456782341,
0.02130848541855812,
-0.05363589897751808,
0.08790386468172073,
0.0026803952641785145,
0.026623912155628204,
0.05729222297668457,
0.018278196454048157,
-0.0003511336981318891,
-0.13300520181655884,
-0.05586713179945946,
0.01074141263961792,
0.07664834707975388,
-0.10221323370933533,
0.09386249631643295,
-0.05577435716986656,
0.012504368089139462,
0.07117850333452225,
-0.040200091898441315,
-0.1449933499097824,
-0.1897374838590622,
0.1302804797887802,
-0.0022950212005525827,
-0.018455026671290398,
-0.10448332875967026,
-0.057837292551994324,
0.0215375404804945,
0.20631015300750732,
-0.15853120386600494,
-0.06534317880868912,
-0.1021728515625,
-0.03583507239818573,
0.1148051992058754,
-0.08363540470600128,
0.06112320348620415,
-0.027709510177373886,
0.06642226129770279,
-0.027891801670193672,
-0.15951478481292725,
0.050051458179950714,
-0.06906282156705856,
-0.03158097714185715,
-0.03027595393359661,
0.0966886579990387,
0.08145933598279953,
0.016095688566565514,
-0.003180866129696369,
0.0029856604523956776,
-0.011057551950216293,
-0.049771398305892944,
0.02281847409904003,
0.10712436586618423,
-0.04892587661743164,
0.08120790123939514,
-0.12437774240970612,
-0.038372065871953964,
-0.003754157805815339,
-0.02935185097157955,
0.07578206062316895,
0.1557493656873703,
-0.06573351472616196,
0.1066567450761795,
0.0815383791923523,
-0.0860152468085289,
-0.17533260583877563,
-0.044107548892498016,
0.08456140756607056,
0.027930017560720444,
0.023939024657011032,
-0.254520982503891,
0.17634405195713043,
0.06010700389742851,
-0.025740299373865128,
0.0609460212290287,
-0.18944917619228363,
-0.08437541872262955,
0.16278934478759766,
0.00968673825263977,
-0.003468963783234358,
-0.07306920737028122,
-0.05818839371204376,
-0.012912816368043423,
-0.18422892689704895,
0.12099847942590714,
0.009002272970974445,
0.0826205313205719,
-0.0623636357486248,
0.044672418385744095,
0.04467853158712387,
-0.042611900717020035,
0.12827646732330322,
0.008807007223367691,
0.023433014750480652,
-0.04332061856985092,
-0.0015393644571304321,
0.14500491321086884,
-0.04579970985651016,
0.05394785851240158,
-0.06990916281938553,
0.015164408832788467,
-0.02814149297773838,
-0.06694187968969345,
-0.04202376678586006,
0.12017202377319336,
-0.03716062009334564,
-0.05206964910030365,
0.005696606822311878,
0.009613942354917526,
-0.013594004325568676,
-0.05519993230700493,
-0.03361310437321663,
-0.03614416718482971,
0.02213764563202858,
0.15303942561149597,
0.09578659385442734,
0.02877753973007202,
-0.15383701026439667,
-0.0033987907227128744,
0.009871378540992737,
0.06494702398777008,
-0.06492000073194504,
0.09326338768005371,
0.07166661322116852,
0.02129712514579296,
0.11932138353586197,
0.05963783711194992,
-0.09539692848920822,
-0.049812860786914825,
0.09501342475414276,
-0.09025858342647552,
-0.09349989145994186,
-0.02184258960187435,
-0.006516164634376764,
-0.09925244748592377,
0.015493247658014297,
0.1721349060535431,
0.04473906755447388,
-0.009133691899478436,
0.03017597086727619,
-0.0046708472073078156,
-0.0822063609957695,
0.10721447318792343,
-0.0022733460646122694,
0.058255065232515335,
-0.046078942716121674,
0.0574236698448658,
0.10268542915582657,
-0.07356244325637817,
-0.017196862027049065,
-0.023744070902466774,
-0.06776352226734161,
-0.03449787572026253,
-0.10637293756008148,
0.08239268511533737,
-0.12052294611930847,
-0.017748557031154633,
-0.03733768314123154,
-0.11210981756448746,
-0.005520608276128769,
0.1222788542509079,
0.04938198998570442,
0.0749531164765358,
-0.020442882552742958,
0.05522744730114937,
-0.0502079576253891,
0.11780241131782532,
-0.014607321470975876,
0.08432989567518234,
-0.11639691144227982,
0.0517314188182354,
-0.008305180817842484,
0.05766671150922775,
-0.033070143312215805,
-0.0011547390604391694,
-0.08022720366716385,
-0.046315353363752365,
-0.1946379840373993,
-0.03672740235924721,
-0.03790511190891266,
-0.03312128037214279,
0.03806869685649872,
-0.012806053273379803,
-0.00012092581891920418,
0.04475787281990051,
-0.025770939886569977,
-0.038262512534856796,
0.024173282086849213,
0.08947764337062836,
-0.14068837463855743,
-0.05847575142979622,
0.041432105004787445,
-0.08508659154176712,
0.1103091835975647,
0.12885500490665436,
-0.004080594051629305,
0.016847161576151848,
-0.019562246277928352,
-0.010758144780993462,
0.010049542412161827,
0.05806145817041397,
-0.0029169837944209576,
-0.15687395632266998,
0.04949779063463211,
-0.010300405323505402,
-0.0340895913541317,
-0.0024966371711343527,
0.18418920040130615,
-0.09478477388620377,
-0.04005303233861923,
-0.056629057973623276,
0.024586185812950134,
-0.08789627999067307,
0.04808858409523964,
0.11161445081233978,
0.12730900943279266,
0.10943781584501266,
-0.0656861960887909,
0.014930450357496738,
-0.1711852252483368,
0.006890491116791964,
0.009357541799545288,
-0.026992134749889374,
0.04355745017528534,
-0.10294245183467865,
-0.0055838702246546745,
-0.04203152656555176,
0.20317594707012177,
-0.05324769392609596,
-0.03788493946194649,
0.04965736344456673,
-0.052367888391017914,
-0.0061729527078568935,
-0.01602005399763584,
0.18461576104164124,
-0.007811895105987787,
0.012520985677838326,
0.053980715572834015,
0.0370059534907341,
-0.035633329302072525,
-0.0249389186501503,
0.0596623569726944,
0.1586551070213318,
-0.0596914179623127,
-0.01663229987025261,
0.0977889895439148,
-0.12335249781608582,
-0.06080348417162895,
0.04443170502781868,
-0.11613602191209793,
0.04222957417368889,
-0.03375420719385147,
0.16319863498210907,
0.08640579134225845,
-0.07632464170455933,
0.08815552294254303,
0.010834944434463978,
-0.05932902917265892,
-0.08021816611289978,
-0.06487874686717987,
-0.039730384945869446,
-0.1058647409081459,
-0.02896341308951378,
-0.06857127696275711,
0.009163114242255688,
0.04339155554771423,
0.006427539046853781,
0.010510138235986233,
0.197688028216362,
0.007405879441648722,
-0.049495499581098557,
0.009052286855876446,
-0.019339127466082573,
-0.01325658243149519,
-0.027189087122678757,
0.05681247636675835,
0.05334121361374855,
0.06986162066459656,
0.03515038266777992,
0.010313848964869976,
0.0014142256695777178,
0.052600063383579254,
-0.021905794739723206,
-0.1021217331290245,
-0.028870586305856705,
0.015075872652232647,
-0.05386678874492645,
0.10741426795721054,
0.00617431104183197,
-0.014996849000453949,
-0.04373196139931679,
0.13760322332382202,
-0.040442269295454025,
-0.11926679313182831,
-0.08978541195392609,
0.2027815729379654,
-0.007147061172872782,
0.019381754100322723,
-0.0000315379656967707,
-0.05313480645418167,
-0.03927988559007645,
0.05101759359240532,
0.08686158806085587,
-0.02249406836926937,
0.04073944687843323,
0.023158561438322067,
0.0032851274590939283,
-0.019729234278202057,
0.034285563975572586,
0.07259528338909149,
0.13552601635456085,
-0.006932777352631092,
-0.028045613318681717,
-0.039125047624111176,
0.0026102217379957438,
0.007887715473771095,
0.12975487112998962,
-0.009401117451488972,
0.002437206218019128,
-0.03021126613020897,
-0.006995628122240305,
0.045906055718660355,
-0.1288379579782486,
-0.035848282277584076,
-0.05758027359843254,
-0.06577452272176743,
-0.06441434472799301,
0.09906353056430817,
-0.1057644933462143,
-0.005475426092743874,
-0.01098344475030899,
-0.06708341091871262,
0.17835944890975952,
0.01496396865695715,
-0.10894446074962616,
0.04583858326077461,
0.09540308266878128,
-0.06580455601215363,
0.10099142044782639,
-0.0032412465661764145,
0.0865158885717392,
0.08822203427553177,
0.05722292512655258,
-0.10151024162769318,
0.026951106265187263,
-0.004044315777719021,
-0.16808772087097168,
0.007003387436270714,
0.1181681752204895,
-0.016862710937857628,
0.04661501199007034,
0.04532921686768532,
-0.11525308340787888,
0.041340749710798264,
0.020584261044859886,
-0.10942662507295609,
-0.06104186922311783,
-0.0443393774330616,
-0.012919400818645954,
0.11899132281541824,
0.16281884908676147,
0.01129991840571165,
-0.01497175358235836,
-0.1031784638762474,
0.029014721512794495,
0.06966707855463028,
0.14561381936073303,
-0.042712852358818054,
-0.12210328131914139,
0.041014865040779114,
0.0364338681101799,
0.042186036705970764,
-0.21799324452877045,
-0.09984489530324936,
0.040422383695840836,
-0.051370903849601746,
-0.006789256352931261,
0.09687883406877518,
0.10169710963964462,
0.12999139726161957,
-0.02094840258359909,
-0.02835661917924881,
0.0026732077822089195,
0.08398845791816711,
-0.21550266444683075,
-0.03167225047945976
] |
null | null |
transformers
|
A fake news detector using RoBERTa.
Dataset: https://www.kaggle.com/clmentbisaillon/fake-and-real-news-dataset
Training used hyperparameter search with 10 trials.
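A minimal usage sketch (not from the original card) with the `transformers` text-classification pipeline; the example headline is made up, and the returned label names depend on the label mapping stored in the model config:
```python
from transformers import pipeline

# Load the fine-tuned RoBERTa fake-news classifier from the Hugging Face Hub.
classifier = pipeline("text-classification", model="ghanashyamvtatti/roberta-fake-news")

# Classify an example headline; label names/ids come from the model config.
result = classifier("Scientists discover a new species of deep-sea fish.")
print(result)  # e.g. [{'label': '...', 'score': 0.99}]
```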
|
{}
|
text-classification
|
ghanashyamvtatti/roberta-fake-news
|
[
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tf #jax #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us
|
A fake news detector using RoBERTa.
Dataset: URL
Training involved using hyperparameter search with 10 trials.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
43
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.009321261197328568,
0.03833454102277756,
-0.007458383683115244,
0.04509740695357323,
0.19073563814163208,
0.044316064566373825,
0.058454908430576324,
0.1287291944026947,
0.022551734000444412,
-0.051579687744379044,
0.10078538954257965,
0.24138477444648743,
-0.027322381734848022,
0.13276614248752594,
-0.10855449736118317,
-0.2971561849117279,
0.04076668992638588,
0.051133155822753906,
-0.021313859149813652,
0.11753110587596893,
0.10417038202285767,
-0.05905111879110336,
0.07596512883901596,
-0.046419013291597366,
-0.15378999710083008,
0.048159681260585785,
0.06594772636890411,
-0.13833029568195343,
0.10127073526382446,
0.07382937520742416,
0.12588348984718323,
0.04158720746636391,
-0.06650744378566742,
-0.11784545332193375,
0.0489172525703907,
0.011088315397500992,
-0.1050376296043396,
0.041167519986629486,
0.06748871505260468,
-0.11690855771303177,
0.07884424179792404,
0.055323585867881775,
0.027410900220274925,
0.06450454145669937,
-0.16225191950798035,
-0.10423062741756439,
-0.016300354152917862,
0.05403559282422066,
0.041660375893116,
0.054933350533246994,
0.002406331244856119,
0.165925994515419,
-0.11779409646987915,
0.14394643902778625,
0.10201145708560944,
-0.2930733859539032,
-0.03548481687903404,
0.08707228302955627,
0.023323284462094307,
0.06098324805498123,
-0.04945063963532448,
0.048939675092697144,
0.029580453410744667,
0.010480081662535667,
0.018018465489149094,
-0.09019327163696289,
-0.17499925196170807,
0.024893369525671005,
-0.07776133716106415,
-0.03775801882147789,
0.22313113510608673,
-0.04041024297475815,
0.05340717360377312,
-0.0007737316773273051,
-0.09091417491436005,
-0.030990980565547943,
-0.02418326400220394,
-0.002275970298796892,
-0.049730271100997925,
0.06232700124382973,
0.026058444753289223,
-0.013186926953494549,
-0.09809423238039017,
0.023618433624505997,
-0.21575523912906647,
0.20315684378147125,
0.011073125526309013,
0.046705398708581924,
-0.18286055326461792,
0.0459989495575428,
0.014395499601960182,
-0.10498945415019989,
0.04919755086302757,
-0.10466555505990982,
-0.0019425388891249895,
-0.043636951595544815,
-0.04863319173455238,
-0.06329197436571121,
0.09377274662256241,
0.1529233455657959,
0.03608943521976471,
0.05445031076669693,
-0.045937035232782364,
0.07743069529533386,
0.02620675228536129,
0.10140315443277359,
0.009603774175047874,
-0.037173379212617874,
0.07036366313695908,
-0.1295900046825409,
-0.02339029870927334,
-0.07223007827997208,
-0.1485622376203537,
-0.02866498939692974,
0.07995997369289398,
0.08356259763240814,
0.02166367881000042,
0.10404128581285477,
-0.0419488288462162,
-0.03934840112924576,
0.04686550423502922,
-0.07738121598958969,
-0.005868424661457539,
-0.0012271626619622111,
0.03016207553446293,
0.08848854154348373,
-0.002339733298867941,
-0.008075385354459286,
-0.07455407083034515,
0.09171751886606216,
-0.06258238852024078,
-0.01459585502743721,
-0.014811906032264233,
-0.07497330754995346,
0.037943750619888306,
-0.12400772422552109,
0.039442531764507294,
-0.19917459785938263,
-0.09627019613981247,
0.012180755846202374,
0.017002997919917107,
-0.009292177855968475,
-0.052401572465896606,
-0.009447808377444744,
-0.012960529886186123,
0.04553200304508209,
-0.034451503306627274,
-0.03935784101486206,
-0.07009398192167282,
0.10774018615484238,
-0.036607593297958374,
0.08322080969810486,
-0.1176370233297348,
0.06356711685657501,
-0.08634956181049347,
-0.039686016738414764,
-0.14588141441345215,
0.040148086845874786,
-0.04176180064678192,
0.18446800112724304,
0.007176793646067381,
-0.034750938415527344,
-0.06632345169782639,
0.054496005177497864,
-0.08303962647914886,
0.16552738845348358,
-0.09155189990997314,
-0.1015721783041954,
0.23152711987495422,
-0.08864890784025192,
-0.1539284735918045,
0.0929335504770279,
-0.008485768921673298,
0.04341743513941765,
0.09651672095060349,
0.19637277722358704,
0.11013047397136688,
0.0019564928952604532,
0.11539260298013687,
0.11205906420946121,
-0.10410775244235992,
-0.07545039057731628,
-0.0041510057635605335,
0.00875872652977705,
-0.1460372507572174,
0.048308901488780975,
0.08407088369131088,
0.09102927148342133,
-0.05070752277970314,
-0.0368100069463253,
-0.016935158520936966,
-0.013283477164804935,
0.09212461113929749,
0.05224981531500816,
0.10786306113004684,
-0.09071997553110123,
-0.003641280811280012,
-0.040592778474092484,
-0.0010869416873902082,
0.037917427718639374,
0.014790180139243603,
-0.07004217803478241,
0.10545012354850769,
0.039877865463495255,
0.03945983201265335,
-0.21580500900745392,
-0.10288207978010178,
-0.015250361524522305,
0.16243277490139008,
0.004942703060805798,
0.11983106285333633,
0.04601852968335152,
-0.06990334391593933,
-0.028971347957849503,
-0.006388726178556681,
0.16697250306606293,
0.022194530814886093,
-0.028899136930704117,
-0.09790395945310593,
0.08510036021471024,
-0.06988522410392761,
0.019481709226965904,
-0.07692243903875351,
0.015266002155840397,
0.0883803740143776,
0.11349862813949585,
0.031224610283970833,
0.07150533050298691,
-0.008004596456885338,
0.0458093099296093,
-0.06443936377763748,
0.00450447853654623,
0.0957048088312149,
0.00033932068618014455,
-0.06369208544492722,
0.17560070753097534,
-0.13162904977798462,
0.32559719681739807,
0.19696497917175293,
-0.26217126846313477,
-0.051548831164836884,
0.010724123567342758,
-0.008646144531667233,
0.025664126500487328,
0.05032195523381233,
0.004741590935736895,
0.06533849239349365,
-0.030018649995326996,
0.19591739773750305,
-0.04492192715406418,
-0.055207397788763046,
0.004895183723419905,
-0.036748845130205154,
-0.025998972356319427,
0.0950174331665039,
0.059740208089351654,
-0.2587147653102875,
0.1858753114938736,
0.21288014948368073,
0.028769975528120995,
0.19745883345603943,
-0.01727685146033764,
0.0362655408680439,
0.0803036317229271,
-0.03968440368771553,
-0.014263233169913292,
-0.06441580504179001,
-0.161104217171669,
-0.03264031931757927,
0.06877408176660538,
0.01929817534983158,
0.047901153564453125,
-0.0910627692937851,
-0.05571477860212326,
-0.0020206233020871878,
0.01679692231118679,
-0.013326429761946201,
0.11468371748924255,
0.06406453251838684,
0.13502994179725647,
-0.0016402006149291992,
-0.07701896876096725,
0.10393793135881424,
0.008926604874432087,
-0.09246701747179031,
0.18543438613414764,
-0.1490211933851242,
-0.34924226999282837,
-0.11350839585065842,
-0.16895651817321777,
-0.008474688977003098,
0.03288592770695686,
0.10130787640810013,
-0.1030014231801033,
-0.03007487766444683,
0.0308525487780571,
-0.010349035263061523,
-0.07138746231794357,
0.04646311327815056,
-0.07447469979524612,
0.08255894482135773,
-0.04587502032518387,
-0.06014982610940933,
-0.07149641215801239,
-0.021236412227153778,
-0.02701779641211033,
0.15794764459133148,
-0.1391502469778061,
0.08623568713665009,
0.16595755517482758,
-0.020067209377884865,
0.06996379792690277,
-0.05299761891365051,
0.18056757748126984,
-0.1120925024151802,
0.007414393126964569,
0.17841097712516785,
-0.06571152061223984,
0.061808329075574875,
0.14389599859714508,
0.017920805141329765,
-0.06319127976894379,
0.03352314978837967,
-0.02789968065917492,
-0.09262390434741974,
-0.22048231959342957,
-0.1109093502163887,
-0.1310063898563385,
0.06482371687889099,
0.04772242531180382,
0.07774230092763901,
0.16355474293231964,
0.05206412076950073,
0.011895610950887203,
0.007102332077920437,
0.019119808450341225,
0.07621041685342789,
0.2204350233078003,
0.012580770067870617,
0.13032104074954987,
-0.06767912209033966,
-0.12088239938020706,
0.10234188288450241,
-0.004204527009278536,
0.08419796079397202,
0.07917173206806183,
0.005538177210837603,
0.01378991361707449,
0.07539360225200653,
0.14305070042610168,
0.10576064884662628,
0.026446959003806114,
-0.03137886896729469,
-0.0356302447617054,
-0.010866614989936352,
-0.04261491820216179,
0.007755958009511232,
0.0462922677397728,
-0.13628387451171875,
-0.0749204233288765,
-0.12924832105636597,
0.09153498709201813,
0.09460277110338211,
0.04153214395046234,
-0.19113317131996155,
0.008756885305047035,
0.07322221994400024,
-0.035650670528411865,
-0.0991269052028656,
0.07300576567649841,
-0.043313167989254,
-0.1552319973707199,
0.11367128044366837,
-0.024095997214317322,
0.12887953221797943,
-0.049325887113809586,
0.07350627332925797,
-0.030443841591477394,
-0.10093050450086594,
0.011503707617521286,
0.09925934672355652,
-0.29933327436447144,
0.22340725362300873,
0.015824979171156883,
-0.07107891142368317,
-0.0699729472398758,
-0.03219805285334587,
0.05306370556354523,
0.20534342527389526,
0.08746523410081863,
0.0022704557050019503,
-0.06334558129310608,
-0.13034701347351074,
0.016016768291592598,
0.007824087515473366,
0.11142180114984512,
-0.0351637564599514,
-0.018165161833167076,
-0.05461844429373741,
-0.028345316648483276,
-0.020483968779444695,
-0.02581126056611538,
0.01464915368705988,
-0.15178021788597107,
0.06377823650836945,
0.015195479616522789,
0.037542786449193954,
0.01475437916815281,
-0.042067740112543106,
-0.09104583412408829,
0.18582181632518768,
-0.046456169337034225,
-0.07273619621992111,
-0.13419097661972046,
-0.06647150218486786,
0.0190085731446743,
-0.08803398162126541,
0.0577816367149353,
-0.0850016176700592,
0.008287896402180195,
-0.05328083038330078,
-0.22105200588703156,
0.136966273188591,
-0.10764653980731964,
-0.03392641618847847,
-0.05488676577806473,
0.1371820569038391,
-0.0902433767914772,
0.020884796977043152,
0.03147709742188454,
0.008263466879725456,
-0.06647293269634247,
-0.06925084441900253,
0.002256006235256791,
-0.033020611852407455,
0.0188447255641222,
0.04305266961455345,
-0.09435202926397324,
-0.06753315776586533,
-0.03547960892319679,
-0.006159116514027119,
0.28110551834106445,
0.16856764256954193,
-0.06638824939727783,
0.1532687097787857,
0.09365438669919968,
-0.060475338250398636,
-0.3188568949699402,
-0.07566292583942413,
-0.1023324504494667,
-0.051819462329149246,
-0.02177719958126545,
-0.14433974027633667,
0.0836372897028923,
-0.015517358668148518,
-0.003576214425265789,
0.10677479952573776,
-0.1557193398475647,
-0.08037810772657394,
0.17019356787204742,
-0.012521365657448769,
0.36895620822906494,
-0.12660299241542816,
-0.0813005194067955,
-0.032633934170007706,
-0.12180130183696747,
0.13972099125385284,
-0.03457560017704964,
0.07809701561927795,
-0.00970024149864912,
0.022823013365268707,
0.04347782954573631,
-0.015697721391916275,
0.06755752861499786,
-0.015534134581685066,
0.021747516468167305,
-0.12256232649087906,
-0.09800358861684799,
0.050938498228788376,
-0.0012164103100076318,
-0.014525603502988815,
-0.010069524869322777,
0.01889680325984955,
-0.15399169921875,
-0.029489589855074883,
-0.09314239770174026,
0.05357936769723892,
0.02643832191824913,
-0.045520585030317307,
0.0042415643110871315,
0.002437808783724904,
-0.0034311413764953613,
-0.02365579456090927,
0.23234309256076813,
-0.0613761804997921,
0.20948292315006256,
0.08876118808984756,
0.1404688060283661,
-0.136032372713089,
0.033540934324264526,
-0.06812077760696411,
-0.06497341394424438,
0.053064778447151184,
-0.10509978234767914,
0.0677591860294342,
0.10358286648988724,
-0.04510800912976265,
0.06881155073642731,
0.10522960871458054,
0.034270647913217545,
-0.03468047454953194,
0.17221061885356903,
-0.23245227336883545,
0.023621613159775734,
-0.07834894210100174,
-0.058216724544763565,
0.07705168426036835,
0.03639107942581177,
0.1447460949420929,
0.04016062244772911,
-0.05073674023151398,
0.010490224696695805,
-0.015589777380228043,
-0.009692923165857792,
0.04736130312085152,
0.09083392471075058,
0.0294950008392334,
-0.12706077098846436,
0.04262249544262886,
0.05070679262280464,
-0.14541703462600708,
-0.005537425167858601,
0.1815081387758255,
-0.15697652101516724,
-0.1299753189086914,
0.0037433821707963943,
0.12616120278835297,
-0.12407790124416351,
-0.026786144822835922,
-0.07814116030931473,
-0.12194111943244934,
0.0694480761885643,
0.2583850920200348,
0.09926041215658188,
0.08048375695943832,
-0.024284666404128075,
-0.05153515934944153,
0.01066920068114996,
0.01013757660984993,
0.0006234251195564866,
0.019149107858538628,
-0.12915752828121185,
0.04823042079806328,
-0.020864631980657578,
0.1746288686990738,
-0.09802678972482681,
-0.06294061988592148,
-0.1864650994539261,
0.03237463906407356,
-0.053476810455322266,
-0.036729440093040466,
-0.05430159717798233,
-0.016365505754947662,
0.00541598629206419,
-0.057709649205207825,
-0.04378709942102432,
-0.058432240039110184,
-0.12446195632219315,
0.03069818764925003,
-0.005234524141997099,
0.04515013471245766,
-0.050289690494537354,
-0.05494287237524986,
0.08799003064632416,
-0.027647079899907112,
0.11279013752937317,
0.09474439918994904,
-0.08407987654209137,
0.11205213516950607,
-0.1418243944644928,
-0.10161864757537842,
0.11592499911785126,
0.020127184689044952,
0.07100807130336761,
0.06749971956014633,
0.04911384359002113,
0.04418381303548813,
0.006045393645763397,
0.07308679074048996,
0.04334741458296776,
-0.12134910374879837,
0.049183059483766556,
-0.02765093371272087,
-0.17561137676239014,
-0.04460466653108597,
-0.02754919044673443,
0.09643793851137161,
-0.010431152768433094,
0.12309997528791428,
-0.06451715528964996,
0.08575484156608582,
-0.05909856781363487,
0.004743561614304781,
-0.023746976628899574,
-0.20988520979881287,
-0.06822951883077621,
-0.07612072676420212,
0.026603244245052338,
-0.006439046002924442,
0.1916307955980301,
0.07230682671070099,
0.03344825655221939,
0.05904145911335945,
0.04383727163076401,
0.013986093923449516,
0.0170248094946146,
0.15913496911525726,
0.07887425273656845,
-0.05814015492796898,
-0.07122661918401718,
0.06160949543118477,
0.010819459334015846,
-0.013046654872596264,
0.14506031572818756,
0.06500440090894699,
-0.03205122798681259,
0.05159856379032135,
-0.005223423708230257,
0.04337305203080177,
-0.05414113774895668,
-0.16707244515419006,
-0.01842368394136429,
0.07157707214355469,
0.011008630506694317,
0.01351366750895977,
0.14977091550827026,
-0.020792176946997643,
0.04620819538831711,
-0.051317762583494186,
-0.03894871845841408,
-0.1834738552570343,
-0.08593132346868515,
-0.10249766707420349,
-0.0940849557518959,
0.006609790492802858,
-0.08558192849159241,
0.0015269372379407287,
0.042883072048425674,
0.05846875160932541,
-0.05476118251681328,
0.05724756792187691,
0.035331420600414276,
-0.08263092488050461,
0.1021917536854744,
-0.028148632496595383,
0.027034642174839973,
-0.011987937614321709,
-0.010573923587799072,
-0.1295742243528366,
-0.022367345169186592,
-0.06645791232585907,
0.03137175738811493,
-0.054285693913698196,
0.010825966484844685,
-0.16240820288658142,
-0.12045063823461533,
-0.015026298351585865,
0.04872824251651764,
-0.04438084363937378,
0.10731632262468338,
0.018155314028263092,
0.0012577339075505733,
0.05467524379491806,
0.2205696851015091,
-0.043988604098558426,
-0.06197338551282883,
-0.06621558964252472,
0.24015583097934723,
0.06498633325099945,
0.09725312888622284,
-0.007792160380631685,
-0.00826571136713028,
-0.06392902135848999,
0.31719911098480225,
0.2990921139717102,
-0.05951068922877312,
0.05765257403254509,
0.014131797477602959,
0.03515201807022095,
0.14858707785606384,
0.15556403994560242,
0.08631066977977753,
0.22001273930072784,
-0.05610945448279381,
-0.0323924645781517,
-0.03244649991393089,
-0.00011717861343640834,
-0.10615155845880508,
0.08834095299243927,
0.08061715215444565,
-0.046249665319919586,
-0.038747143000364304,
0.11574290692806244,
-0.18907932937145233,
0.11950955539941788,
-0.00375749752856791,
-0.21530038118362427,
-0.08457891643047333,
-0.05569221079349518,
0.07448866218328476,
0.00903343502432108,
0.07589029520750046,
-0.009611186571419239,
-0.10277153551578522,
0.007392990868538618,
0.03433285653591156,
-0.21637725830078125,
-0.045405976474285126,
0.07789403200149536,
-0.05879265069961548,
0.000526217685546726,
-0.042464446276426315,
0.020487124100327492,
0.08371546864509583,
0.04306786507368088,
-0.010364369489252567,
0.047011617571115494,
0.006061826832592487,
-0.006725840270519257,
0.013983911834657192,
0.031955450773239136,
0.010460343211889267,
-0.0943627655506134,
0.0788504108786583,
-0.13313212990760803,
0.043683238327503204,
-0.08427242189645767,
-0.048453301191329956,
-0.01001046597957611,
0.07200850546360016,
-0.062395378947257996,
0.0614827498793602,
0.12546974420547485,
-0.002076550154015422,
-0.014398833736777306,
-0.0724807158112526,
-0.039408233016729355,
0.029062144458293915,
-0.1274152547121048,
-0.14411133527755737,
-0.064139224588871,
-0.09697576612234116,
0.06718652695417404,
-0.005783849395811558,
-0.1647292673587799,
-0.000674006761983037,
-0.12190773338079453,
0.04701998829841614,
-0.1751846969127655,
0.11614945530891418,
0.051899779587984085,
0.016969917342066765,
-0.0007698916597291827,
-0.028184883296489716,
0.050279732793569565,
0.07457059621810913,
-0.1374601572751999,
-0.06961626559495926
] |
null | null |
transformers
|
This repository contains TransportersBERT from the ActTrans publication.
Taju, Semmy Wellem, Syed Muazzam Ali Shah, and Yu-Yen Ou. “ActTRANS: Functional Classification in Active Transport Proteins Based on Transfer Learning and Contextual Representations.” Computational Biology and Chemistry 93 (August 1, 2021): 107537. https://doi.org/10.1016/j.compbiolchem.2021.107537.
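A minimal loading sketch (not from the original card) for extracting sequence embeddings with this checkpoint; the space-separated amino-acid input format is an assumption borrowed from common protein BERT models and may not match this tokenizer:
```python
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("ghazikhanihamed/TransportersBERT")
model = AutoModel.from_pretrained("ghazikhanihamed/TransportersBERT")

# Encode a (hypothetical) protein fragment; many protein BERT tokenizers expect
# residues separated by spaces.
sequence = "M K T A Y I A K Q R"
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the last hidden states into one fixed-size vector per sequence.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)
```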
|
{}
| null |
ghazikhanihamed/TransportersBERT
|
[
"transformers",
"pytorch",
"bert",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #endpoints_compatible #region-us
|
This repository belongs to TransportersBERT from ActTrans publication.
Taju, Semmy Wellem, Syed Muazzam Ali Shah, and Yu-Yen Ou. “ActTRANS: Functional Classification in Active Transport Proteins Based on Transfer Learning and Contextual Representations.” Computational Biology and Chemistry 93 (August 1, 2021): 107537. URL
|
[] |
[
"TAGS\n#transformers #pytorch #bert #endpoints_compatible #region-us \n"
] |
[
23
] |
[
"passage: TAGS\n#transformers #pytorch #bert #endpoints_compatible #region-us \n"
] |
[
-0.05582514405250549,
-0.0059938449412584305,
-0.0106970788910985,
-0.029046038165688515,
0.12276504933834076,
0.03318239748477936,
0.02372138574719429,
0.05814843624830246,
0.1283326894044876,
-0.008450702764093876,
0.13131918013095856,
0.20412099361419678,
-0.05518573895096779,
-0.0014659949811175466,
-0.060662250965833664,
-0.2503744065761566,
0.07448812574148178,
0.10944642126560211,
-0.05779325217008591,
0.09946653246879578,
0.033594902604818344,
-0.1090199276804924,
0.06034360080957413,
-0.031356219202280045,
-0.10178717225790024,
0.05382249876856804,
0.022958533838391304,
-0.07516350597143173,
0.12614554166793823,
0.017229093238711357,
0.18774759769439697,
0.02010849304497242,
-0.12349475175142288,
-0.16799280047416687,
0.028976095840334892,
0.0017078231321647763,
-0.06416994333267212,
0.029829638078808784,
0.067641980946064,
-0.0958416610956192,
0.011812526732683182,
0.04986337199807167,
0.005196088925004005,
0.029664115980267525,
-0.16331109404563904,
-0.1660199761390686,
-0.040016546845436096,
0.03566199541091919,
0.02647480182349682,
0.07168487459421158,
0.023379387333989143,
0.1733984798192978,
-0.15529543161392212,
0.08478239923715591,
0.18858951330184937,
-0.3070982098579407,
0.008051520213484764,
0.08039850741624832,
0.06561373919248581,
0.05102481693029404,
-0.0034548938274383545,
0.0508800707757473,
0.001212250324897468,
0.02218480594456196,
-0.026412250474095345,
-0.08552643656730652,
-0.013642089441418648,
0.08882433921098709,
-0.0965084433555603,
-0.09593788534402847,
0.201665997505188,
-0.035358089953660965,
0.051234010607004166,
0.048935793340206146,
-0.10587480664253235,
-0.04507492482662201,
-0.00264694239012897,
-0.006025178357958794,
-0.0201078150421381,
0.06390652805566788,
0.011368521489202976,
-0.024782819673419,
-0.11910133063793182,
0.039458055049180984,
-0.22834016382694244,
0.2708556354045868,
0.02697429433465004,
0.09296334534883499,
-0.22665227949619293,
0.06216507405042648,
-0.053294818848371506,
-0.07662333548069,
0.029203347861766815,
-0.09727942198514938,
0.0473918691277504,
0.005940338596701622,
-0.0753636509180069,
0.05180890113115311,
0.051905784755945206,
0.13826292753219604,
0.016379985958337784,
0.03267199918627739,
0.0194330383092165,
0.10806956142187119,
0.01739308051764965,
0.10608944296836853,
0.01434782799333334,
-0.0010544572724029422,
0.022262273356318474,
-0.16448475420475006,
-0.004203388001769781,
-0.04162668436765671,
-0.09218078851699829,
-0.0717039629817009,
0.031197229400277138,
0.09885258227586746,
0.010065644048154354,
0.0125453881919384,
-0.0785975456237793,
-0.002858554245904088,
0.0459388829767704,
-0.05210942402482033,
-0.005135030020028353,
0.013749269768595695,
0.03504934534430504,
0.21628758311271667,
-0.030476583167910576,
-0.027280736714601517,
-0.03371305391192436,
0.12695543467998505,
-0.07887829095125198,
-0.008509011007845402,
-0.04094306752085686,
-0.027221381664276123,
0.05422108992934227,
-0.14036667346954346,
0.07319435477256775,
-0.14634281396865845,
-0.0619744248688221,
0.04048692435026169,
0.04245685040950775,
0.017060942947864532,
0.031087180599570274,
0.014675622805953026,
-0.005929364822804928,
-0.019680023193359375,
-0.07101673632860184,
-0.06312412023544312,
-0.05892617255449295,
0.10866933315992355,
-0.0038128141313791275,
0.04870020970702171,
-0.11713949590921402,
0.06124676764011383,
-0.08538630604743958,
0.03670010715723038,
-0.13224002718925476,
-0.038841135799884796,
-0.027744892984628677,
0.1672174632549286,
0.006601573899388313,
-0.06774948537349701,
-0.1257120668888092,
0.047252584248781204,
-0.038980625569820404,
0.1608281284570694,
-0.026440229266881943,
-0.12333235889673233,
0.24205224215984344,
-0.09686779230833054,
-0.17037728428840637,
0.05819088593125343,
0.007297846022993326,
-0.0187822412699461,
0.08241696655750275,
0.17078529298305511,
0.03631723299622536,
-0.08738293498754501,
0.08095592260360718,
0.14191113412380219,
-0.13318699598312378,
-0.16365906596183777,
0.028528474271297455,
-0.0334528423845768,
-0.11907413601875305,
0.041265517473220825,
0.0034436413552612066,
0.08533528447151184,
-0.0921982154250145,
-0.004812197759747505,
-0.01293246354907751,
-0.015628160908818245,
0.0729161873459816,
0.058011963963508606,
0.09166673570871353,
-0.0677810087800026,
0.00991390272974968,
0.0398266464471817,
-0.011255721561610699,
0.03168392553925514,
0.058278635144233704,
-0.043341703712940216,
0.1279294639825821,
-0.06011424958705902,
0.0025730268098413944,
-0.2085462361574173,
-0.08715872466564178,
-0.011371874250471592,
0.09177418053150177,
-0.0393996424973011,
0.16790297627449036,
0.11040320247411728,
-0.0925755500793457,
0.00358761684037745,
-0.03143608197569847,
0.11512381583452225,
0.01712878979742527,
-0.030011799186468124,
-0.03772463649511337,
0.012738056480884552,
-0.07095976918935776,
-0.09710898995399475,
-0.02491631917655468,
-0.00627898471429944,
0.12167930603027344,
0.1264781653881073,
0.002507270546630025,
0.03385916352272034,
-0.04122108966112137,
0.059168342500925064,
-0.012312098406255245,
0.02199527993798256,
0.10435131192207336,
-0.017714541405439377,
-0.08321622759103775,
0.15026481449604034,
-0.08384737372398376,
0.36931121349334717,
0.2024332880973816,
-0.32852646708488464,
0.03981154039502144,
-0.02746942825615406,
-0.01390197966247797,
0.024579064920544624,
0.1122342124581337,
-0.02590859867632389,
0.0881802886724472,
0.04394598677754402,
0.11318054795265198,
-0.024956289678812027,
-0.04454986751079559,
0.0007368221413344145,
-0.047743842005729675,
-0.049511488527059555,
0.09542335569858551,
0.06248341128230095,
-0.13251787424087524,
0.1600245237350464,
0.28373822569847107,
0.04560614377260208,
0.09792512655258179,
-0.06133773550391197,
-0.022757407277822495,
0.039699189364910126,
0.015039279125630856,
-0.04576999694108963,
0.03373005613684654,
-0.26591286063194275,
-0.047633323818445206,
0.07407993823289871,
0.02055620402097702,
0.09061875939369202,
-0.1459200233221054,
-0.05993150174617767,
0.030102252960205078,
0.02484927326440811,
-0.07744612544775009,
0.09087593108415604,
0.04228215664625168,
0.07148098945617676,
0.012928525917232037,
-0.04512014612555504,
0.10244821012020111,
0.005629644729197025,
-0.04586281627416611,
0.16416428983211517,
-0.11823447793722153,
-0.2552604377269745,
-0.1193137839436531,
-0.15801620483398438,
0.02478799968957901,
0.012210648506879807,
0.074419766664505,
-0.09139055013656616,
-0.026302505284547806,
0.10130403935909271,
0.061081867665052414,
-0.15493257343769073,
0.044168904423713684,
-0.04580516368150711,
0.03345111384987831,
-0.08739827573299408,
-0.07113046199083328,
-0.07068533450365067,
-0.06981467455625534,
-0.04961463436484337,
0.10121359676122665,
-0.1246945932507515,
0.08166283369064331,
0.12132786959409714,
0.05094803124666214,
0.07646921277046204,
-0.0020916727371513844,
0.17505085468292236,
-0.06357396394014359,
-0.05962150916457176,
0.18625718355178833,
-0.039154816418886185,
0.10033397376537323,
0.09657420217990875,
0.044079359620809555,
-0.06951406598091125,
-0.033484332263469696,
-0.060933757573366165,
-0.10718005150556564,
-0.20763958990573883,
-0.10036817938089371,
-0.13023525476455688,
0.004751497879624367,
-0.000670114066451788,
0.042121771723032,
0.0700206607580185,
0.06631368398666382,
0.053532831370830536,
-0.0864759013056755,
-0.050034765154123306,
0.04923805594444275,
0.22149646282196045,
-0.035591885447502136,
0.07959269732236862,
-0.05890441685914993,
-0.08794771879911423,
0.08127860724925995,
0.060010988265275955,
0.1817447543144226,
0.10210268944501877,
0.022593215107917786,
0.06595935672521591,
0.16745993494987488,
0.15201883018016815,
0.1540367156267166,
-0.02193843014538288,
-0.03577892482280731,
-0.01526061724871397,
-0.00010030799603555351,
-0.06889137625694275,
0.0004913448356091976,
0.11326914280653,
-0.13479973375797272,
-0.0604836605489254,
-0.24327106773853302,
0.07084902375936508,
0.04919629916548729,
0.032679811120033264,
-0.15196309983730316,
-0.00040421399171464145,
0.07282399386167526,
-0.00041144643910229206,
-0.044284626841545105,
0.09084761887788773,
-0.0016829834785312414,
-0.11569281667470932,
0.050119396299123764,
-0.04132482036948204,
0.10694653540849686,
-0.02481699176132679,
0.08771265298128128,
-0.0358782634139061,
-0.12598921358585358,
0.0662776455283165,
0.06900722533464432,
-0.23785486817359924,
0.28764957189559937,
-0.015047562308609486,
-0.08228088170289993,
-0.048801977187395096,
-0.04197835922241211,
-0.003781283274292946,
0.1852884590625763,
0.1024385541677475,
0.04168209061026573,
-0.03920183330774307,
-0.15113374590873718,
0.043328672647476196,
0.033142901957035065,
0.1328010857105255,
-0.03353354334831238,
-0.04113762453198433,
0.0004659405385609716,
-0.012823380529880524,
-0.026963958516716957,
0.03337518498301506,
0.0776907280087471,
-0.1309017539024353,
0.04395826533436775,
-0.0019880991894751787,
0.028880812227725983,
-0.0074317543767392635,
-0.01322081871330738,
-0.06172190606594086,
0.11954687535762787,
-0.06683460623025894,
-0.05747222900390625,
-0.0848630890250206,
-0.1599932312965393,
0.1275681108236313,
-0.10324639081954956,
0.072115458548069,
-0.08794764429330826,
-0.06709247827529907,
-0.07358412444591522,
-0.1814906895160675,
0.12599299848079681,
-0.0998830795288086,
0.028712023049592972,
-0.036930546164512634,
0.2198343575000763,
-0.04788460582494736,
0.00006587969255633652,
-0.008528475649654865,
0.013039504177868366,
-0.11477264016866684,
-0.08884283155202866,
0.01379021629691124,
-0.015655362978577614,
0.05710950493812561,
0.041695158928632736,
-0.03288666531443596,
0.06880264729261398,
0.015344602055847645,
0.034568872302770615,
0.21396347880363464,
0.186134472489357,
-0.033745236694812775,
0.11985735595226288,
0.18027181923389435,
-0.033794477581977844,
-0.26661917567253113,
-0.0846354141831398,
-0.17172688245773315,
-0.047513436526060104,
-0.02810327708721161,
-0.14752227067947388,
0.142644464969635,
0.0381203256547451,
-0.020230794325470924,
0.12928421795368195,
-0.2486487627029419,
-0.049123749136924744,
0.1653672605752945,
0.0083856750279665,
0.5381866097450256,
-0.10660535097122192,
-0.09454790502786636,
0.02489558421075344,
-0.25975850224494934,
0.10427799820899963,
0.02520833909511566,
0.051780153065919876,
-0.021174505352973938,
0.09585968405008316,
0.036323096603155136,
-0.08304581791162491,
0.11906874924898148,
0.028275268152356148,
0.02357698790729046,
-0.06498955935239792,
-0.13870584964752197,
0.037250224500894547,
-0.013110331259667873,
-0.03493497520685196,
0.06826557219028473,
0.017235321924090385,
-0.14760255813598633,
-0.02404404617846012,
-0.13333261013031006,
0.05567457154393196,
0.036672841757535934,
-0.031218502670526505,
0.005919408518821001,
-0.03673669323325157,
-0.025360016152262688,
0.013563361018896103,
0.2656884491443634,
-0.02885071188211441,
0.13380834460258484,
0.004754696507006884,
0.06820327788591385,
-0.2138592153787613,
-0.11385384202003479,
-0.07559774816036224,
-0.052582528442144394,
0.08766578137874603,
-0.04031619429588318,
0.04227626696228981,
0.15855242311954498,
-0.016625750809907913,
-0.009202802553772926,
0.1196829304099083,
0.0072800288908183575,
-0.03377111256122589,
0.11844226717948914,
-0.22603391110897064,
-0.05008016526699066,
-0.02580120787024498,
-0.03263162076473236,
0.12460703402757645,
0.12217804044485092,
0.09506645798683167,
0.0754433125257492,
-0.022474301978945732,
-0.024322625249624252,
-0.03513093665242195,
-0.07823941111564636,
0.013863112777471542,
0.04773413762450218,
0.034937534481287,
-0.11898844689130783,
0.047177523374557495,
-0.02152738906443119,
-0.266872763633728,
-0.056712910532951355,
0.10026243329048157,
-0.13707385957241058,
-0.0934320017695427,
-0.07949313521385193,
0.07076770067214966,
-0.15040098130702972,
-0.04215400665998459,
-0.01284860447049141,
-0.11120935529470444,
0.07076677680015564,
0.250417560338974,
0.1062779352068901,
0.10727981477975845,
-0.032035890966653824,
-0.005367985460907221,
0.042143698781728745,
-0.07543303072452545,
-0.013991466723382473,
0.0121172945946455,
-0.08700574934482574,
-0.010479255579411983,
-0.013696597889065742,
0.15301570296287537,
-0.08327650278806686,
-0.07890510559082031,
-0.171699658036232,
0.08268199115991592,
-0.10711699724197388,
-0.10958558320999146,
-0.11338668316602707,
-0.06782516837120056,
0.011405235156416893,
-0.09893908351659775,
-0.03024802915751934,
-0.0328654907643795,
-0.13816463947296143,
0.07684630155563354,
0.03262607753276825,
0.005515735596418381,
-0.06403716653585434,
-0.035874143242836,
0.1498086303472519,
-0.05030526593327522,
0.09152921289205551,
0.18889226019382477,
-0.08515924215316772,
0.13183313608169556,
-0.10378775745630264,
-0.14850100874900818,
0.10374602675437927,
0.012674182653427124,
0.08053745329380035,
0.08146080374717712,
0.015140734612941742,
0.07422392815351486,
0.028560757637023926,
0.05162277817726135,
0.003286920255050063,
-0.1188889890909195,
0.016973691061139107,
0.04206009581685066,
-0.17900550365447998,
-0.02852419763803482,
-0.09733768552541733,
0.15730208158493042,
0.03292210027575493,
0.08486420661211014,
0.020920267328619957,
0.1054333969950676,
-0.023887382820248604,
0.009118972346186638,
0.011180376634001732,
-0.19744594395160675,
0.027193883433938026,
-0.05555891990661621,
0.006399804726243019,
0.0009134450228884816,
0.2835138142108917,
-0.07805345952510834,
0.031515587121248245,
0.03223814070224762,
0.053333036601543427,
-0.0018267212435603142,
0.019711701199412346,
0.23368167877197266,
0.1001359298825264,
-0.04224341735243797,
-0.09082697331905365,
0.08135782182216644,
-0.009332779794931412,
-0.035943757742643356,
0.1321525275707245,
0.12974345684051514,
0.06283511221408844,
0.0970291718840599,
-0.0061447578482329845,
0.06124432384967804,
-0.11149083077907562,
-0.2839953899383545,
0.005014513153582811,
0.04187164828181267,
0.007620683405548334,
0.13637039065361023,
0.11712085455656052,
-0.013480436988174915,
0.07515306025743484,
-0.01791444607079029,
-0.010756190866231918,
-0.1442539393901825,
-0.0785292312502861,
-0.036694128066301346,
-0.10411686450242996,
0.016604600474238396,
-0.05977771431207657,
-0.006930612958967686,
0.16422225534915924,
0.04799465835094452,
-0.02543029561638832,
0.11942467838525772,
0.060724616050720215,
-0.05485526844859123,
0.032888129353523254,
-0.004953940864652395,
0.02457866445183754,
0.03158317878842354,
-0.021798279136419296,
-0.1477023959159851,
-0.07525672018527985,
-0.06770733743906021,
0.031246397644281387,
-0.09311450272798538,
-0.024162618443369865,
-0.1108081042766571,
-0.10779142379760742,
-0.06457403302192688,
0.07176531851291656,
-0.06568951159715652,
0.1171649917960167,
-0.032683372497558594,
0.016234206035733223,
0.007168353535234928,
0.172343447804451,
-0.07198196649551392,
-0.024842115119099617,
0.017614079639315605,
0.2056950181722641,
0.049104735255241394,
0.1042415201663971,
0.0006923355394974351,
0.03809111937880516,
-0.08218837529420853,
0.3366861045360565,
0.24629603326320648,
-0.03666435182094574,
0.042452335357666016,
0.064607635140419,
0.050530221313238144,
0.11894407868385315,
0.10735389590263367,
0.10173739492893219,
0.3289482593536377,
-0.09050356596708298,
-0.03519318625330925,
-0.025250563398003578,
0.009432761929929256,
-0.09690305590629578,
0.0489007793366909,
0.039478812366724014,
-0.06709855049848557,
-0.08793099969625473,
0.10046962648630142,
-0.17121359705924988,
0.10175119340419769,
0.06997337937355042,
-0.2198277860879898,
-0.03783851116895676,
-0.060111887753009796,
0.181540384888649,
-0.012015782296657562,
0.129024475812912,
-0.03838730603456497,
-0.1459164023399353,
0.06123366579413414,
0.050386372953653336,
-0.27638113498687744,
-0.08839067071676254,
0.1257728487253189,
0.042554836720228195,
-0.00688927061855793,
-0.016768725588917732,
0.013223016634583473,
0.060327593237161636,
0.07101310789585114,
-0.025480445474386215,
-0.0019402769394218922,
0.04185595363378525,
-0.10931044816970825,
-0.12752482295036316,
-0.0229122806340456,
0.011525065638124943,
-0.09778720140457153,
0.027700234204530716,
-0.1871083378791809,
0.03940398618578911,
0.00809240061789751,
-0.03241167217493057,
-0.0014000540832057595,
0.0010574592743068933,
-0.057058386504650116,
0.013885253109037876,
0.04856681451201439,
0.014625655487179756,
-0.04175407066941261,
-0.04523738846182823,
-0.017677124589681625,
0.06060760095715523,
-0.08221389353275299,
-0.15152835845947266,
-0.019660353660583496,
-0.08916762471199036,
0.11470503360033035,
-0.03491480275988579,
-0.06863245368003845,
-0.022439386695623398,
-0.02936902828514576,
0.07283741235733032,
-0.12221226841211319,
0.023255428299307823,
0.016710588708519936,
0.04700179398059845,
0.0185729768127203,
-0.03645060583949089,
0.05477501451969147,
0.05671923607587814,
-0.11537165194749832,
-0.07711852341890335
] |
null | null |
transformers
|
# Connor
|
{"tags": ["conversational"]}
|
text-generation
|
ghhostboy/DialoGPT-medium-connorDBH3-1
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Connor
|
[
"# Connor"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Connor"
] |
[
51,
3
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Connor"
] |
[
-0.02925184741616249,
0.05394621565937996,
-0.007346093188971281,
0.018662704154849052,
0.1354130506515503,
0.0011122088180854917,
0.0965338721871376,
0.13358888030052185,
0.0066132009960711,
-0.030047448351979256,
0.1957976222038269,
0.18953122198581696,
-0.028730075806379318,
-0.00020101357949897647,
-0.07705681025981903,
-0.24681657552719116,
0.052927784621715546,
0.04203924909234047,
-0.0069792806170880795,
0.1323334276676178,
0.06496594846248627,
-0.10170730948448181,
0.07417894899845123,
0.014766890555620193,
-0.18092578649520874,
0.004835694562643766,
0.03368838131427765,
-0.10148485749959946,
0.11837714910507202,
0.05181579291820526,
0.07185623049736023,
0.03968442231416702,
-0.0590103417634964,
-0.1718032956123352,
0.0378260463476181,
0.003725616494193673,
-0.045444898307323456,
0.04131190851330757,
0.055930279195308685,
-0.07218889892101288,
0.06813380122184753,
0.0985194742679596,
0.012346921488642693,
0.05202097073197365,
-0.13377007842063904,
-0.05445164814591408,
-0.015775445848703384,
0.00477970065549016,
0.051107440143823624,
0.10512708872556686,
-0.03996889665722847,
0.058458659797906876,
-0.10048452019691467,
0.09470035135746002,
0.0878966823220253,
-0.3642279803752899,
-0.014858249574899673,
0.060671690851449966,
0.053683433681726456,
0.028430063277482986,
-0.04060983285307884,
0.07492391765117645,
-0.011408698745071888,
0.0041669453494250774,
0.01961233839392662,
-0.06842498481273651,
-0.09562181681394577,
0.05102641507983208,
-0.06840752065181732,
-0.07275954633951187,
0.23701967298984528,
-0.0411129966378212,
0.08811333775520325,
-0.057200636714696884,
-0.08560889214277267,
0.006761489436030388,
-0.028157033026218414,
-0.021867457777261734,
-0.07773901522159576,
0.08344189077615738,
-0.05920545756816864,
-0.1176028922200203,
-0.1734667420387268,
-0.007984761148691177,
-0.1778126358985901,
0.13832898437976837,
0.0320969820022583,
0.05517163127660751,
-0.19260957837104797,
0.07919147610664368,
0.016488784924149513,
-0.0861700028181076,
0.0037327716127038,
-0.0633641928434372,
0.03372429683804512,
0.01996453106403351,
-0.06334789097309113,
-0.10411565005779266,
0.1054673120379448,
0.09487809985876083,
0.01767733506858349,
0.0021578283049166203,
-0.06312471628189087,
0.07998111844062805,
0.04181459918618202,
0.15595614910125732,
0.0031191317830234766,
-0.03598559647798538,
0.04100242629647255,
-0.1252720057964325,
0.00036424913560040295,
-0.061184242367744446,
-0.19767357409000397,
-0.011346372775733471,
0.09601219743490219,
0.06185605749487877,
0.049581047147512436,
0.11046309024095535,
-0.010031310841441154,
-0.049231257289648056,
0.010036534629762173,
-0.025234196335077286,
-0.006252532359212637,
0.02294621616601944,
0.004628372378647327,
0.14127634465694427,
-0.011905285529792309,
0.062359776347875595,
-0.140914186835289,
0.09742679446935654,
-0.051546014845371246,
-0.01248314417898655,
-0.013144419528543949,
-0.04174954816699028,
0.003785834414884448,
0.023977981880307198,
-0.0001645632873987779,
-0.10523349791765213,
-0.17493905127048492,
0.005424353759735823,
0.005040725693106651,
-0.0273689404129982,
-0.06790044158697128,
-0.09073055535554886,
-0.010336840525269508,
0.036975227296352386,
-0.03890056908130646,
-0.06945078074932098,
-0.045190975069999695,
0.11755256354808807,
-0.042167194187641144,
0.042104799300432205,
-0.11105046421289444,
0.07916320115327835,
-0.11077804863452911,
-0.04744410142302513,
-0.10197347402572632,
0.0980762392282486,
0.00718857254832983,
0.13121473789215088,
-0.01728755421936512,
-0.017116080969572067,
-0.09815175086259842,
0.05661775916814804,
-0.0483304001390934,
0.18905037641525269,
-0.08971841633319855,
-0.11480304598808289,
0.23990611732006073,
-0.030095553025603294,
-0.13967178761959076,
0.14997372031211853,
0.0016244726721197367,
0.01759430021047592,
0.10524740070104599,
0.22605174779891968,
-0.014883499592542648,
-0.010476954281330109,
0.11382004618644714,
0.07884949445724487,
-0.08619476109743118,
-0.036369845271110535,
0.02140865847468376,
-0.061507076025009155,
-0.010726026259362698,
0.05706731230020523,
0.11435557901859283,
0.009444364346563816,
-0.03720390051603317,
-0.04272313043475151,
-0.03357268124818802,
0.0012569663813337684,
0.052476849406957626,
0.01237579993903637,
0.1141212210059166,
-0.05483352020382881,
-0.06145123019814491,
-0.037660107016563416,
-0.027675192803144455,
-0.06503517925739288,
0.04087546095252037,
-0.06467100977897644,
0.14506307244300842,
-0.0013940678909420967,
0.06594178825616837,
-0.12138308584690094,
-0.09115113317966461,
-0.02046472206711769,
0.24025379121303558,
0.028638137504458427,
0.06147945299744606,
0.05269705876708031,
-0.041196905076503754,
0.012030383571982384,
0.01953684724867344,
0.15377284586429596,
-0.023100776597857475,
-0.06244855746626854,
-0.06345859169960022,
0.10359788686037064,
-0.04177824780344963,
0.07620002329349518,
-0.12415637820959091,
0.006968116853386164,
0.11460010707378387,
0.06582167744636536,
0.009664244949817657,
0.01680014841258526,
-0.02746749483048916,
-0.021077269688248634,
-0.08522540330886841,
0.00495782820507884,
0.10600091516971588,
-0.007186750415712595,
-0.060660794377326965,
0.18891876935958862,
-0.12747310101985931,
0.21995873749256134,
0.19535614550113678,
-0.32847464084625244,
0.03309515863656998,
-0.12449122220277786,
-0.05018541216850281,
0.019545070827007294,
0.051728181540966034,
-0.050629425793886185,
0.15073467791080475,
0.02067919820547104,
0.18692433834075928,
-0.043979134410619736,
-0.04404625669121742,
-0.02415771782398224,
-0.003789240960031748,
-0.006700950674712658,
0.07037521153688431,
0.09694147109985352,
-0.12172182649374008,
0.1580640971660614,
0.20468899607658386,
0.0679987370967865,
0.16063782572746277,
0.01437209639698267,
-0.0073439921252429485,
0.06535328179597855,
-0.0007422196213155985,
-0.04035529866814613,
-0.053615786135196686,
-0.2567192018032074,
-0.03348274156451225,
0.06302867829799652,
0.0036448733881115913,
0.09699583053588867,
-0.1298881620168686,
-0.036478329449892044,
0.008474661037325859,
-0.017181789502501488,
0.01482146605849266,
0.06206975877285004,
0.05965055525302887,
0.10083626210689545,
0.006409200839698315,
-0.018204452469944954,
0.09128560870885849,
0.015156551264226437,
-0.07827339321374893,
0.20302309095859528,
-0.1133514940738678,
-0.3264577090740204,
-0.11229793727397919,
-0.10049223899841309,
-0.05149480327963829,
0.044781286269426346,
0.11583702266216278,
-0.07116156816482544,
-0.013666602782905102,
-0.00012099794548703358,
0.08007866889238358,
-0.1435246765613556,
-0.0060938140377402306,
-0.07178319990634918,
0.012652627192437649,
-0.1434086710214615,
-0.08146951347589493,
-0.058304451406002045,
-0.035322412848472595,
-0.11820250749588013,
0.13414062559604645,
-0.11009538918733597,
0.055668674409389496,
0.2082853615283966,
0.026049841195344925,
0.07032635062932968,
-0.04794178158044815,
0.18897143006324768,
-0.10888353735208511,
-0.005476977676153183,
0.11953116953372955,
-0.07212688773870468,
0.07971642166376114,
0.09387548267841339,
0.01311377715319395,
-0.08847784996032715,
0.009223191998898983,
-0.024561917409300804,
-0.06821213662624359,
-0.2246413677930832,
-0.12730298936367035,
-0.11128836870193481,
0.10052230209112167,
0.049173880368471146,
0.053642403334379196,
0.1727563887834549,
0.0530095212161541,
-0.030952993780374527,
-0.0047622015699744225,
0.0319613479077816,
0.08462076634168625,
0.3333025276660919,
-0.05017906054854393,
0.14549614489078522,
-0.00036478505353443325,
-0.12129691243171692,
0.07011137157678604,
0.06197326257824898,
0.12701977789402008,
0.07242222130298615,
0.1320236176252365,
0.010376046411693096,
0.11551918834447861,
0.09322155267000198,
0.04498298466205597,
-0.0011454194318503141,
-0.029153570532798767,
-0.02955644205212593,
-0.042252399027347565,
-0.01946003921329975,
0.02731548249721527,
0.08646757155656815,
-0.17463566362857819,
-0.012395774945616722,
-0.07677937299013138,
0.08058426529169083,
0.17890849709510803,
0.04724331572651863,
-0.14464187622070312,
-0.02763214148581028,
0.06071821600198746,
-0.03439399600028992,
-0.09548717737197876,
0.10307709872722626,
0.010424081236124039,
-0.11046235263347626,
-0.000881882559042424,
-0.03702011704444885,
0.11439632624387741,
-0.12734322249889374,
0.1037118211388588,
-0.09629391133785248,
-0.09961969405412674,
0.023403676226735115,
0.1161087155342102,
-0.2543545663356781,
0.2005613148212433,
0.014315655454993248,
-0.05915552005171776,
-0.12335977703332901,
-0.02583943121135235,
-0.022067811340093613,
0.10082397609949112,
0.10635748505592346,
-0.015414229594171047,
-0.01193956471979618,
-0.03909774497151375,
-0.03216499835252762,
0.03125924617052078,
0.1270037293434143,
-0.049203384667634964,
-0.011188369244337082,
-0.07417355477809906,
-0.0016169337322935462,
-0.005096765235066414,
0.0032697771675884724,
-0.005393367260694504,
-0.22330342233181,
0.0886646956205368,
0.04297109693288803,
0.05749543756246567,
0.030012011528015137,
-0.011055294424295425,
-0.04832468181848526,
0.2742316424846649,
-0.09723491966724396,
-0.06184287369251251,
-0.09418249130249023,
-0.05288313701748848,
0.01004297100007534,
-0.05761445686221123,
0.026147663593292236,
-0.08054821938276291,
0.044170305132865906,
-0.07633932679891586,
-0.16519498825073242,
0.13780638575553894,
-0.07908501476049423,
-0.03868110850453377,
-0.04351727291941643,
0.21240517497062683,
-0.0666743591427803,
-0.00240578455850482,
0.06577730178833008,
0.040171876549720764,
-0.1326344758272171,
-0.08853650838136673,
0.027201393619179726,
-0.010402924381196499,
0.004504109267145395,
-0.01046372763812542,
-0.025708355009555817,
-0.012467986904084682,
-0.03303941711783409,
-0.03683174401521683,
0.3667992651462555,
0.1363682895898819,
-0.06361039727926254,
0.14458955824375153,
0.12633748352527618,
-0.061410870403051376,
-0.3078187108039856,
-0.06858920305967331,
-0.1277894824743271,
-0.07910357415676117,
-0.07385775446891785,
-0.17497405409812927,
0.10165239125490189,
0.03927032649517059,
-0.020674241706728935,
0.15778371691703796,
-0.2751116156578064,
-0.1221558079123497,
0.16328765451908112,
0.0009125191136263311,
0.4068010151386261,
-0.14288648962974548,
-0.10437782108783722,
-0.038204457610845566,
-0.12281082570552826,
0.21138393878936768,
-0.0294773131608963,
0.11215157806873322,
-0.016649100929498672,
0.149748757481575,
0.05510520562529564,
-0.03112068958580494,
0.039921391755342484,
-0.00009811078780330718,
-0.05917102470993996,
-0.10312094539403915,
-0.08654168248176575,
0.04907678812742233,
0.02403629571199417,
-0.023335743695497513,
-0.05818193405866623,
0.029476376250386238,
-0.14560848474502563,
-0.04242371767759323,
-0.08341463655233383,
0.03625420853495598,
0.008478580974042416,
-0.06227307766675949,
-0.023048527538776398,
-0.07072912156581879,
-0.012144756503403187,
0.02379419468343258,
0.2039346843957901,
-0.07569921761751175,
0.1190752312541008,
0.15772877633571625,
0.0968879833817482,
-0.15422430634498596,
0.026693927124142647,
-0.07108532637357712,
-0.060782063752412796,
0.07174588739871979,
-0.1244666650891304,
0.05647420138120651,
0.1431879699230194,
-0.019048668444156647,
0.062306709587574005,
0.093423992395401,
0.005013942718505859,
-0.0010188198648393154,
0.09075768291950226,
-0.29018914699554443,
-0.1541135460138321,
-0.07211020588874817,
-0.015148764476180077,
0.08962874859571457,
0.0719262957572937,
0.18547621369361877,
0.01603105664253235,
-0.03809863701462746,
0.021393537521362305,
-0.006730877794325352,
0.0089988699182868,
0.07432221621274948,
-0.01425810344517231,
0.004281471483409405,
-0.12848517298698425,
0.05764727666974068,
0.03379267081618309,
-0.11735671013593674,
0.016241099685430527,
0.1614680141210556,
-0.09257188439369202,
-0.12941859662532806,
-0.09788809716701508,
0.0918416976928711,
-0.14377926290035248,
0.00361635722219944,
-0.03291473537683487,
-0.11598578840494156,
0.08148009330034256,
0.15115420520305634,
0.042730964720249176,
0.07979434728622437,
-0.059073556214571,
-0.020935699343681335,
-0.011706070974469185,
0.03482424095273018,
-0.015706820413470268,
-0.027747195214033127,
-0.029748758301138878,
0.032130006700754166,
-0.04720143601298332,
0.11525800079107285,
-0.07434550672769547,
-0.12564405798912048,
-0.1393452286720276,
0.025784002617001534,
-0.12588649988174438,
-0.07320915907621384,
-0.052705831825733185,
-0.03848135843873024,
0.014455427415668964,
-0.047794610261917114,
-0.03060125559568405,
-0.05646202340722084,
-0.11300956457853317,
0.02727883867919445,
-0.02042435109615326,
0.010491162538528442,
-0.12235856801271439,
0.009645631536841393,
0.07233042269945145,
-0.01381764467805624,
0.15597043931484222,
0.11984588950872421,
-0.10167893767356873,
0.09639330208301544,
-0.16445839405059814,
-0.09953459352254868,
0.07818103581666946,
0.01575256884098053,
0.06717244535684586,
0.1466037929058075,
-0.011582564562559128,
0.07672522217035294,
0.05115162953734398,
0.05542599782347679,
0.07179354131221771,
-0.11460574716329575,
0.02389528788626194,
-0.011403894051909447,
-0.17975419759750366,
-0.03826102986931801,
-0.04418688267469406,
0.013445000164210796,
0.0025697837118059397,
0.15301178395748138,
-0.05622294545173645,
0.0963088721036911,
-0.06635666638612747,
0.04486967623233795,
0.0411275252699852,
-0.18070755898952484,
-0.03604935482144356,
-0.07195618748664856,
0.02261582389473915,
0.0025793369859457016,
0.19810304045677185,
-0.01746758073568344,
0.08771996945142746,
0.05208727717399597,
0.0901220440864563,
-0.02807123027741909,
-0.0016879624454304576,
0.1545127034187317,
0.11606251448392868,
-0.07729587703943253,
-0.13520658016204834,
0.0698671042919159,
0.03262162208557129,
0.06130353733897209,
0.13942363858222961,
-0.011711389757692814,
0.004401880316436291,
0.07048087567090988,
0.017136942595243454,
0.0806276798248291,
-0.12662982940673828,
-0.08344478905200958,
-0.024274703115224838,
0.08810848742723465,
-0.036155663430690765,
0.14447735249996185,
0.18316078186035156,
-0.024356747046113014,
0.011877927929162979,
-0.04304046556353569,
-0.05369251221418381,
-0.16152824461460114,
-0.19328774511814117,
-0.09191878139972687,
-0.13428109884262085,
0.02256055921316147,
-0.12149618566036224,
0.07231094688177109,
-0.002630371367558837,
0.0971016213297844,
-0.08431866019964218,
0.09223153442144394,
0.12150278687477112,
-0.0661185011267662,
0.08507523685693741,
-0.03560033440589905,
0.06632541865110397,
-0.021175172179937363,
-0.009392407722771168,
-0.1039053425192833,
0.023766355589032173,
0.021474409848451614,
0.06330981105566025,
-0.05498984456062317,
-0.014108160510659218,
-0.15965639054775238,
-0.09022944420576096,
-0.05521683022379875,
0.07664664089679718,
-0.030568020418286324,
0.07676540315151215,
0.010228199884295464,
-0.017517900094389915,
0.048373181372880936,
0.2539057433605194,
-0.06987801939249039,
-0.07953987270593643,
-0.004763890989124775,
0.14693714678287506,
0.036458730697631836,
0.10921000689268112,
-0.016933122649788857,
0.00758946780115366,
-0.0609518401324749,
0.39958909153938293,
0.28481969237327576,
-0.09782253205776215,
0.01356764230877161,
0.019335096701979637,
0.03878680244088173,
0.11455408483743668,
0.07693379372358322,
0.09613210707902908,
0.28509002923965454,
-0.05150345340371132,
-0.05799813196063042,
-0.012410702183842659,
-0.014153141528367996,
-0.11986690759658813,
0.08600786328315735,
0.06672467291355133,
-0.048113152384757996,
-0.04451439529657364,
0.09715087711811066,
-0.2242901623249054,
0.07662616670131683,
-0.12378726899623871,
-0.20510637760162354,
-0.06453341245651245,
-0.01306221168488264,
0.14139007031917572,
-0.0030332899186760187,
0.10171270370483398,
-0.0035608005709946156,
-0.08835478127002716,
-0.01295030489563942,
0.01667134091258049,
-0.2220873236656189,
0.02820418030023575,
0.07438264042139053,
-0.11552060395479202,
0.043498605489730835,
-0.029434632509946823,
0.03576662391424179,
0.07828012108802795,
0.07148585468530655,
-0.01457840483635664,
0.0008470552274957299,
0.0072289337404072285,
-0.040996063500642776,
0.010761539451777935,
0.05810254439711571,
0.010656234808266163,
-0.055849794298410416,
0.10991960018873215,
-0.1354595273733139,
0.07819081097841263,
-0.030561506748199463,
-0.021595491096377373,
0.006426818203181028,
0.025492971763014793,
-0.04118790850043297,
0.03240106999874115,
0.07756144553422928,
0.0008272333652712405,
-0.0011678365990519524,
-0.05477802827954292,
-0.04978339746594429,
-0.03672594949603081,
-0.03046082705259323,
-0.1137775257229805,
-0.11467036604881287,
-0.09677951782941818,
0.10845700651407242,
-0.01653127931058407,
-0.19124320149421692,
0.0020816053729504347,
-0.06622135639190674,
0.08284913748502731,
-0.15111427009105682,
0.12025394290685654,
0.12695342302322388,
0.0027502349112182856,
-0.017331790179014206,
-0.01502646692097187,
0.03998102247714996,
0.07273806631565094,
-0.10284501314163208,
-0.05015341192483902
] |
null | null |
transformers
|
# Connor
|
{"tags": ["conversational"]}
|
text-generation
|
ghhostboy/DialoGPT-medium-connorDBH3-21
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Connor
|
[
"# Connor"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Connor"
] |
[
51,
3
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Connor"
] |
[
-0.02925184741616249,
0.05394621565937996,
-0.007346093188971281,
0.018662704154849052,
0.1354130506515503,
0.0011122088180854917,
0.0965338721871376,
0.13358888030052185,
0.0066132009960711,
-0.030047448351979256,
0.1957976222038269,
0.18953122198581696,
-0.028730075806379318,
-0.00020101357949897647,
-0.07705681025981903,
-0.24681657552719116,
0.052927784621715546,
0.04203924909234047,
-0.0069792806170880795,
0.1323334276676178,
0.06496594846248627,
-0.10170730948448181,
0.07417894899845123,
0.014766890555620193,
-0.18092578649520874,
0.004835694562643766,
0.03368838131427765,
-0.10148485749959946,
0.11837714910507202,
0.05181579291820526,
0.07185623049736023,
0.03968442231416702,
-0.0590103417634964,
-0.1718032956123352,
0.0378260463476181,
0.003725616494193673,
-0.045444898307323456,
0.04131190851330757,
0.055930279195308685,
-0.07218889892101288,
0.06813380122184753,
0.0985194742679596,
0.012346921488642693,
0.05202097073197365,
-0.13377007842063904,
-0.05445164814591408,
-0.015775445848703384,
0.00477970065549016,
0.051107440143823624,
0.10512708872556686,
-0.03996889665722847,
0.058458659797906876,
-0.10048452019691467,
0.09470035135746002,
0.0878966823220253,
-0.3642279803752899,
-0.014858249574899673,
0.060671690851449966,
0.053683433681726456,
0.028430063277482986,
-0.04060983285307884,
0.07492391765117645,
-0.011408698745071888,
0.0041669453494250774,
0.01961233839392662,
-0.06842498481273651,
-0.09562181681394577,
0.05102641507983208,
-0.06840752065181732,
-0.07275954633951187,
0.23701967298984528,
-0.0411129966378212,
0.08811333775520325,
-0.057200636714696884,
-0.08560889214277267,
0.006761489436030388,
-0.028157033026218414,
-0.021867457777261734,
-0.07773901522159576,
0.08344189077615738,
-0.05920545756816864,
-0.1176028922200203,
-0.1734667420387268,
-0.007984761148691177,
-0.1778126358985901,
0.13832898437976837,
0.0320969820022583,
0.05517163127660751,
-0.19260957837104797,
0.07919147610664368,
0.016488784924149513,
-0.0861700028181076,
0.0037327716127038,
-0.0633641928434372,
0.03372429683804512,
0.01996453106403351,
-0.06334789097309113,
-0.10411565005779266,
0.1054673120379448,
0.09487809985876083,
0.01767733506858349,
0.0021578283049166203,
-0.06312471628189087,
0.07998111844062805,
0.04181459918618202,
0.15595614910125732,
0.0031191317830234766,
-0.03598559647798538,
0.04100242629647255,
-0.1252720057964325,
0.00036424913560040295,
-0.061184242367744446,
-0.19767357409000397,
-0.011346372775733471,
0.09601219743490219,
0.06185605749487877,
0.049581047147512436,
0.11046309024095535,
-0.010031310841441154,
-0.049231257289648056,
0.010036534629762173,
-0.025234196335077286,
-0.006252532359212637,
0.02294621616601944,
0.004628372378647327,
0.14127634465694427,
-0.011905285529792309,
0.062359776347875595,
-0.140914186835289,
0.09742679446935654,
-0.051546014845371246,
-0.01248314417898655,
-0.013144419528543949,
-0.04174954816699028,
0.003785834414884448,
0.023977981880307198,
-0.0001645632873987779,
-0.10523349791765213,
-0.17493905127048492,
0.005424353759735823,
0.005040725693106651,
-0.0273689404129982,
-0.06790044158697128,
-0.09073055535554886,
-0.010336840525269508,
0.036975227296352386,
-0.03890056908130646,
-0.06945078074932098,
-0.045190975069999695,
0.11755256354808807,
-0.042167194187641144,
0.042104799300432205,
-0.11105046421289444,
0.07916320115327835,
-0.11077804863452911,
-0.04744410142302513,
-0.10197347402572632,
0.0980762392282486,
0.00718857254832983,
0.13121473789215088,
-0.01728755421936512,
-0.017116080969572067,
-0.09815175086259842,
0.05661775916814804,
-0.0483304001390934,
0.18905037641525269,
-0.08971841633319855,
-0.11480304598808289,
0.23990611732006073,
-0.030095553025603294,
-0.13967178761959076,
0.14997372031211853,
0.0016244726721197367,
0.01759430021047592,
0.10524740070104599,
0.22605174779891968,
-0.014883499592542648,
-0.010476954281330109,
0.11382004618644714,
0.07884949445724487,
-0.08619476109743118,
-0.036369845271110535,
0.02140865847468376,
-0.061507076025009155,
-0.010726026259362698,
0.05706731230020523,
0.11435557901859283,
0.009444364346563816,
-0.03720390051603317,
-0.04272313043475151,
-0.03357268124818802,
0.0012569663813337684,
0.052476849406957626,
0.01237579993903637,
0.1141212210059166,
-0.05483352020382881,
-0.06145123019814491,
-0.037660107016563416,
-0.027675192803144455,
-0.06503517925739288,
0.04087546095252037,
-0.06467100977897644,
0.14506307244300842,
-0.0013940678909420967,
0.06594178825616837,
-0.12138308584690094,
-0.09115113317966461,
-0.02046472206711769,
0.24025379121303558,
0.028638137504458427,
0.06147945299744606,
0.05269705876708031,
-0.041196905076503754,
0.012030383571982384,
0.01953684724867344,
0.15377284586429596,
-0.023100776597857475,
-0.06244855746626854,
-0.06345859169960022,
0.10359788686037064,
-0.04177824780344963,
0.07620002329349518,
-0.12415637820959091,
0.006968116853386164,
0.11460010707378387,
0.06582167744636536,
0.009664244949817657,
0.01680014841258526,
-0.02746749483048916,
-0.021077269688248634,
-0.08522540330886841,
0.00495782820507884,
0.10600091516971588,
-0.007186750415712595,
-0.060660794377326965,
0.18891876935958862,
-0.12747310101985931,
0.21995873749256134,
0.19535614550113678,
-0.32847464084625244,
0.03309515863656998,
-0.12449122220277786,
-0.05018541216850281,
0.019545070827007294,
0.051728181540966034,
-0.050629425793886185,
0.15073467791080475,
0.02067919820547104,
0.18692433834075928,
-0.043979134410619736,
-0.04404625669121742,
-0.02415771782398224,
-0.003789240960031748,
-0.006700950674712658,
0.07037521153688431,
0.09694147109985352,
-0.12172182649374008,
0.1580640971660614,
0.20468899607658386,
0.0679987370967865,
0.16063782572746277,
0.01437209639698267,
-0.0073439921252429485,
0.06535328179597855,
-0.0007422196213155985,
-0.04035529866814613,
-0.053615786135196686,
-0.2567192018032074,
-0.03348274156451225,
0.06302867829799652,
0.0036448733881115913,
0.09699583053588867,
-0.1298881620168686,
-0.036478329449892044,
0.008474661037325859,
-0.017181789502501488,
0.01482146605849266,
0.06206975877285004,
0.05965055525302887,
0.10083626210689545,
0.006409200839698315,
-0.018204452469944954,
0.09128560870885849,
0.015156551264226437,
-0.07827339321374893,
0.20302309095859528,
-0.1133514940738678,
-0.3264577090740204,
-0.11229793727397919,
-0.10049223899841309,
-0.05149480327963829,
0.044781286269426346,
0.11583702266216278,
-0.07116156816482544,
-0.013666602782905102,
-0.00012099794548703358,
0.08007866889238358,
-0.1435246765613556,
-0.0060938140377402306,
-0.07178319990634918,
0.012652627192437649,
-0.1434086710214615,
-0.08146951347589493,
-0.058304451406002045,
-0.035322412848472595,
-0.11820250749588013,
0.13414062559604645,
-0.11009538918733597,
0.055668674409389496,
0.2082853615283966,
0.026049841195344925,
0.07032635062932968,
-0.04794178158044815,
0.18897143006324768,
-0.10888353735208511,
-0.005476977676153183,
0.11953116953372955,
-0.07212688773870468,
0.07971642166376114,
0.09387548267841339,
0.01311377715319395,
-0.08847784996032715,
0.009223191998898983,
-0.024561917409300804,
-0.06821213662624359,
-0.2246413677930832,
-0.12730298936367035,
-0.11128836870193481,
0.10052230209112167,
0.049173880368471146,
0.053642403334379196,
0.1727563887834549,
0.0530095212161541,
-0.030952993780374527,
-0.0047622015699744225,
0.0319613479077816,
0.08462076634168625,
0.3333025276660919,
-0.05017906054854393,
0.14549614489078522,
-0.00036478505353443325,
-0.12129691243171692,
0.07011137157678604,
0.06197326257824898,
0.12701977789402008,
0.07242222130298615,
0.1320236176252365,
0.010376046411693096,
0.11551918834447861,
0.09322155267000198,
0.04498298466205597,
-0.0011454194318503141,
-0.029153570532798767,
-0.02955644205212593,
-0.042252399027347565,
-0.01946003921329975,
0.02731548249721527,
0.08646757155656815,
-0.17463566362857819,
-0.012395774945616722,
-0.07677937299013138,
0.08058426529169083,
0.17890849709510803,
0.04724331572651863,
-0.14464187622070312,
-0.02763214148581028,
0.06071821600198746,
-0.03439399600028992,
-0.09548717737197876,
0.10307709872722626,
0.010424081236124039,
-0.11046235263347626,
-0.000881882559042424,
-0.03702011704444885,
0.11439632624387741,
-0.12734322249889374,
0.1037118211388588,
-0.09629391133785248,
-0.09961969405412674,
0.023403676226735115,
0.1161087155342102,
-0.2543545663356781,
0.2005613148212433,
0.014315655454993248,
-0.05915552005171776,
-0.12335977703332901,
-0.02583943121135235,
-0.022067811340093613,
0.10082397609949112,
0.10635748505592346,
-0.015414229594171047,
-0.01193956471979618,
-0.03909774497151375,
-0.03216499835252762,
0.03125924617052078,
0.1270037293434143,
-0.049203384667634964,
-0.011188369244337082,
-0.07417355477809906,
-0.0016169337322935462,
-0.005096765235066414,
0.0032697771675884724,
-0.005393367260694504,
-0.22330342233181,
0.0886646956205368,
0.04297109693288803,
0.05749543756246567,
0.030012011528015137,
-0.011055294424295425,
-0.04832468181848526,
0.2742316424846649,
-0.09723491966724396,
-0.06184287369251251,
-0.09418249130249023,
-0.05288313701748848,
0.01004297100007534,
-0.05761445686221123,
0.026147663593292236,
-0.08054821938276291,
0.044170305132865906,
-0.07633932679891586,
-0.16519498825073242,
0.13780638575553894,
-0.07908501476049423,
-0.03868110850453377,
-0.04351727291941643,
0.21240517497062683,
-0.0666743591427803,
-0.00240578455850482,
0.06577730178833008,
0.040171876549720764,
-0.1326344758272171,
-0.08853650838136673,
0.027201393619179726,
-0.010402924381196499,
0.004504109267145395,
-0.01046372763812542,
-0.025708355009555817,
-0.012467986904084682,
-0.03303941711783409,
-0.03683174401521683,
0.3667992651462555,
0.1363682895898819,
-0.06361039727926254,
0.14458955824375153,
0.12633748352527618,
-0.061410870403051376,
-0.3078187108039856,
-0.06858920305967331,
-0.1277894824743271,
-0.07910357415676117,
-0.07385775446891785,
-0.17497405409812927,
0.10165239125490189,
0.03927032649517059,
-0.020674241706728935,
0.15778371691703796,
-0.2751116156578064,
-0.1221558079123497,
0.16328765451908112,
0.0009125191136263311,
0.4068010151386261,
-0.14288648962974548,
-0.10437782108783722,
-0.038204457610845566,
-0.12281082570552826,
0.21138393878936768,
-0.0294773131608963,
0.11215157806873322,
-0.016649100929498672,
0.149748757481575,
0.05510520562529564,
-0.03112068958580494,
0.039921391755342484,
-0.00009811078780330718,
-0.05917102470993996,
-0.10312094539403915,
-0.08654168248176575,
0.04907678812742233,
0.02403629571199417,
-0.023335743695497513,
-0.05818193405866623,
0.029476376250386238,
-0.14560848474502563,
-0.04242371767759323,
-0.08341463655233383,
0.03625420853495598,
0.008478580974042416,
-0.06227307766675949,
-0.023048527538776398,
-0.07072912156581879,
-0.012144756503403187,
0.02379419468343258,
0.2039346843957901,
-0.07569921761751175,
0.1190752312541008,
0.15772877633571625,
0.0968879833817482,
-0.15422430634498596,
0.026693927124142647,
-0.07108532637357712,
-0.060782063752412796,
0.07174588739871979,
-0.1244666650891304,
0.05647420138120651,
0.1431879699230194,
-0.019048668444156647,
0.062306709587574005,
0.093423992395401,
0.005013942718505859,
-0.0010188198648393154,
0.09075768291950226,
-0.29018914699554443,
-0.1541135460138321,
-0.07211020588874817,
-0.015148764476180077,
0.08962874859571457,
0.0719262957572937,
0.18547621369361877,
0.01603105664253235,
-0.03809863701462746,
0.021393537521362305,
-0.006730877794325352,
0.0089988699182868,
0.07432221621274948,
-0.01425810344517231,
0.004281471483409405,
-0.12848517298698425,
0.05764727666974068,
0.03379267081618309,
-0.11735671013593674,
0.016241099685430527,
0.1614680141210556,
-0.09257188439369202,
-0.12941859662532806,
-0.09788809716701508,
0.0918416976928711,
-0.14377926290035248,
0.00361635722219944,
-0.03291473537683487,
-0.11598578840494156,
0.08148009330034256,
0.15115420520305634,
0.042730964720249176,
0.07979434728622437,
-0.059073556214571,
-0.020935699343681335,
-0.011706070974469185,
0.03482424095273018,
-0.015706820413470268,
-0.027747195214033127,
-0.029748758301138878,
0.032130006700754166,
-0.04720143601298332,
0.11525800079107285,
-0.07434550672769547,
-0.12564405798912048,
-0.1393452286720276,
0.025784002617001534,
-0.12588649988174438,
-0.07320915907621384,
-0.052705831825733185,
-0.03848135843873024,
0.014455427415668964,
-0.047794610261917114,
-0.03060125559568405,
-0.05646202340722084,
-0.11300956457853317,
0.02727883867919445,
-0.02042435109615326,
0.010491162538528442,
-0.12235856801271439,
0.009645631536841393,
0.07233042269945145,
-0.01381764467805624,
0.15597043931484222,
0.11984588950872421,
-0.10167893767356873,
0.09639330208301544,
-0.16445839405059814,
-0.09953459352254868,
0.07818103581666946,
0.01575256884098053,
0.06717244535684586,
0.1466037929058075,
-0.011582564562559128,
0.07672522217035294,
0.05115162953734398,
0.05542599782347679,
0.07179354131221771,
-0.11460574716329575,
0.02389528788626194,
-0.011403894051909447,
-0.17975419759750366,
-0.03826102986931801,
-0.04418688267469406,
0.013445000164210796,
0.0025697837118059397,
0.15301178395748138,
-0.05622294545173645,
0.0963088721036911,
-0.06635666638612747,
0.04486967623233795,
0.0411275252699852,
-0.18070755898952484,
-0.03604935482144356,
-0.07195618748664856,
0.02261582389473915,
0.0025793369859457016,
0.19810304045677185,
-0.01746758073568344,
0.08771996945142746,
0.05208727717399597,
0.0901220440864563,
-0.02807123027741909,
-0.0016879624454304576,
0.1545127034187317,
0.11606251448392868,
-0.07729587703943253,
-0.13520658016204834,
0.0698671042919159,
0.03262162208557129,
0.06130353733897209,
0.13942363858222961,
-0.011711389757692814,
0.004401880316436291,
0.07048087567090988,
0.017136942595243454,
0.0806276798248291,
-0.12662982940673828,
-0.08344478905200958,
-0.024274703115224838,
0.08810848742723465,
-0.036155663430690765,
0.14447735249996185,
0.18316078186035156,
-0.024356747046113014,
0.011877927929162979,
-0.04304046556353569,
-0.05369251221418381,
-0.16152824461460114,
-0.19328774511814117,
-0.09191878139972687,
-0.13428109884262085,
0.02256055921316147,
-0.12149618566036224,
0.07231094688177109,
-0.002630371367558837,
0.0971016213297844,
-0.08431866019964218,
0.09223153442144394,
0.12150278687477112,
-0.0661185011267662,
0.08507523685693741,
-0.03560033440589905,
0.06632541865110397,
-0.021175172179937363,
-0.009392407722771168,
-0.1039053425192833,
0.023766355589032173,
0.021474409848451614,
0.06330981105566025,
-0.05498984456062317,
-0.014108160510659218,
-0.15965639054775238,
-0.09022944420576096,
-0.05521683022379875,
0.07664664089679718,
-0.030568020418286324,
0.07676540315151215,
0.010228199884295464,
-0.017517900094389915,
0.048373181372880936,
0.2539057433605194,
-0.06987801939249039,
-0.07953987270593643,
-0.004763890989124775,
0.14693714678287506,
0.036458730697631836,
0.10921000689268112,
-0.016933122649788857,
0.00758946780115366,
-0.0609518401324749,
0.39958909153938293,
0.28481969237327576,
-0.09782253205776215,
0.01356764230877161,
0.019335096701979637,
0.03878680244088173,
0.11455408483743668,
0.07693379372358322,
0.09613210707902908,
0.28509002923965454,
-0.05150345340371132,
-0.05799813196063042,
-0.012410702183842659,
-0.014153141528367996,
-0.11986690759658813,
0.08600786328315735,
0.06672467291355133,
-0.048113152384757996,
-0.04451439529657364,
0.09715087711811066,
-0.2242901623249054,
0.07662616670131683,
-0.12378726899623871,
-0.20510637760162354,
-0.06453341245651245,
-0.01306221168488264,
0.14139007031917572,
-0.0030332899186760187,
0.10171270370483398,
-0.0035608005709946156,
-0.08835478127002716,
-0.01295030489563942,
0.01667134091258049,
-0.2220873236656189,
0.02820418030023575,
0.07438264042139053,
-0.11552060395479202,
0.043498605489730835,
-0.029434632509946823,
0.03576662391424179,
0.07828012108802795,
0.07148585468530655,
-0.01457840483635664,
0.0008470552274957299,
0.0072289337404072285,
-0.040996063500642776,
0.010761539451777935,
0.05810254439711571,
0.010656234808266163,
-0.055849794298410416,
0.10991960018873215,
-0.1354595273733139,
0.07819081097841263,
-0.030561506748199463,
-0.021595491096377373,
0.006426818203181028,
0.025492971763014793,
-0.04118790850043297,
0.03240106999874115,
0.07756144553422928,
0.0008272333652712405,
-0.0011678365990519524,
-0.05477802827954292,
-0.04978339746594429,
-0.03672594949603081,
-0.03046082705259323,
-0.1137775257229805,
-0.11467036604881287,
-0.09677951782941818,
0.10845700651407242,
-0.01653127931058407,
-0.19124320149421692,
0.0020816053729504347,
-0.06622135639190674,
0.08284913748502731,
-0.15111427009105682,
0.12025394290685654,
0.12695342302322388,
0.0027502349112182856,
-0.017331790179014206,
-0.01502646692097187,
0.03998102247714996,
0.07273806631565094,
-0.10284501314163208,
-0.05015341192483902
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# common6
This model is a fine-tuned version of [common6/checkpoint-3500](https://huggingface.co/common6/checkpoint-3500) on the COMMON_VOICE - FA dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3706
- Wer: 0.3421
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 200.0
- mixed_precision_training: Native AMP
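The values above map directly onto a `transformers.TrainingArguments` object. The snippet below is an illustrative sketch of that mapping, not the exact script used for this run; the `output_dir` name is a placeholder, and the Adam betas/epsilon listed above are the library defaults, so they are not set explicitly.
```python
# Hedged sketch: the listed hyperparameters expressed as TrainingArguments.
# output_dir is a placeholder; all other values mirror the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="common6",            # placeholder output directory
    learning_rate=6e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=8,   # effective train batch size: 32 * 8 = 256
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=200.0,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```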
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.0344 | 10.0 | 500 | 0.4043 | 0.4511 |
| 0.9651 | 20.0 | 1000 | 0.3793 | 0.4159 |
| 0.9125 | 30.0 | 1500 | 0.3756 | 0.4046 |
| 0.8831 | 40.0 | 2000 | 0.3650 | 0.3876 |
| 0.8399 | 50.0 | 2500 | 0.3605 | 0.3772 |
| 0.819 | 60.0 | 3000 | 0.3622 | 0.3714 |
| 0.8029 | 70.0 | 3500 | 0.3561 | 0.3664 |
| 0.8104 | 80.0 | 4000 | 0.3595 | 0.3660 |
| 0.8118 | 90.0 | 4500 | 0.3460 | 0.3592 |
| 0.7831 | 100.0 | 5000 | 0.3566 | 0.3593 |
| 0.744 | 110.0 | 5500 | 0.3578 | 0.3535 |
| 0.7388 | 120.0 | 6000 | 0.3538 | 0.3520 |
| 0.714 | 130.0 | 6500 | 0.3682 | 0.3506 |
| 0.7291 | 140.0 | 7000 | 0.3625 | 0.3505 |
| 0.697 | 150.0 | 7500 | 0.3619 | 0.3479 |
| 0.6811 | 160.0 | 8000 | 0.3631 | 0.3440 |
| 0.6841 | 170.0 | 8500 | 0.3672 | 0.3460 |
| 0.6616 | 180.0 | 9000 | 0.3677 | 0.3410 |
| 0.6471 | 190.0 | 9500 | 0.3707 | 0.3420 |
| 0.6759 | 200.0 | 10000 | 0.3706 | 0.3421 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.3.dev0
- Tokenizers 0.10.3
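For completeness, a minimal inference sketch (not part of the original card): since the checkpoint is a wav2vec2 CTC model tagged for automatic speech recognition, it should load with the generic ASR pipeline. The audio file name below is a placeholder for any 16 kHz mono Persian recording.
```python
# Hedged usage sketch: transcribe a Persian audio clip with the fine-tuned model.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="ghofrani/common6")
result = asr("sample_fa.wav")  # placeholder path to a 16 kHz mono WAV file
print(result["text"])
```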
|
{"language": ["fa"], "tags": ["automatic-speech-recognition", "common_voice", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "common6", "results": []}]}
|
automatic-speech-recognition
|
ghofrani/common6
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"common_voice",
"generated_from_trainer",
"fa",
"dataset:common_voice",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fa"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us
|
common6
=======
This model is a fine-tuned version of common6/checkpoint-3500 on the COMMON\_VOICE - FA dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3706
* Wer: 0.3421
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 6e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 256
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 100
* num\_epochs: 200.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.3.dev0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 256\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 200.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 256\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 200.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
61,
160,
4,
36
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 256\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 200.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
-0.13553977012634277,
0.0657314658164978,
-0.002648151246830821,
0.039307136088609695,
0.14089421927928925,
0.0014919538516551256,
0.08693157881498337,
0.14648954570293427,
-0.10292138159275055,
0.08878582715988159,
0.08613797277212143,
0.07693296670913696,
0.07253614813089371,
0.10049618035554886,
-0.02541545405983925,
-0.33333614468574524,
0.0199380312114954,
0.02572035603225231,
-0.11258042603731155,
0.10825756192207336,
0.11200502514839172,
-0.11091233044862747,
0.011477576568722725,
0.032519109547138214,
-0.1330472081899643,
0.021482406184077263,
-0.012258491478860378,
-0.07051002234220505,
0.11857634037733078,
0.039386529475450516,
0.09274652600288391,
0.016122041270136833,
0.08409222960472107,
-0.26859304308891296,
0.014626238495111465,
0.060278233140707016,
0.06013225018978119,
0.06551355868577957,
0.10750869661569595,
-0.01878250576555729,
0.14538738131523132,
-0.07166951894760132,
0.05720868334174156,
0.055091388523578644,
-0.11322735995054245,
-0.3399525284767151,
-0.07358866184949875,
0.013114331290125847,
0.10950548201799393,
0.10811059176921844,
-0.04073416814208031,
0.06754757463932037,
-0.06821510195732117,
0.09388948231935501,
0.21367239952087402,
-0.22770710289478302,
-0.07812804728746414,
-0.04573792591691017,
0.039798229932785034,
0.035928696393966675,
-0.12485425919294357,
-0.030554361641407013,
0.036797504872083664,
0.048193614929914474,
0.07330254465341568,
0.013290979899466038,
-0.02934081107378006,
0.006531869061291218,
-0.1286003738641739,
-0.050428420305252075,
0.1480991393327713,
0.07999039441347122,
-0.03939521312713623,
-0.08520548790693283,
-0.005596770439296961,
-0.202241912484169,
-0.04471273720264435,
0.0067951069213449955,
0.016415515914559364,
-0.03634430840611458,
-0.11050080507993698,
0.017826318740844727,
-0.08905403316020966,
-0.1008266806602478,
0.020267672836780548,
0.17094624042510986,
0.049494970589876175,
-0.03637518733739853,
0.00864329282194376,
0.08926735818386078,
0.02246984653174877,
-0.13971534371376038,
-0.0017091543413698673,
0.05310216546058655,
-0.08528852462768555,
-0.01595863327383995,
-0.055562376976013184,
-0.04726288840174675,
0.0039276801981031895,
0.09015177935361862,
-0.035753022879362106,
0.08727860450744629,
-0.002109265187755227,
0.016568299382925034,
-0.0847269743680954,
0.17643387615680695,
-0.060846373438835144,
-0.002945849671959877,
-0.05037040635943413,
0.08822798728942871,
-0.03761652484536171,
-0.016531862318515778,
-0.04744764044880867,
0.008078210055828094,
0.12409024685621262,
0.03920438140630722,
-0.045111872255802155,
0.014289462938904762,
-0.06282547861337662,
-0.02155500277876854,
-0.04752146452665329,
-0.11237046122550964,
0.027010370045900345,
0.029782135039567947,
-0.07822666317224503,
0.02208702825009823,
-0.0027894258964806795,
0.023958874866366386,
-0.035520631819963455,
0.10110203176736832,
-0.061821505427360535,
-0.00009992322156904265,
-0.07805538177490234,
-0.10859336704015732,
0.029199885204434395,
-0.032968342304229736,
0.015945710241794586,
-0.05535898730158806,
-0.09231507778167725,
-0.06228742003440857,
0.04069330170750618,
-0.044259943068027496,
-0.07684799283742905,
-0.07844782620668411,
-0.0736638680100441,
0.04372347146272659,
-0.03291528671979904,
0.16961820423603058,
-0.05776038020849228,
0.12060768902301788,
0.05208859220147133,
0.03254152834415436,
0.026409359648823738,
0.0865219235420227,
-0.042381491512060165,
0.027259504422545433,
-0.11891458928585052,
0.08778900653123856,
-0.08449628204107285,
0.03770098090171814,
-0.13071709871292114,
-0.13804037868976593,
-0.030498968437314034,
0.012239491567015648,
0.1099928766489029,
0.0892641544342041,
-0.18088792264461517,
-0.10263686627149582,
0.18084771931171417,
-0.0799163356423378,
-0.08886119723320007,
0.14028042554855347,
-0.019751880317926407,
0.0038209958001971245,
0.058766692876815796,
0.1927611082792282,
0.08769779652357101,
-0.07905358076095581,
0.017309244722127914,
-0.06228657439351082,
0.13736042380332947,
0.036789704114198685,
0.10120023041963577,
-0.053529441356658936,
0.009656493552029133,
-0.002534023951739073,
-0.009717435576021671,
0.06610766053199768,
-0.10351897776126862,
-0.07578741759061813,
-0.01849810965359211,
-0.08496169000864029,
-0.005201178137212992,
0.07181006669998169,
0.03421556204557419,
-0.11329120397567749,
-0.11553038656711578,
0.028054581955075264,
0.10552109032869339,
-0.12163352966308594,
0.03004889376461506,
-0.08058056235313416,
0.027946196496486664,
-0.020535394549369812,
-0.02094239741563797,
-0.16763608157634735,
0.0060011474415659904,
0.03143149986863136,
-0.016649682074785233,
0.03199904039502144,
-0.01084356289356947,
0.07953382283449173,
0.03850039839744568,
-0.052764903753995895,
-0.0738896057009697,
-0.06955960392951965,
-0.007695764768868685,
-0.0758822038769722,
-0.21031884849071503,
-0.07068677991628647,
-0.023295802995562553,
0.14267395436763763,
-0.22285841405391693,
0.0014156431425362825,
0.0339881032705307,
0.10722679644823074,
0.028190359473228455,
-0.053821224719285965,
-0.0019313020166009665,
0.08846915513277054,
-0.01016214955598116,
-0.055612243711948395,
0.03143937513232231,
0.006379708647727966,
-0.11716353893280029,
0.028890028595924377,
-0.13304004073143005,
0.0915326178073883,
0.10255623608827591,
-0.04227989539504051,
-0.07342109084129333,
-0.0549507811665535,
-0.06470756977796555,
-0.061666857451200485,
-0.02423807606101036,
-0.012165915220975876,
0.21362639963626862,
0.02070864476263523,
0.11775971949100494,
-0.08177902549505234,
-0.02984427660703659,
0.03331439942121506,
0.001620367169380188,
-0.00517930556088686,
0.13657917082309723,
0.03090650960803032,
-0.0320528969168663,
0.09350792318582535,
0.038514476269483566,
-0.07688819617033005,
0.16389861702919006,
-0.08137121051549911,
-0.13227270543575287,
-0.007366003002971411,
0.023718426004052162,
0.03131374344229698,
0.104747474193573,
-0.18184716999530792,
-0.005770870950073004,
0.01895575039088726,
0.043069884181022644,
0.03987773880362511,
-0.22045426070690155,
-0.012611844576895237,
0.04566902667284012,
-0.0767534002661705,
-0.061205316334962845,
0.0015740945236757398,
0.0007933495217002928,
0.08637447655200958,
-0.00040052979602478445,
-0.05103594437241554,
-0.014875567518174648,
-0.03536726161837578,
-0.09224776923656464,
0.18361034989356995,
-0.10463328659534454,
-0.15458156168460846,
-0.1376299262046814,
-0.03280620649456978,
-0.002364299027249217,
-0.0031614392064511776,
0.0540052205324173,
-0.12687928974628448,
-0.035625800490379333,
-0.04817039892077446,
0.07060996443033218,
-0.08123201131820679,
0.0318048931658268,
-0.005073624663054943,
0.007268640212714672,
0.09630522131919861,
-0.10940925776958466,
0.015241599641740322,
-0.019922353327274323,
-0.0537545345723629,
0.020882437005639076,
0.016309134662151337,
0.09328938275575638,
0.16095614433288574,
0.037579916417598724,
0.025361936539411545,
-0.037610240280628204,
0.17330195009708405,
-0.11093386262655258,
-0.05271866172552109,
0.12125761061906815,
-0.0021586085204035044,
0.028739212080836296,
0.11032387614250183,
0.04817987233400345,
-0.09251458197832108,
0.02922639064490795,
0.05377665162086487,
-0.016068553552031517,
-0.24873462319374084,
-0.03138657659292221,
-0.07237769663333893,
-0.03873956948518753,
0.1020231693983078,
0.027513647451996803,
-0.004800252616405487,
0.023234166204929352,
-0.018452312797307968,
-0.010598254390060902,
0.007918931543827057,
0.05592093989253044,
0.09907416999340057,
0.03280215337872505,
0.11343280971050262,
-0.012156659737229347,
-0.04703841730952263,
0.01983305811882019,
0.0009946743957698345,
0.2533171772956848,
0.01570017635822296,
0.16186852753162384,
0.05432736501097679,
0.15303248167037964,
0.01009477861225605,
0.08922525495290756,
0.009919656440615654,
-0.025819852948188782,
0.025304364040493965,
-0.04795762524008751,
-0.04395408183336258,
0.028922609984874725,
0.09760923683643341,
0.03772294521331787,
-0.1504519283771515,
-0.03025048039853573,
0.007838639430701733,
0.36083540320396423,
0.0780099406838417,
-0.2916276156902313,
-0.09996049106121063,
-0.006186333019286394,
-0.10030299425125122,
-0.060720622539520264,
0.03331955894827843,
0.11365770548582077,
-0.09823262691497803,
0.05128561705350876,
-0.06052793189883232,
0.1024286076426506,
-0.05868220329284668,
0.009551393799483776,
0.058814726769924164,
0.07257962226867676,
-0.004422713536769152,
0.07606874406337738,
-0.2881627082824707,
0.31432613730430603,
-0.01678607612848282,
0.09000423550605774,
-0.03678661212325096,
0.028084659948945045,
0.028093937784433365,
-0.05246666073799133,
0.05362870171666145,
-0.011915168724954128,
-0.09384941309690475,
-0.20602144300937653,
-0.059672605246305466,
0.04221491888165474,
0.14386487007141113,
-0.04256744310259819,
0.12129142135381699,
-0.028266873210668564,
0.007396037224680185,
0.06907106190919876,
-0.0863923579454422,
-0.10857759416103363,
-0.09740135073661804,
0.014438272453844547,
0.04355732351541519,
0.105716273188591,
-0.10111863166093826,
-0.11109364032745361,
-0.0650474950671196,
0.13411076366901398,
-0.05232863128185272,
-0.010990147478878498,
-0.12594817578792572,
0.10040292888879776,
0.17583085596561432,
-0.06656042486429214,
0.044812027364969254,
0.0314149484038353,
0.12301380187273026,
0.037806812673807144,
0.010439880192279816,
0.10311123728752136,
-0.07351077347993851,
-0.19191259145736694,
-0.037170324474573135,
0.18621422350406647,
0.050918251276016235,
0.07299238443374634,
-0.019439317286014557,
0.02621452324092388,
-0.02294779196381569,
-0.06438671797513962,
0.062469106167554855,
-0.007145493291318417,
0.003713887883350253,
0.07438816875219345,
-0.03338323533535004,
0.018782922998070717,
-0.09182854741811752,
-0.06486144661903381,
0.17064622044563293,
0.27101126313209534,
-0.05501798912882805,
0.007156741805374622,
0.029911456629633904,
-0.04762980341911316,
-0.13604208827018738,
0.03856409713625908,
0.15216168761253357,
0.05522570759057999,
-0.00812531728297472,
-0.2464035600423813,
0.05850879102945328,
0.09020980447530746,
-0.0132901044562459,
0.06664010137319565,
-0.2845851182937622,
-0.13041254878044128,
0.12261746823787689,
0.09233123064041138,
-0.026507338508963585,
-0.1425192952156067,
-0.05778083577752113,
-0.02450604736804962,
-0.09117545187473297,
0.05617315694689751,
-0.004605036228895187,
0.13311274349689484,
0.007473645266145468,
0.0827847495675087,
0.036420539021492004,
-0.04161073639988899,
0.14617842435836792,
-0.00540919229388237,
0.04919099807739258,
0.0015243086963891983,
0.0668235495686531,
-0.025046762079000473,
-0.031246935948729515,
0.013547468930482864,
-0.07367206364870071,
0.013577720150351524,
-0.14362400770187378,
-0.043388571590185165,
-0.09805481135845184,
0.024486323818564415,
-0.025713643059134483,
-0.038126442581415176,
-0.0034670191816985607,
0.03353709727525711,
0.060943663120269775,
0.013456125743687153,
0.11185935139656067,
-0.07725599408149719,
0.16906960308551788,
0.058834392577409744,
0.11952508240938187,
-0.01837136037647724,
-0.07122516632080078,
-0.00650485884398222,
-0.006387483794242144,
0.04719795659184456,
-0.10772743076086044,
0.03798839449882507,
0.1516989767551422,
0.045517995953559875,
0.16777276992797852,
0.06244311481714249,
-0.08439984172582626,
0.023068983107805252,
0.07045019418001175,
-0.0663178414106369,
-0.12731856107711792,
-0.029657024890184402,
0.04707108810544014,
-0.1397480070590973,
0.010545918717980385,
0.09314504265785217,
-0.0526592992246151,
-0.011475838720798492,
-0.0019976284820586443,
0.013399478048086166,
-0.07752548158168793,
0.2413642406463623,
0.029455190524458885,
0.08246822655200958,
-0.09823693335056305,
0.07328102737665176,
0.042106494307518005,
-0.1644841581583023,
0.03310086205601692,
0.0705803781747818,
-0.027173461392521858,
-0.010404163040220737,
0.010669323615729809,
0.07280541211366653,
0.016686471179127693,
-0.0679410845041275,
-0.11497505009174347,
-0.15556120872497559,
0.08256196230649948,
0.09916188567876816,
0.03182290121912956,
0.029947001487016678,
-0.04812125861644745,
0.050375796854496,
-0.13116925954818726,
0.08009093254804611,
0.10703539848327637,
0.06547610461711884,
-0.1352807581424713,
0.17892006039619446,
0.012157534249126911,
0.007865272462368011,
0.00929370429366827,
-0.01975766010582447,
-0.08166297525167465,
0.046930719166994095,
-0.11451483517885208,
-0.03907005116343498,
-0.0457586832344532,
-0.009281530044972897,
0.016874007880687714,
-0.07107304781675339,
-0.060181669890880585,
0.030627138912677765,
-0.12904804944992065,
-0.04449041932821274,
-0.005287733394652605,
0.063664510846138,
-0.09029395878314972,
-0.018057221546769142,
0.055853504687547684,
-0.10905464738607407,
0.09075653553009033,
0.07570522278547287,
0.004516200162470341,
0.06305529177188873,
-0.11811074614524841,
0.0038417885079979897,
0.05662042275071144,
0.0010796295246109366,
0.021332023665308952,
-0.16002000868320465,
-0.0033636002335697412,
-0.001026846352033317,
0.050027236342430115,
0.00005457473889691755,
0.03509730473160744,
-0.12747052311897278,
-0.055926449596881866,
-0.032202258706092834,
-0.058450862765312195,
-0.055546924471855164,
0.04306173324584961,
0.06343685835599899,
0.06343203783035278,
0.15926453471183777,
-0.0918985903263092,
0.055933885276317596,
-0.22053398191928864,
0.018071824684739113,
-0.04644002020359039,
-0.07581916451454163,
-0.057614583522081375,
-0.02896791696548462,
0.0956663265824318,
-0.06707283854484558,
0.08261314779520035,
-0.05177270248532295,
0.05871386453509331,
0.027184579521417618,
-0.12670890986919403,
0.03320367634296417,
0.03943658620119095,
0.2664569318294525,
0.059515729546546936,
-0.030172783881425858,
0.08177655935287476,
0.007533184252679348,
0.05426613241434097,
0.19892878830432892,
0.14025992155075073,
0.17831790447235107,
0.02866549603641033,
0.08895015716552734,
0.08317294716835022,
-0.11024614423513412,
-0.10767930001020432,
0.09542115032672882,
-0.01982322707772255,
0.11757610738277435,
-0.0023482516407966614,
0.2517177164554596,
0.10109812021255493,
-0.19463446736335754,
0.06557601690292358,
-0.043746668845415115,
-0.0974569097161293,
-0.10232096910476685,
-0.03997344151139259,
-0.07432476431131363,
-0.1849585473537445,
0.020896971225738525,
-0.12922777235507965,
0.06371006369590759,
0.06456612050533295,
0.0392574779689312,
0.010169080458581448,
0.15180455148220062,
0.03369864448904991,
-0.013278241269290447,
0.12863236665725708,
0.0023702767211943865,
-0.006786432582885027,
-0.05732544884085655,
-0.10142926126718521,
0.061310864984989166,
-0.02407875843346119,
0.05915144830942154,
-0.05416199937462807,
-0.114464171230793,
0.05226323381066322,
-0.011302812956273556,
-0.11097660660743713,
0.01752549037337303,
-0.014021257869899273,
0.07784043252468109,
0.10148848593235016,
0.034904949367046356,
-0.009720790199935436,
-0.01877143792808056,
0.2457960844039917,
-0.10632336139678955,
-0.05296357348561287,
-0.13799883425235748,
0.24867784976959229,
0.029674092307686806,
-0.012326259166002274,
0.018780630081892014,
-0.07572752982378006,
-0.003525230335071683,
0.15819795429706573,
0.12590523064136505,
-0.0039710174314677715,
-0.01654491014778614,
0.005037215538322926,
-0.013221470639109612,
-0.049364980310201645,
0.08056456595659256,
0.13981980085372925,
0.03833622857928276,
-0.056878846138715744,
-0.03939734026789665,
-0.0641048327088356,
-0.04617372527718544,
-0.017439374700188637,
0.06927493214607239,
0.03832165524363518,
-0.02217721939086914,
-0.02220204845070839,
0.12856309115886688,
-0.07528773695230484,
-0.1138891875743866,
-0.006603454705327749,
-0.18034599721431732,
-0.18062783777713776,
-0.044933874160051346,
0.027093790471553802,
0.048587996512651443,
0.045325253158807755,
-0.02235349453985691,
-0.0077985855750739574,
0.0963025912642479,
0.012448643334209919,
-0.0346270389854908,
-0.11924595385789871,
0.09279539436101913,
-0.09613491594791412,
0.16154249012470245,
-0.03780408948659897,
0.038854025304317474,
0.10794651508331299,
0.08476781845092773,
-0.054158687591552734,
0.07254218310117722,
0.06332787126302719,
-0.13254490494728088,
0.04409285634756088,
0.2213340550661087,
-0.04189535230398178,
0.13886073231697083,
0.03808748349547386,
-0.1418074071407318,
0.01862414740025997,
-0.10771942883729935,
-0.05198758840560913,
-0.0626760795712471,
-0.016027936711907387,
-0.05764923617243767,
0.12084244936704636,
0.22898150980472565,
-0.0670253187417984,
-0.02332405000925064,
-0.06580395251512527,
0.00695943646132946,
0.050901759415864944,
0.13069209456443787,
-0.05159921571612358,
-0.30697736144065857,
0.008561113849282265,
0.013173466548323631,
-0.005652649328112602,
-0.25324124097824097,
-0.07826028019189835,
0.03637303039431572,
-0.07440268248319626,
-0.040375176817178726,
0.11898031085729599,
0.06072871759533882,
0.04320725426077843,
-0.056059934198856354,
-0.08610133081674576,
-0.02776486426591873,
0.20560652017593384,
-0.17644527554512024,
-0.06181729957461357
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# common7
This model is a fine-tuned version of [common7/checkpoint-18500](https://huggingface.co/common7/checkpoint-18500) on the Persian (`fa`) configuration of the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3448
- Wer: 0.3478
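As a quick, illustrative sketch (not part of the original card), the checkpoint can be tried for Persian transcription with the Transformers `pipeline` API. The repository id `ghofrani/common7` is taken from this record; the audio path is a placeholder for any 16 kHz mono recording.

```python
from transformers import pipeline

# Load the fine-tuned Persian ASR checkpoint described in this card.
asr = pipeline("automatic-speech-recognition", model="ghofrani/common7")

# "sample_fa.wav" is a placeholder path; decoding audio files requires ffmpeg.
print(asr("sample_fa.wav")["text"])
```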
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 150.0
- mixed_precision_training: Native AMP
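The hyperparameters listed above map roughly onto the `TrainingArguments` below. This is a hedged sketch for orientation only: the actual training script is not included in the card, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of the settings listed above; Adam betas/epsilon are the defaults.
training_args = TrainingArguments(
    output_dir="common7",            # placeholder, not taken from the card
    learning_rate=6e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,   # 32 x 4 = total train batch size 128
    num_train_epochs=150.0,
    lr_scheduler_type="linear",
    warmup_steps=100,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```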
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 2.957 | 3.29 | 500 | 2.9503 | 1.0 |
| 1.7225 | 6.58 | 1000 | 0.8860 | 0.7703 |
| 1.4907 | 9.86 | 1500 | 0.6555 | 0.6673 |
| 1.4177 | 13.16 | 2000 | 0.5784 | 0.6076 |
| 1.3425 | 16.45 | 2500 | 0.5379 | 0.5718 |
| 1.33 | 19.73 | 3000 | 0.4962 | 0.5245 |
| 1.4378 | 23.03 | 3500 | 0.4699 | 0.5098 |
| 1.1894 | 26.31 | 4000 | 0.4527 | 0.4848 |
| 1.1844 | 29.6 | 4500 | 0.4309 | 0.4651 |
| 1.1795 | 32.89 | 5000 | 0.4131 | 0.4524 |
| 1.1471 | 36.18 | 5500 | 0.4052 | 0.4435 |
| 1.1337 | 39.47 | 6000 | 0.3927 | 0.4363 |
| 1.1896 | 42.76 | 6500 | 0.3811 | 0.4254 |
| 1.1847 | 46.05 | 7000 | 0.3855 | 0.4129 |
| 0.9954 | 49.34 | 7500 | 0.3729 | 0.3981 |
| 1.0293 | 52.63 | 8000 | 0.3637 | 0.4014 |
| 1.0224 | 55.92 | 8500 | 0.3578 | 0.3885 |
| 1.012 | 59.21 | 9000 | 0.3629 | 0.3930 |
| 1.0772 | 62.5 | 9500 | 0.3635 | 0.3906 |
| 1.0344 | 65.79 | 10000 | 0.3469 | 0.3771 |
| 0.9457 | 69.08 | 10500 | 0.3435 | 0.3735 |
| 0.9307 | 72.37 | 11000 | 0.3519 | 0.3762 |
| 0.9523 | 75.65 | 11500 | 0.3443 | 0.3666 |
| 0.9523 | 78.94 | 12000 | 0.3502 | 0.3757 |
| 0.9475 | 82.24 | 12500 | 0.3509 | 0.3643 |
| 0.9971 | 85.52 | 13000 | 0.3502 | 0.3626 |
| 0.9058 | 88.81 | 13500 | 0.3472 | 0.3605 |
| 0.8922 | 92.1 | 14000 | 0.3530 | 0.3618 |
| 0.9 | 95.39 | 14500 | 0.3500 | 0.3574 |
| 0.9051 | 98.68 | 15000 | 0.3456 | 0.3535 |
| 0.9304 | 101.97 | 15500 | 0.3438 | 0.3578 |
| 0.9433 | 105.26 | 16000 | 0.3396 | 0.3530 |
| 0.8988 | 108.55 | 16500 | 0.3436 | 0.3539 |
| 0.8789 | 111.84 | 17000 | 0.3426 | 0.3516 |
| 0.8667 | 115.13 | 17500 | 0.3438 | 0.3506 |
| 0.8895 | 118.42 | 18000 | 0.3434 | 0.3503 |
| 0.8888 | 121.71 | 18500 | 0.3425 | 0.3494 |
| 0.9453 | 125.0 | 19000 | 0.3415 | 0.3480 |
| 0.9267 | 128.29 | 19500 | 0.3477 | 0.3503 |
| 0.8315 | 131.58 | 20000 | 0.3476 | 0.3505 |
| 0.8542 | 134.86 | 20500 | 0.3475 | 0.3506 |
| 0.8478 | 138.16 | 21000 | 0.3430 | 0.3481 |
| 0.8643 | 141.45 | 21500 | 0.3451 | 0.3485 |
| 0.8705 | 144.73 | 22000 | 0.3444 | 0.3474 |
| 0.9869 | 148.03 | 22500 | 0.3441 | 0.3493 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.3.dev0
- Tokenizers 0.10.3
|
{"language": ["fa"], "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_7_0", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "common7", "results": []}]}
|
automatic-speech-recognition
|
ghofrani/common7
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_7_0",
"generated_from_trainer",
"fa",
"dataset:common_voice",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fa"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us
|
common7
=======
This model is a fine-tuned version of common7/checkpoint-18500 on the Persian (fa) configuration of the MOZILLA-FOUNDATION/COMMON\_VOICE\_7\_0 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3448
* Wer: 0.3478
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 6e-05
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 100
* num\_epochs: 150.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.3.dev0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 150.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 150.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
71,
160,
4,
36
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_7_0 #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 150.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
-0.12231084704399109,
0.06722528487443924,
-0.003954681567847729,
0.040873412042856216,
0.12504468858242035,
0.014249385334551334,
0.08379577100276947,
0.14482538402080536,
-0.10104702413082123,
0.08925904333591461,
0.08855915069580078,
0.08122069388628006,
0.07269507646560669,
0.1178816631436348,
-0.028818953782320023,
-0.28921687602996826,
0.01534270215779543,
-0.0132745997980237,
-0.09669046103954315,
0.10680289566516876,
0.09624464064836502,
-0.10853200405836105,
0.021447328850626945,
0.012018844485282898,
-0.10603878647089005,
0.006541518494486809,
-0.02628401480615139,
-0.04679182544350624,
0.11873960494995117,
0.056875526905059814,
0.06249355152249336,
0.0338020995259285,
0.06902096420526505,
-0.2708577513694763,
0.01480994001030922,
0.05527586117386818,
0.04379193112254143,
0.05180583894252777,
0.09991487115621567,
-0.013006175868213177,
0.1340353637933731,
-0.07747090607881546,
0.04100412875413895,
0.05439592897891998,
-0.10189551115036011,
-0.3123217821121216,
-0.08896283060312271,
-0.004374621901661158,
0.11685293167829514,
0.09230305254459381,
-0.04335099458694458,
0.055364832282066345,
-0.07956359535455704,
0.10218798369169235,
0.23248158395290375,
-0.23495697975158691,
-0.06967388093471527,
-0.017813414335250854,
0.050906404852867126,
0.040194034576416016,
-0.10865006595849991,
0.001843124977312982,
0.027267416939139366,
0.03576447442173958,
0.0788801833987236,
0.0034355854149907827,
0.0006131313857622445,
-0.0005975045496597886,
-0.13405202329158783,
-0.02950325608253479,
0.11303751170635223,
0.08243448287248611,
-0.02409549243748188,
-0.10956268012523651,
-0.01841225102543831,
-0.1874275505542755,
-0.0500546433031559,
-0.004864685703068972,
0.02988256700336933,
-0.028665989637374878,
-0.08480851352214813,
0.008499485440552235,
-0.06254354864358902,
-0.08183053135871887,
0.015825055539608,
0.15436871349811554,
0.0387006551027298,
-0.04235854372382164,
0.011134401895105839,
0.08641472458839417,
0.034125737845897675,
-0.14131714403629303,
-0.003470245748758316,
0.04486581310629845,
-0.09996035695075989,
-0.017254197970032692,
-0.04621012881398201,
-0.034923601895570755,
0.03365053981542587,
0.11721271276473999,
-0.0344175361096859,
0.09569910168647766,
-0.006497770082205534,
0.010947812348604202,
-0.07529313117265701,
0.1473502814769745,
-0.06156627833843231,
-0.0937647894024849,
-0.04592718929052353,
0.09252173453569412,
-0.002130951266735792,
-0.017416944727301598,
-0.07666172087192535,
0.010376479476690292,
0.0963345319032669,
0.04904765635728836,
-0.024140717461705208,
0.023719632998108864,
-0.06989589333534241,
-0.02204854041337967,
-0.010060258209705353,
-0.10973644256591797,
0.046826209872961044,
0.0434332937002182,
-0.05841522663831711,
0.013109494000673294,
-0.01459265872836113,
0.029418369755148888,
-0.014422175474464893,
0.1074753850698471,
-0.04355079308152199,
0.004019786603748798,
-0.06400854140520096,
-0.11354563385248184,
0.03186865895986557,
-0.04370734840631485,
0.00490913912653923,
-0.06219891831278801,
-0.08436618000268936,
-0.06427332758903503,
0.04352233186364174,
-0.06453545391559601,
-0.04993317648768425,
-0.087461456656456,
-0.06654547154903412,
0.051953062415122986,
-0.02383446879684925,
0.19066199660301208,
-0.06486926227807999,
0.10305177420377731,
0.03421080857515335,
0.04933132603764534,
0.04761410132050514,
0.07695186883211136,
-0.02516394481062889,
0.03273826092481613,
-0.1503686159849167,
0.11047620326280594,
-0.08651821315288544,
0.026562144979834557,
-0.13354194164276123,
-0.10482816398143768,
-0.022175364196300507,
0.00722513860091567,
0.10638318955898285,
0.11992347985506058,
-0.18596810102462769,
-0.10949478298425674,
0.16361697018146515,
-0.06472428888082504,
-0.06738383322954178,
0.15357846021652222,
-0.0247043389827013,
-0.02515045367181301,
0.05510453134775162,
0.18266910314559937,
0.10084045678377151,
-0.09510289132595062,
0.02033335529267788,
-0.06271311640739441,
0.11919369548559189,
0.059415627270936966,
0.08976934850215912,
-0.053132615983486176,
0.021621670573949814,
-0.0036461432464420795,
0.0031220538076013327,
0.08157513290643692,
-0.09503164887428284,
-0.07162487506866455,
-0.01193025428801775,
-0.080750472843647,
0.030921921133995056,
0.05046666041016579,
0.020550617948174477,
-0.10929272323846817,
-0.11249805241823196,
0.004097133409231901,
0.1041325181722641,
-0.10309265553951263,
0.04007059708237648,
-0.07780425995588303,
0.04539043828845024,
-0.014980729669332504,
-0.004951137118041515,
-0.16713187098503113,
0.0299440398812294,
0.0349377803504467,
-0.01909434050321579,
0.0260019451379776,
-0.006366932298988104,
0.08063411712646484,
0.04379884898662567,
-0.0658649355173111,
-0.051489029079675674,
-0.03695746511220932,
-0.0023094965144991875,
-0.07887393981218338,
-0.250311940908432,
-0.05871548503637314,
-0.03260842338204384,
0.16381846368312836,
-0.19844557344913483,
-0.0013350622029975057,
0.04148424416780472,
0.1307249218225479,
0.034926917403936386,
-0.04742912948131561,
0.007598405238240957,
0.08690787851810455,
-0.01618819870054722,
-0.0585455521941185,
0.037430085241794586,
-0.0013802676694467664,
-0.13856960833072662,
0.008164060302078724,
-0.12485022097826004,
0.0630405843257904,
0.09785176068544388,
-0.018340865150094032,
-0.08622818440198898,
-0.06372523307800293,
-0.052839573472738266,
-0.05364595726132393,
-0.016583941876888275,
-0.007105445489287376,
0.19188319146633148,
0.01568613573908806,
0.10999692976474762,
-0.07007617503404617,
-0.02929549477994442,
0.03840070590376854,
0.021645698696374893,
-0.012322723865509033,
0.14183621108531952,
0.05087583512067795,
-0.06437543779611588,
0.08833453804254532,
0.0672587975859642,
-0.04655890539288521,
0.17037084698677063,
-0.061666686087846756,
-0.09293641149997711,
-0.03319892659783363,
0.02065495029091835,
0.011320151388645172,
0.08818349987268448,
-0.1976977437734604,
-0.01661393605172634,
0.019135745242238045,
0.036341018974781036,
0.02487807162106037,
-0.18765634298324585,
-0.0004632451164070517,
0.05328385904431343,
-0.08029439300298691,
-0.0003632933658082038,
0.005862795747816563,
-0.004365680273622274,
0.0904616042971611,
0.01392301544547081,
-0.07396996766328812,
-0.017194820567965508,
-0.03686712682247162,
-0.08611471951007843,
0.16123442351818085,
-0.11226068437099457,
-0.14542146027088165,
-0.10111870616674423,
-0.06294092535972595,
-0.024156365543603897,
-0.0045998552814126015,
0.06418784707784653,
-0.11740712821483612,
-0.022200923413038254,
-0.05441737547516823,
0.050901055335998535,
-0.06688523292541504,
0.039743706583976746,
0.00864473171532154,
-0.004850070457905531,
0.057216767221689224,
-0.10322310030460358,
0.014203235507011414,
-0.023026667535305023,
-0.01822585240006447,
0.003575799521058798,
0.02315150946378708,
0.09068428725004196,
0.16580632328987122,
0.05075168237090111,
0.023259345442056656,
-0.05294664204120636,
0.1851721554994583,
-0.12862269580364227,
-0.015089090913534164,
0.12040109932422638,
0.004341243766248226,
0.026405833661556244,
0.14938955008983612,
0.04662841185927391,
-0.09345124661922455,
0.01492023840546608,
0.03990167751908302,
-0.017577089369297028,
-0.235308438539505,
-0.0377943255007267,
-0.08110436052083969,
-0.028639689087867737,
0.07049266993999481,
0.026728184893727303,
-0.0053286380134522915,
0.010276908054947853,
-0.029761003330349922,
-0.003815140575170517,
0.02105199545621872,
0.044978901743888855,
0.09774212539196014,
0.030528223142027855,
0.12263073772192001,
-0.011902929283678532,
-0.026298683136701584,
0.03739704564213753,
-0.017990412190556526,
0.2285546213388443,
0.027430053800344467,
0.167916938662529,
0.05240432173013687,
0.14543238282203674,
-0.0058723376132547855,
0.02950531430542469,
0.013486048206686974,
-0.020505713298916817,
0.006104703992605209,
-0.05354242026805878,
-0.0349479615688324,
0.028971565887331963,
0.12394315004348755,
0.024329328909516335,
-0.13066846132278442,
-0.02011735737323761,
0.025204481557011604,
0.34772002696990967,
0.08218780159950256,
-0.25735676288604736,
-0.0870581790804863,
0.009188775904476643,
-0.07564723491668701,
-0.03180612623691559,
0.02290239743888378,
0.12408167868852615,
-0.08550512790679932,
0.0840369388461113,
-0.04647452011704445,
0.08785570412874222,
-0.06102341040968895,
0.014902627095580101,
0.05298076197504997,
0.08521498739719391,
-0.003763847751542926,
0.050155602395534515,
-0.2761286795139313,
0.284695565700531,
0.001627917168661952,
0.08174204081296921,
-0.04705635830760002,
0.0378674641251564,
0.038392312824726105,
-0.01698433794081211,
0.05913503095507622,
-0.012640432454645634,
-0.1103331670165062,
-0.16693712770938873,
-0.06820200383663177,
0.01679662987589836,
0.13905522227287292,
-0.05589526519179344,
0.10994001477956772,
-0.027290672063827515,
-0.018933141604065895,
0.05246307700872421,
-0.032339997589588165,
-0.12215476483106613,
-0.10207466781139374,
0.019408825784921646,
0.06795377284288406,
0.07059526443481445,
-0.07371668517589569,
-0.10307731479406357,
-0.09426164627075195,
0.17839255928993225,
-0.07675646990537643,
-0.012613227590918541,
-0.12307040393352509,
0.08925648778676987,
0.1480846107006073,
-0.06442554295063019,
0.03986690565943718,
0.020281990990042686,
0.10514739155769348,
0.015197315253317356,
-0.005810457281768322,
0.1214047372341156,
-0.07532194256782532,
-0.20333260297775269,
-0.062301237136125565,
0.1863252967596054,
0.04739303141832352,
0.07768009603023529,
-0.026210393756628036,
0.02521066926419735,
0.0025744109880179167,
-0.06212397292256355,
0.08623342216014862,
0.03912390395998955,
0.017149947583675385,
0.07422670722007751,
-0.024909021332859993,
-0.021688219159841537,
-0.08228476345539093,
-0.08644194155931473,
0.15023992955684662,
0.29704633355140686,
-0.08008100092411041,
0.05513822287321091,
0.05152638629078865,
-0.050695840269327164,
-0.1371372640132904,
0.020175723358988762,
0.1310257613658905,
0.0532655194401741,
0.015428716316819191,
-0.2202633023262024,
-0.0006281771929934621,
0.07927859574556351,
-0.016143783926963806,
0.0856163427233696,
-0.3313637673854828,
-0.1365160048007965,
0.08917698264122009,
0.07112272083759308,
0.01080815214663744,
-0.1514974683523178,
-0.06052108108997345,
-0.02176538296043873,
-0.08944164216518402,
0.020829683169722557,
-0.02888265810906887,
0.14642196893692017,
0.016598209738731384,
0.034004807472229004,
0.026515407487750053,
-0.03587106615304947,
0.1449110358953476,
-0.010243241675198078,
0.048351798206567764,
-0.011851359158754349,
0.03275280445814133,
-0.04588624835014343,
-0.04384421557188034,
-0.017582884058356285,
-0.08594672381877899,
0.014049570076167583,
-0.11799723654985428,
-0.038650788366794586,
-0.09071113914251328,
0.019675420597195625,
-0.03132108598947525,
-0.04143364727497101,
-0.0345611497759819,
0.037585895508527756,
0.06870512664318085,
0.009355244226753712,
0.10498152673244476,
-0.08938435465097427,
0.1586010903120041,
0.10883913189172745,
0.11591164022684097,
-0.017778225243091583,
-0.06691602617502213,
-0.008133525028824806,
-0.018588058650493622,
0.04679974913597107,
-0.09624743461608887,
0.03475494310259819,
0.13466116786003113,
0.03804294764995575,
0.15786656737327576,
0.04809607192873955,
-0.07988070696592331,
0.029593372717499733,
0.05764436721801758,
-0.07092656195163727,
-0.14873234927654266,
-0.020952124148607254,
-0.004342762753367424,
-0.12152409553527832,
0.008927988819777966,
0.12761268019676208,
-0.02548777312040329,
-0.00846001598984003,
0.0077257356606423855,
0.02601812779903412,
-0.05283784866333008,
0.2353011816740036,
0.0016565026016905904,
0.06987437605857849,
-0.10312703251838684,
0.0628579631447792,
0.04570866376161575,
-0.14324112236499786,
0.04382628574967384,
0.07890422642230988,
-0.04351087659597397,
-0.0165335014462471,
0.018203480169177055,
0.10184505581855774,
0.046981051564216614,
-0.050227921456098557,
-0.10343073308467865,
-0.15217408537864685,
0.0946420356631279,
0.08448606729507446,
0.029050437733530998,
0.023906931281089783,
-0.03476373106241226,
0.04135162755846977,
-0.09546542167663574,
0.08941837400197983,
0.0862571969628334,
0.04775913432240486,
-0.1212851032614708,
0.1573667973279953,
0.01179102249443531,
0.004295989405363798,
-0.0018132033292204142,
-0.011672786436975002,
-0.08608977496623993,
0.03121870756149292,
-0.14051346480846405,
-0.03752869367599487,
-0.04971080645918846,
-0.005908761639147997,
0.011563612148165703,
-0.06624756753444672,
-0.05478903278708458,
0.020515426993370056,
-0.11609987914562225,
-0.042740851640701294,
-0.020058445632457733,
0.06676679849624634,
-0.08872486650943756,
-0.028668740764260292,
0.02624090574681759,
-0.10760301351547241,
0.09298332035541534,
0.06230948120355606,
0.015223664231598377,
0.02210862934589386,
-0.12029071897268295,
-0.011729448102414608,
0.04273577034473419,
-0.009615108370780945,
0.022023633122444153,
-0.17627958953380585,
-0.01766185648739338,
-0.017727622762322426,
0.030694367364048958,
-0.0012236658949404955,
0.033188596367836,
-0.11822492629289627,
-0.021075792610645294,
-0.04498228058218956,
-0.06149621680378914,
-0.04346834495663643,
0.04505547881126404,
0.07589282095432281,
0.03919651731848717,
0.14751258492469788,
-0.09515972435474396,
0.06792070716619492,
-0.22151845693588257,
0.006238051224499941,
-0.04017645865678787,
-0.06712911278009415,
-0.07250039279460907,
-0.038241103291511536,
0.10126789659261703,
-0.05438155308365822,
0.08218267560005188,
-0.04947631061077118,
0.06598960608243942,
0.023838529363274574,
-0.13179156184196472,
0.02689480222761631,
0.03641168400645256,
0.18471473455429077,
0.06068339943885803,
-0.03439436852931976,
0.07640185952186584,
0.013292646035552025,
0.07633385062217712,
0.19021843373775482,
0.14650112390518188,
0.15242540836334229,
0.06917832046747208,
0.09551161527633667,
0.06675165891647339,
-0.12728118896484375,
-0.1318455934524536,
0.1206170916557312,
-0.03986235707998276,
0.1338636577129364,
-0.025289423763751984,
0.21899130940437317,
0.09803251177072525,
-0.19499871134757996,
0.07343067973852158,
-0.04027433693408966,
-0.07795204967260361,
-0.10346952080726624,
-0.04870836064219475,
-0.08471447229385376,
-0.18484732508659363,
0.006968486122786999,
-0.10171770304441452,
0.06594903022050858,
0.02452097274363041,
0.04741509631276131,
0.024558749049901962,
0.10922810435295105,
0.03122139349579811,
-0.010792337357997894,
0.12285935133695602,
0.00486400444060564,
-0.014173826202750206,
-0.05884919315576553,
-0.1137654110789299,
0.07122594863176346,
-0.03627604618668556,
0.06020195782184601,
-0.037498969584703445,
-0.11551578342914581,
0.05311301350593567,
0.006165919825434685,
-0.11491803079843521,
0.021774306893348694,
-0.007707853335887194,
0.08333424478769302,
0.12094274163246155,
0.03450171276926994,
-0.0073342653922736645,
-0.011658400297164917,
0.21311050653457642,
-0.09366334229707718,
-0.05393609404563904,
-0.12553063035011292,
0.22054336965084076,
0.010853726416826248,
0.008517286740243435,
0.01060679741203785,
-0.07820378243923187,
-0.012671809643507004,
0.17041228711605072,
0.16148768365383148,
-0.022540275007486343,
-0.013880337588489056,
0.02589646726846695,
-0.010096109472215176,
-0.02612379752099514,
0.06362353265285492,
0.11795751005411148,
0.041631121188402176,
-0.046342652291059494,
-0.01777222752571106,
-0.04481128603219986,
-0.05339173227548599,
-0.012591873295605183,
0.08329319953918457,
0.02901768498122692,
-0.011650203727185726,
-0.00884155835956335,
0.11808149516582489,
-0.04418246075510979,
-0.16336408257484436,
0.03760332986712456,
-0.1905783861875534,
-0.17559660971164703,
-0.030537476763129234,
0.05380353331565857,
0.06251997500658035,
0.05851789936423302,
-0.012973204255104065,
-0.018265200778841972,
0.10669700056314468,
-0.00414351187646389,
-0.030534984543919563,
-0.1217971220612526,
0.08297962695360184,
-0.09878598898649216,
0.145777627825737,
-0.04219397157430649,
0.03812727332115173,
0.11752622574567795,
0.08761733025312424,
-0.05806521698832512,
0.053116630762815475,
0.0732305496931076,
-0.1268877238035202,
0.05323648080229759,
0.19376353919506073,
-0.04794982075691223,
0.14755725860595703,
0.05415281653404236,
-0.1066582053899765,
0.04220764338970184,
-0.09003274887800217,
-0.06857478618621826,
-0.0526590496301651,
0.008361940272152424,
-0.05130001902580261,
0.13117094337940216,
0.20469707250595093,
-0.05777629464864731,
-0.015401056036353111,
-0.04907190054655075,
0.0023475869093090296,
0.042847178876399994,
0.1336035579442978,
-0.0569467656314373,
-0.29294654726982117,
0.02402801439166069,
0.04036745801568031,
0.020479559898376465,
-0.2252899408340454,
-0.09532944858074188,
0.029191728681325912,
-0.055303312838077545,
-0.059936027973890305,
0.1124541386961937,
0.044511131942272186,
0.03480009734630585,
-0.054918862879276276,
-0.1502867043018341,
-0.013655440881848335,
0.18558946251869202,
-0.1747869998216629,
-0.05478211119771004
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# common8
This model is a fine-tuned version of [wghts/checkpoint-20000](https://huggingface.co/wghts/checkpoint-20000) on the Persian (`fa`) configuration of the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3174
- Wer: 0.3022
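As a lower-level usage sketch (not taken from the original card), the model can also be run through its CTC head directly. The repository id `ghofrani/xls-r-1b-fa-cv8` comes from this record; the audio path is a placeholder, and the snippet assumes the repository ships the usual Wav2Vec2 processor files and that the recording is 16 kHz mono.

```python
import soundfile as sf
import torch
from transformers import AutoModelForCTC, Wav2Vec2Processor

model_id = "ghofrani/xls-r-1b-fa-cv8"  # repository id from this record
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# Placeholder audio file; the model expects 16 kHz mono input.
speech, sampling_rate = sf.read("sample_fa.wav")
inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token ids.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```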
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 6
- total_train_batch_size: 192
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 250.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 3.5847 | 1.93 | 500 | 3.5104 | 1.0 |
| 2.7858 | 3.86 | 1000 | 2.9601 | 1.0001 |
| 1.6827 | 5.79 | 1500 | 0.7853 | 0.7030 |
| 1.4656 | 7.72 | 2000 | 0.6076 | 0.6014 |
| 1.3693 | 9.65 | 2500 | 0.5114 | 0.5307 |
| 1.379 | 11.58 | 3000 | 0.4666 | 0.4940 |
| 1.2832 | 13.51 | 3500 | 0.4257 | 0.4593 |
| 1.1931 | 15.44 | 4000 | 0.4039 | 0.4427 |
| 1.2911 | 17.37 | 4500 | 0.3956 | 0.4295 |
| 1.1577 | 19.3 | 5000 | 0.3705 | 0.4114 |
| 1.1135 | 21.24 | 5500 | 0.3740 | 0.4010 |
| 1.19 | 23.17 | 6000 | 0.3611 | 0.3935 |
| 1.1008 | 25.1 | 6500 | 0.3503 | 0.3880 |
| 1.0805 | 27.03 | 7000 | 0.3427 | 0.3781 |
| 1.1556 | 28.96 | 7500 | 0.3442 | 0.3727 |
| 1.0596 | 30.89 | 8000 | 0.3398 | 0.3646 |
| 1.0219 | 32.82 | 8500 | 0.3312 | 0.3660 |
| 1.1042 | 34.75 | 9000 | 0.3287 | 0.3612 |
| 1.0273 | 36.68 | 9500 | 0.3236 | 0.3556 |
| 1.0383 | 38.61 | 10000 | 0.3217 | 0.3558 |
| 1.0498 | 40.54 | 10500 | 0.3205 | 0.3520 |
| 0.9969 | 42.47 | 11000 | 0.3125 | 0.3504 |
| 1.0658 | 44.4 | 11500 | 0.3120 | 0.3493 |
| 0.992 | 46.33 | 12000 | 0.3137 | 0.3476 |
| 0.9737 | 48.26 | 12500 | 0.3085 | 0.3413 |
| 1.0817 | 50.19 | 13000 | 0.3091 | 0.3418 |
| 0.9414 | 52.12 | 13500 | 0.3072 | 0.3344 |
| 0.9295 | 54.05 | 14000 | 0.3039 | 0.3322 |
| 1.0248 | 55.98 | 14500 | 0.2991 | 0.3325 |
| 0.9474 | 57.91 | 15000 | 0.3032 | 0.3348 |
| 0.928 | 59.85 | 15500 | 0.2999 | 0.3285 |
| 1.0321 | 61.78 | 16000 | 0.2982 | 0.3253 |
| 0.9255 | 63.71 | 16500 | 0.2970 | 0.3231 |
| 0.8928 | 65.64 | 17000 | 0.2993 | 0.3250 |
| 1.008 | 67.57 | 17500 | 0.2985 | 0.3222 |
| 0.9371 | 69.5 | 18000 | 0.2968 | 0.3216 |
| 0.9077 | 71.43 | 18500 | 0.3011 | 0.3299 |
| 1.0044 | 73.36 | 19000 | 0.3053 | 0.3306 |
| 0.9625 | 75.29 | 19500 | 0.3159 | 0.3295 |
| 0.9816 | 77.22 | 20000 | 0.3080 | 0.3304 |
| 0.9587 | 119.19 | 20500 | 0.3088 | 0.3284 |
| 0.9178 | 122.09 | 21000 | 0.3132 | 0.3320 |
| 1.0282 | 125.0 | 21500 | 0.3099 | 0.3266 |
| 0.9337 | 127.9 | 22000 | 0.3110 | 0.3317 |
| 0.8822 | 130.81 | 22500 | 0.3037 | 0.3247 |
| 0.9644 | 133.72 | 23000 | 0.3037 | 0.3238 |
| 0.9214 | 136.62 | 23500 | 0.3040 | 0.3234 |
| 0.9167 | 139.53 | 24000 | 0.3079 | 0.3203 |
| 0.9047 | 142.44 | 24500 | 0.3018 | 0.3177 |
| 0.8909 | 145.35 | 25000 | 0.3053 | 0.3181 |
| 0.9646 | 148.25 | 25500 | 0.3095 | 0.3229 |
| 0.8802 | 151.16 | 26000 | 0.3111 | 0.3192 |
| 0.8411 | 154.07 | 26500 | 0.3068 | 0.3123 |
| 0.9235 | 156.97 | 27000 | 0.3090 | 0.3177 |
| 0.8943 | 159.88 | 27500 | 0.3115 | 0.3179 |
| 0.8854 | 162.79 | 28000 | 0.3052 | 0.3157 |
| 0.8734 | 165.69 | 28500 | 0.3077 | 0.3124 |
| 0.8515 | 168.6 | 29000 | 0.3117 | 0.3128 |
| 0.912 | 171.51 | 29500 | 0.3039 | 0.3121 |
| 0.8669 | 174.42 | 30000 | 0.3120 | 0.3123 |
| 0.823 | 177.32 | 30500 | 0.3148 | 0.3118 |
| 0.9129 | 180.23 | 31000 | 0.3179 | 0.3101 |
| 0.8255 | 183.14 | 31500 | 0.3164 | 0.3114 |
| 0.8948 | 186.05 | 32000 | 0.3128 | 0.3101 |
| 0.8397 | 188.95 | 32500 | 0.3143 | 0.3068 |
| 0.8341 | 191.86 | 33000 | 0.3127 | 0.3136 |
| 0.873 | 194.76 | 33500 | 0.3149 | 0.3124 |
| 0.8232 | 197.67 | 34000 | 0.3166 | 0.3086 |
| 0.8002 | 200.58 | 34500 | 0.3149 | 0.3061 |
| 0.8621 | 203.49 | 35000 | 0.3160 | 0.3093 |
| 0.8123 | 206.39 | 35500 | 0.3141 | 0.3063 |
| 0.7995 | 209.3 | 36000 | 0.3174 | 0.3075 |
| 0.8271 | 212.21 | 36500 | 0.3173 | 0.3043 |
| 0.8059 | 215.12 | 37000 | 0.3176 | 0.3079 |
| 0.8835 | 218.02 | 37500 | 0.3169 | 0.3062 |
| 0.8027 | 220.93 | 38000 | 0.3203 | 0.3098 |
| 0.775 | 223.83 | 38500 | 0.3159 | 0.3068 |
| 0.8487 | 226.74 | 39000 | 0.3161 | 0.3072 |
| 0.7929 | 229.65 | 39500 | 0.3143 | 0.3037 |
| 0.7653 | 232.56 | 40000 | 0.3160 | 0.3048 |
| 0.8211 | 235.46 | 40500 | 0.3173 | 0.3031 |
| 0.7761 | 238.37 | 41000 | 0.3176 | 0.3025 |
| 0.7761 | 241.28 | 41500 | 0.3179 | 0.3027 |
| 0.7903 | 244.19 | 42000 | 0.3181 | 0.3016 |
| 0.7807 | 247.09 | 42500 | 0.3170 | 0.3027 |
| 0.8406 | 250.0 | 43000 | 0.3174 | 0.3022 |
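The Wer column above is the word error rate between the model's transcriptions and the reference texts. As a small, hedged illustration (the card does not include the evaluation code), it can be computed with the `evaluate` library:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Toy strings standing in for predicted and reference transcriptions.
predictions = ["this is a test transcription"]
references = ["this is the test transcription"]

# Returns the word error rate as a float (here 1 substitution / 5 words = 0.2).
print(wer_metric.compute(predictions=predictions, references=references))
```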
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2
- Datasets 1.18.3.dev0
- Tokenizers 0.10.3
|
{"language": ["fa"], "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "common8", "results": []}]}
|
automatic-speech-recognition
|
ghofrani/xls-r-1b-fa-cv8
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"fa",
"dataset:common_voice",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fa"
] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us
|
common8
=======
This model is a fine-tuned version of wghts/checkpoint-20000 on the Persian (fa) configuration of the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3174
* Wer: 0.3022
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 1e-06
* train\_batch\_size: 32
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 6
* total\_train\_batch\_size: 192
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 100
* num\_epochs: 250.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2
* Datasets 1.18.3.dev0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-06\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 6\n* total\\_train\\_batch\\_size: 192\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 250.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-06\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 6\n* total\\_train\\_batch\\_size: 192\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 250.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
71,
160,
4,
36
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #fa #dataset-common_voice #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-06\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 6\n* total\\_train\\_batch\\_size: 192\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 100\n* num\\_epochs: 250.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2\n* Datasets 1.18.3.dev0\n* Tokenizers 0.10.3"
] |
[
-0.12006473541259766,
0.07056140899658203,
-0.0038534009363502264,
0.03960047662258148,
0.12477318942546844,
0.012470915913581848,
0.08343128859996796,
0.14479820430278778,
-0.09896792471408844,
0.09203992038965225,
0.08775370568037033,
0.08357212692499161,
0.07362598925828934,
0.11699023097753525,
-0.027621079236268997,
-0.29001712799072266,
0.015283568762242794,
-0.012999586760997772,
-0.09530405700206757,
0.10911772400140762,
0.09487499296665192,
-0.10868284851312637,
0.019441621378064156,
0.009200136177241802,
-0.11007357388734818,
0.007731481455266476,
-0.025550801306962967,
-0.04494044929742813,
0.11854290217161179,
0.054965849965810776,
0.063275046646595,
0.033440910279750824,
0.07066306471824646,
-0.27136608958244324,
0.013572520576417446,
0.05625856667757034,
0.045189425349235535,
0.051064178347587585,
0.09882767498493195,
-0.012690425850450993,
0.14064040780067444,
-0.07863011956214905,
0.03981243446469307,
0.05469493940472603,
-0.1017066091299057,
-0.3055418133735657,
-0.0872272402048111,
-0.007719102781265974,
0.11927584558725357,
0.09638264030218124,
-0.04426214098930359,
0.057049766182899475,
-0.08078523725271225,
0.09950006753206253,
0.2334243506193161,
-0.2317659705877304,
-0.07128329575061798,
-0.021019240841269493,
0.049299031496047974,
0.04017923027276993,
-0.11055795103311539,
-0.002529742429032922,
0.02798398770391941,
0.035572510212659836,
0.07606668770313263,
0.0037437842693179846,
-0.0037162124644964933,
-0.002261071465909481,
-0.1355675458908081,
-0.030854186043143272,
0.11295270174741745,
0.08317650109529495,
-0.02209904044866562,
-0.10702624171972275,
-0.02105632983148098,
-0.18922671675682068,
-0.048967063426971436,
-0.004957636818289757,
0.02968239039182663,
-0.02815680019557476,
-0.0874699279665947,
0.011404392309486866,
-0.06383045762777328,
-0.08540582656860352,
0.012333035469055176,
0.15482880175113678,
0.03798256814479828,
-0.04294559732079506,
0.011656945571303368,
0.08599834144115448,
0.03966158628463745,
-0.1399974822998047,
-0.003939364571124315,
0.04754491150379181,
-0.1003282368183136,
-0.015758512541651726,
-0.044862352311611176,
-0.03699108958244324,
0.031647440046072006,
0.10985144972801208,
-0.03097846917808056,
0.09632143378257751,
-0.0074366177432239056,
0.011984593234956264,
-0.07681175321340561,
0.14842762053012848,
-0.06031869351863861,
-0.08555325865745544,
-0.0499727800488472,
0.09345600008964539,
-0.0024353356566280127,
-0.016480058431625366,
-0.07440607249736786,
0.011445821262896061,
0.0979471281170845,
0.04810002073645592,
-0.02503073588013649,
0.020414073020219803,
-0.07170642912387848,
-0.022259047254920006,
-0.013291486538946629,
-0.10875947773456573,
0.04583924263715744,
0.043032143265008926,
-0.06011337414383888,
0.012462214566767216,
-0.01690700836479664,
0.02870727702975273,
-0.0138391749933362,
0.10697333514690399,
-0.04514927417039871,
0.0050108530558645725,
-0.06501396745443344,
-0.11096765100955963,
0.03204929828643799,
-0.045938629657030106,
0.0035625325981527567,
-0.05868706852197647,
-0.08039285987615585,
-0.0653546154499054,
0.04271230846643448,
-0.0641215369105339,
-0.04999283701181412,
-0.09173405915498734,
-0.06519021093845367,
0.0515599250793457,
-0.023276016116142273,
0.19130003452301025,
-0.06401631981134415,
0.10461646318435669,
0.03156917914748192,
0.04907671734690666,
0.04760333150625229,
0.07821982353925705,
-0.029747696593403816,
0.03271010145545006,
-0.14512334764003754,
0.11032790690660477,
-0.08678560703992844,
0.023378800600767136,
-0.13668999075889587,
-0.1031852588057518,
-0.026337450370192528,
0.005719286389648914,
0.10606582462787628,
0.11997216939926147,
-0.19044499099254608,
-0.10975681245326996,
0.16685831546783447,
-0.0643257275223732,
-0.06491414457559586,
0.15104366838932037,
-0.0239366814494133,
-0.02526864968240261,
0.05227109417319298,
0.18383708596229553,
0.09530223160982132,
-0.08792981505393982,
0.016916846856474876,
-0.06519918143749237,
0.11916791647672653,
0.057720039039850235,
0.08881731331348419,
-0.054072305560112,
0.022273756563663483,
-0.004760148003697395,
0.0019163729157298803,
0.07897277176380157,
-0.09509417414665222,
-0.07158538699150085,
-0.013499380089342594,
-0.07894747704267502,
0.029773296788334846,
0.053422506898641586,
0.0180581696331501,
-0.11006269603967667,
-0.11129558086395264,
-0.0011761776404455304,
0.1022455021739006,
-0.1018572673201561,
0.039363760501146317,
-0.07490461319684982,
0.03701217472553253,
-0.015115524642169476,
-0.005382100120186806,
-0.16693049669265747,
0.026959557086229324,
0.03690914809703827,
-0.0182639230042696,
0.028592564165592194,
-0.004630097653716803,
0.07863937318325043,
0.03984113782644272,
-0.06433167308568954,
-0.0519668273627758,
-0.03903863579034805,
-0.003760438645258546,
-0.07936235517263412,
-0.24834735691547394,
-0.05845560505986214,
-0.03251652419567108,
0.16368398070335388,
-0.1974801868200302,
-0.0037297408562153578,
0.041158318519592285,
0.13098984956741333,
0.03299087658524513,
-0.049056846648454666,
0.007664903998374939,
0.0861053615808487,
-0.01573054865002632,
-0.05907135456800461,
0.03708508610725403,
-0.0034455894492566586,
-0.13984020054340363,
0.007335000205785036,
-0.12609541416168213,
0.0570790097117424,
0.09720224142074585,
-0.015584919601678848,
-0.08680860698223114,
-0.06442908197641373,
-0.05435949191451073,
-0.054606031626462936,
-0.020469708368182182,
-0.00300832511857152,
0.19556386768817902,
0.017943453043699265,
0.10697107017040253,
-0.0687992125749588,
-0.027188725769519806,
0.039815254509449005,
0.021137414500117302,
-0.013510946184396744,
0.1458357870578766,
0.05532947927713394,
-0.07023567706346512,
0.08886528760194778,
0.06886391341686249,
-0.046959031373262405,
0.16864840686321259,
-0.061415981501340866,
-0.09470058977603912,
-0.03131260722875595,
0.02307223342359066,
0.010882883332669735,
0.0879100039601326,
-0.19511336088180542,
-0.016751211136579514,
0.017367461696267128,
0.03636618331074715,
0.02364734746515751,
-0.19176355004310608,
-0.0020570619963109493,
0.05248798802495003,
-0.08131010830402374,
-0.0008038489613682032,
0.0046918559819459915,
-0.003911178093403578,
0.09051775187253952,
0.014829198829829693,
-0.07507289946079254,
-0.01705022156238556,
-0.03676927462220192,
-0.0862576961517334,
0.16315661370754242,
-0.10854694992303848,
-0.14383669197559357,
-0.09937898814678192,
-0.05613631010055542,
-0.024012146517634392,
-0.0056496611796319485,
0.06271553039550781,
-0.11928050965070724,
-0.020384952425956726,
-0.05447722598910332,
0.05108954384922981,
-0.06912275403738022,
0.03707606717944145,
0.006962013430893421,
-0.003792030503973365,
0.056148964911699295,
-0.1025114357471466,
0.014602766372263432,
-0.024642331525683403,
-0.0233151875436306,
0.0035187238827347755,
0.020909661427140236,
0.09046574681997299,
0.1687418818473816,
0.04972140118479729,
0.023674586787819862,
-0.05080713331699371,
0.1778106689453125,
-0.12901073694229126,
-0.018812259659171104,
0.11601696163415909,
0.007753911893814802,
0.02716987207531929,
0.15055975317955017,
0.04585178568959236,
-0.0922093540430069,
0.017944445833563805,
0.04254227876663208,
-0.01886052079498768,
-0.23646433651447296,
-0.039553627371788025,
-0.07936298102140427,
-0.030661379918456078,
0.07342544943094254,
0.028623221442103386,
-0.008473342284560204,
0.010997493751347065,
-0.02942565642297268,
-0.00858265906572342,
0.015657708048820496,
0.04523675516247749,
0.09152092784643173,
0.033154167234897614,
0.1213579848408699,
-0.01172990445047617,
-0.027378246188163757,
0.03626733645796776,
-0.017386026680469513,
0.2330707162618637,
0.02814418263733387,
0.16788624227046967,
0.05288761854171753,
0.14710888266563416,
-0.007708702236413956,
0.028701310977339745,
0.017063241451978683,
-0.020285191014409065,
0.0052397847175598145,
-0.051952846348285675,
-0.03818679228425026,
0.0308306273072958,
0.12313014268875122,
0.02721923403441906,
-0.1317773163318634,
-0.020456664264202118,
0.02367423288524151,
0.3519138693809509,
0.08419457823038101,
-0.2569446563720703,
-0.08382326364517212,
0.009805076755583286,
-0.07705672830343246,
-0.031852178275585175,
0.02444290742278099,
0.12315518409013748,
-0.0851343423128128,
0.08046074956655502,
-0.04664324223995209,
0.08851803839206696,
-0.06319728493690491,
0.013368559069931507,
0.051748666912317276,
0.08483105897903442,
-0.0021792377810925245,
0.05264542996883392,
-0.27606120705604553,
0.29088926315307617,
0.0002981515135616064,
0.07989509403705597,
-0.04849947243928909,
0.03482111915946007,
0.034889861941337585,
-0.02399480529129505,
0.06166214123368263,
-0.012367293238639832,
-0.11445540934801102,
-0.16634966433048248,
-0.06750112771987915,
0.018380438908934593,
0.14268210530281067,
-0.05780661106109619,
0.10922561585903168,
-0.026970289647579193,
-0.017406385391950607,
0.054873108863830566,
-0.03071870282292366,
-0.12024790793657303,
-0.10099202394485474,
0.02025654725730419,
0.06929348409175873,
0.06822708994150162,
-0.07381870597600937,
-0.10252302139997482,
-0.08380940556526184,
0.17956414818763733,
-0.07715887576341629,
-0.009849268943071365,
-0.12033586949110031,
0.08976416289806366,
0.1500752568244934,
-0.06547069549560547,
0.03832665830850601,
0.020216049626469612,
0.10359630733728409,
0.016535773873329163,
-0.005750719923526049,
0.12408263981342316,
-0.07356077432632446,
-0.20205767452716827,
-0.0633913055062294,
0.18603608012199402,
0.04943238943815231,
0.07906892150640488,
-0.02720792591571808,
0.026466693729162216,
0.0016775024123489857,
-0.06229306384921074,
0.08786889910697937,
0.03837188333272934,
0.014916042797267437,
0.074225515127182,
-0.024334294721484184,
-0.019447794184088707,
-0.08506456762552261,
-0.08762358874082565,
0.15028664469718933,
0.29457584023475647,
-0.08023828268051147,
0.052322011440992355,
0.051595281809568405,
-0.05031578615307808,
-0.13727828860282898,
0.02087368629872799,
0.130527526140213,
0.05364363268017769,
0.012324660085141659,
-0.22104273736476898,
0.00024093379033729434,
0.08306723833084106,
-0.01662611775100231,
0.084629125893116,
-0.3390688896179199,
-0.13452553749084473,
0.09005841612815857,
0.07043693959712982,
0.008802632801234722,
-0.15141379833221436,
-0.06180040165781975,
-0.021348655223846436,
-0.0829727053642273,
0.025189323350787163,
-0.027990249916911125,
0.1482209414243698,
0.016064686700701714,
0.037370748817920685,
0.026178553700447083,
-0.03636348992586136,
0.14091487228870392,
-0.008178718388080597,
0.047000452876091,
-0.010877327062189579,
0.029483256861567497,
-0.0496826209127903,
-0.04310355708003044,
-0.01649751514196396,
-0.08705122768878937,
0.014660000801086426,
-0.11429178714752197,
-0.03821989521384239,
-0.09315429627895355,
0.020436957478523254,
-0.03049909509718418,
-0.04117295518517494,
-0.034269507974386215,
0.03785539045929909,
0.06908826529979706,
0.00933715607970953,
0.10401929914951324,
-0.08864572644233704,
0.1610548496246338,
0.11014547944068909,
0.11576009541749954,
-0.016861384734511375,
-0.06557168811559677,
-0.008497538045048714,
-0.01602909527719021,
0.04657069221138954,
-0.09754401445388794,
0.03267662972211838,
0.13741344213485718,
0.041456438601017,
0.15774723887443542,
0.049748703837394714,
-0.08003216981887817,
0.03135570138692856,
0.05535150691866875,
-0.06888642907142639,
-0.1533454954624176,
-0.01997956819832325,
0.007475937716662884,
-0.12398126721382141,
0.010503076948225498,
0.12378209084272385,
-0.02832649275660515,
-0.008351561613380909,
0.008031509816646576,
0.025322888046503067,
-0.05231500416994095,
0.23516613245010376,
0.0012950361706316471,
0.07020226866006851,
-0.10209985077381134,
0.0636623203754425,
0.04604122042655945,
-0.146060049533844,
0.044491495937108994,
0.08155469596385956,
-0.041281234472990036,
-0.01593473181128502,
0.01984294503927231,
0.0961473360657692,
0.04556736722588539,
-0.045991212129592896,
-0.10106395184993744,
-0.15288765728473663,
0.09398139268159866,
0.08555400371551514,
0.027941690757870674,
0.026002557948231697,
-0.035765405744314194,
0.04110823944211006,
-0.09795835614204407,
0.08647090196609497,
0.08703751116991043,
0.04939589649438858,
-0.11866690218448639,
0.1563095897436142,
0.01139732263982296,
0.0050835381262004375,
-0.0007233110954985023,
-0.012524559162557125,
-0.08368029445409775,
0.032676782459020615,
-0.13283321261405945,
-0.03859003260731697,
-0.05026169866323471,
-0.008760461583733559,
0.01166914589703083,
-0.0655902549624443,
-0.05435369908809662,
0.01962653361260891,
-0.11596829444169998,
-0.04485228657722473,
-0.02191060222685337,
0.06591824442148209,
-0.08884410560131073,
-0.028745178133249283,
0.027712687849998474,
-0.10582539439201355,
0.08961578458547592,
0.06212739646434784,
0.016444338485598564,
0.02240704745054245,
-0.1225365698337555,
-0.010053513571619987,
0.04338424280285835,
-0.009847728535532951,
0.023136937990784645,
-0.17472702264785767,
-0.019299481064081192,
-0.01797613315284252,
0.032687388360500336,
-0.002833697712048888,
0.03012724220752716,
-0.1194516271352768,
-0.024844441562891006,
-0.0478891022503376,
-0.06588637828826904,
-0.043088678270578384,
0.045437417924404144,
0.07838110625743866,
0.03901040181517601,
0.14750349521636963,
-0.09480989724397659,
0.07013945281505585,
-0.22055983543395996,
0.006794282700866461,
-0.03952232003211975,
-0.06602496653795242,
-0.07325677573680878,
-0.03672272711992264,
0.10079400986433029,
-0.05477820336818695,
0.0775018036365509,
-0.04976936802268028,
0.06523238122463226,
0.02426920458674431,
-0.1326419711112976,
0.027565620839595795,
0.037068337202072144,
0.18566371500492096,
0.06314399093389511,
-0.035117775201797485,
0.07852967083454132,
0.012126846238970757,
0.07313340157270432,
0.19182653725147247,
0.1492755264043808,
0.15574008226394653,
0.06951360404491425,
0.09628041088581085,
0.06705790758132935,
-0.1284315437078476,
-0.13250310719013214,
0.12356072664260864,
-0.038657136261463165,
0.13290594518184662,
-0.022886918857693672,
0.2242698073387146,
0.0951906219124794,
-0.19361895322799683,
0.07330478727817535,
-0.041060324758291245,
-0.07884014397859573,
-0.10254529118537903,
-0.046890292316675186,
-0.08586923032999039,
-0.18454506993293762,
0.0068768952041864395,
-0.10062503814697266,
0.0678332969546318,
0.027499627321958542,
0.0464823953807354,
0.02476361021399498,
0.11092115193605423,
0.036966439336538315,
-0.011025712825357914,
0.12589164078235626,
0.006269009318202734,
-0.014331384561955929,
-0.06129312887787819,
-0.11206666380167007,
0.07269969582557678,
-0.03675748035311699,
0.060482632368803024,
-0.03752083703875542,
-0.11448163539171219,
0.05190768837928772,
0.004391441587358713,
-0.11509230732917786,
0.0212495606392622,
-0.008184878155589104,
0.08319389075040817,
0.11821375042200089,
0.03550901263952255,
-0.007194915320724249,
-0.01119480561465025,
0.21271836757659912,
-0.09592965245246887,
-0.0504767931997776,
-0.12126629054546356,
0.22698873281478882,
0.009783372282981873,
0.009019899182021618,
0.009774073958396912,
-0.07898728549480438,
-0.01292935386300087,
0.16486278176307678,
0.15534710884094238,
-0.02292823977768421,
-0.014497801661491394,
0.024668801575899124,
-0.009831984527409077,
-0.027849014848470688,
0.06307557970285416,
0.11873625218868256,
0.043255992233753204,
-0.048715777695178986,
-0.018796540796756744,
-0.04588627442717552,
-0.0531371645629406,
-0.012051603756844997,
0.08169089257717133,
0.030306214466691017,
-0.010350493714213371,
-0.01215851865708828,
0.11812851577997208,
-0.04467497766017914,
-0.15530997514724731,
0.03660974279046059,
-0.19206666946411133,
-0.1756841093301773,
-0.030660530552268028,
0.05161780118942261,
0.06218264624476433,
0.05988757684826851,
-0.01391562633216381,
-0.016815869137644768,
0.10370044410228729,
-0.003068826859816909,
-0.033847056329250336,
-0.12185702472925186,
0.08106908202171326,
-0.09966089576482773,
0.1490669846534729,
-0.04242503643035889,
0.03673400729894638,
0.1162695586681366,
0.08884644508361816,
-0.059201277792453766,
0.055672600865364075,
0.07361588627099991,
-0.1254614293575287,
0.05336354300379753,
0.1950218379497528,
-0.046990759670734406,
0.1498953402042389,
0.055233024060726166,
-0.11012991517782211,
0.043201617896556854,
-0.09117972105741501,
-0.06542044878005981,
-0.05543564260005951,
0.009942117147147655,
-0.05119280889630318,
0.13244253396987915,
0.20773516595363617,
-0.057282548397779465,
-0.01534624770283699,
-0.04967222362756729,
-0.00025618026847951114,
0.03996579349040985,
0.13934864103794098,
-0.05527901649475098,
-0.29071947932243347,
0.02304496243596077,
0.04370146244764328,
0.019042976200580597,
-0.22983261942863464,
-0.09561406075954437,
0.030265944078564644,
-0.058651503175497055,
-0.055989522486925125,
0.11082734912633896,
0.041166823357343674,
0.03486429527401924,
-0.056324269622564316,
-0.15256355702877045,
-0.011474045924842358,
0.18677105009555817,
-0.17279940843582153,
-0.0552871972322464
] |
null | null |
transformers
|
# Bangla-GPT2
### A GPT-2 Model for the Bengali Language
* Dataset- mc4 Bengali
* Training time- ~40 hours
* Written in- JAX
If you use this model, please cite:
```
@misc{bangla-gpt2,
author = {Ritobrata Ghosh},
year = {2016},
title = {Bangla GPT-2},
publisher = {Hugging Face}
}
```
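A minimal usage sketch, assuming the standard `transformers` text-generation pipeline (the prompt below is the widget example; the generation settings are illustrative placeholders, not tuned values):
```
from transformers import pipeline

# load the Bengali GPT-2 checkpoint as a text-generation pipeline
generator = pipeline("text-generation", model="ritog/bangla-gpt2")

# widget example prompt; sampling settings are illustrative placeholders
prompt = "আজ একটি সুন্দর দিন এবং আমি"
outputs = generator(prompt, max_length=50, do_sample=True, top_k=50)
print(outputs[0]["generated_text"])
```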
|
{"language": "bn", "tags": ["text-generation"], "widget": [{"text": "\u0986\u099c \u098f\u0995\u099f\u09bf \u09b8\u09c1\u09a8\u09cd\u09a6\u09b0 \u09a6\u09bf\u09a8 \u098f\u09ac\u0982 \u0986\u09ae\u09bf"}]}
|
text-generation
|
ritog/bangla-gpt2
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"bn",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"bn"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #bn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Bangla-GPT2
### A GPT-2 Model for the Bengali Language
* Dataset- mc4 Bengali
* Training time- ~40 hours
* Written in- JAX
If you use this model, please cite:
|
[
"# Bangla-GPT2",
"### A GPT-2 Model for the Bengali Language\n\n* Dataset- mc4 Bengali\n* Training time- ~40 hours\n* Written in- JAX\n\nIf you use this model, please cite:"
] |
[
"TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #bn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Bangla-GPT2",
"### A GPT-2 Model for the Bengali Language\n\n* Dataset- mc4 Bengali\n* Training time- ~40 hours\n* Written in- JAX\n\nIf you use this model, please cite:"
] |
[
52,
6,
41
] |
[
"passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #bn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Bangla-GPT2### A GPT-2 Model for the Bengali Language\n\n* Dataset- mc4 Bengali\n* Training time- ~40 hours\n* Written in- JAX\n\nIf you use this model, please cite:"
] |
[
-0.10073775053024292,
0.04975242167711258,
-0.00014848654973320663,
0.03923024982213974,
0.06200550124049187,
0.006918168626725674,
0.06193838268518448,
0.08054690062999725,
-0.05107059329748154,
-0.014580450020730495,
0.18984077870845795,
0.06912580877542496,
0.01407772209495306,
0.151948943734169,
-0.010158717632293701,
-0.32482367753982544,
0.030398622155189514,
0.07938738167285919,
-0.09295716136693954,
0.18987210094928741,
0.1366792619228363,
-0.02539004385471344,
0.046305250376462936,
-0.006090396549552679,
-0.20005300641059875,
-0.014443032443523407,
-0.0006154652801342309,
-0.1238866075873375,
0.06217285618185997,
0.03019462339580059,
0.06153552606701851,
0.049861010164022446,
0.02242949604988098,
-0.0444195456802845,
0.022130317986011505,
-0.021482190117239952,
0.014803588390350342,
0.03638169914484024,
-0.024788638576865196,
0.037062712013721466,
0.14612317085266113,
0.026651481166481972,
-0.005098577588796616,
0.019709883257746696,
-0.1533438265323639,
-0.09350045770406723,
0.005690908990800381,
0.03414740785956383,
0.13237391412258148,
0.062146734446287155,
-0.05223502963781357,
0.012242870405316353,
-0.2020883411169052,
0.047070592641830444,
0.11269056051969528,
-0.24664074182510376,
-0.07115743309259415,
0.10065330564975739,
0.13369391858577728,
0.11596217751502991,
-0.07618789374828339,
0.03194967284798622,
0.09125089645385742,
0.002775067463517189,
0.12554898858070374,
-0.09336652606725693,
0.026395507156848907,
0.09368442744016647,
-0.11440537869930267,
0.03396350145339966,
0.19316716492176056,
-0.04912307858467102,
0.0010134951444342732,
-0.06291770190000534,
-0.019724687561392784,
-0.16871421039104462,
-0.060865096747875214,
-0.04995401203632355,
-0.04971962794661522,
0.05034752935171127,
0.004388665780425072,
-0.03607894852757454,
-0.07828201353549957,
-0.11237041652202606,
-0.027689732611179352,
0.045419495552778244,
0.03599049523472786,
-0.009396105073392391,
-0.1319132149219513,
0.10544139891862869,
-0.03429045528173447,
-0.009199156425893307,
-0.00010376350837759674,
-0.03757133334875107,
-0.012546549551188946,
0.05007801204919815,
-0.007813491858541965,
0.03678206354379654,
-0.07192594558000565,
0.09537824988365173,
-0.06142390891909599,
0.009416087530553341,
0.11757554113864899,
0.0335497185587883,
-0.003505768720060587,
0.031001603230834007,
-0.11776785552501678,
-0.11572640389204025,
-0.004400172270834446,
0.0018637259490787983,
-0.07670240104198456,
-0.04125184938311577,
-0.0718395933508873,
-0.06552485376596451,
0.05464595556259155,
0.03608715161681175,
-0.16598933935165405,
0.05896082893013954,
-0.003627385478466749,
-0.032079391181468964,
0.07833144068717957,
-0.0792861059308052,
-0.05272410437464714,
-0.047656185925006866,
-0.08721405267715454,
-0.09711885452270508,
0.01857224479317665,
-0.004099578596651554,
-0.07511977106332779,
-0.12473989278078079,
-0.08663763105869293,
-0.04539039731025696,
-0.12812288105487823,
-0.06201641261577606,
-0.047207336872816086,
-0.03833114728331566,
-0.025022165849804878,
-0.03928745537996292,
-0.22310499846935272,
0.042447496205568314,
0.06993051618337631,
-0.1288364678621292,
-0.10499301552772522,
-0.12800012528896332,
-0.017620015889406204,
0.06690498441457748,
-0.11520927399396896,
0.1784089356660843,
-0.04226380214095116,
0.09204369783401489,
0.05981707200407982,
0.15995949506759644,
0.017797384411096573,
0.07281741499900818,
0.012297012843191624,
0.03575548529624939,
-0.05475655570626259,
0.12226568162441254,
-0.032076649367809296,
-0.019127145409584045,
-0.07845772057771683,
-0.08185218274593353,
-0.05402212589979172,
0.11468162387609482,
0.019371049478650093,
0.10802662372589111,
-0.06721734255552292,
-0.05674039572477341,
0.30659475922584534,
0.021866239607334137,
-0.11064685881137848,
0.10316401720046997,
-0.041440777480602264,
0.2594709098339081,
0.07027704268693924,
0.13776592910289764,
-0.006387222092598677,
0.018426205962896347,
0.16275787353515625,
0.0667203813791275,
-0.02283034846186638,
-0.09029126912355423,
0.058682385832071304,
0.05237621068954468,
-0.06267596036195755,
0.05249127373099327,
0.056122418493032455,
0.07300777733325958,
-0.09406683593988419,
-0.0403832271695137,
0.04316513612866402,
-0.05082414671778679,
0.051853615790605545,
-0.025464210659265518,
0.13043048977851868,
-0.08976489305496216,
-0.0837169662117958,
0.09279783070087433,
0.08415486663579941,
-0.041145630180835724,
0.02355462685227394,
-0.08943313360214233,
-0.060649316757917404,
-0.07530749589204788,
0.01190632302314043,
-0.09008954465389252,
0.08161871880292892,
-0.014114195480942726,
0.16203323006629944,
0.030820904299616814,
0.1542241871356964,
0.06989647448062897,
-0.014847032725811005,
-0.04590069502592087,
0.06123645603656769,
0.11560198664665222,
-0.054391492158174515,
-0.16054658591747284,
-0.032820284366607666,
0.05126390606164932,
-0.037322964519262314,
0.12955516576766968,
-0.2143242061138153,
0.06920105963945389,
-0.02113272249698639,
-0.001322101685218513,
-0.02909674122929573,
0.014896301552653313,
0.04705419763922691,
-0.007300036959350109,
-0.0024899214040488005,
-0.030258361250162125,
0.05026761814951897,
-0.009700574912130833,
-0.034128397703170776,
0.2875348925590515,
-0.09374349564313889,
0.14760172367095947,
0.07498905807733536,
-0.03637756407260895,
-0.014559510163962841,
-0.03971530497074127,
0.0053086127154529095,
-0.07898804545402527,
-0.017429452389478683,
0.048547182232141495,
0.19338932633399963,
0.028928298503160477,
0.20405380427837372,
-0.10651689767837524,
-0.03955867886543274,
0.009419001638889313,
-0.08386220037937164,
0.07920902967453003,
0.09888973087072372,
0.0663718655705452,
-0.17134959995746613,
0.12417235225439072,
-0.014235316775739193,
0.051287271082401276,
0.23255987465381622,
0.07388757914304733,
-0.07870800048112869,
-0.06105223670601845,
-0.00031733812647871673,
-0.09687662869691849,
0.06068176403641701,
-0.18202774226665497,
-0.07022684812545776,
0.0559815987944603,
0.04776822030544281,
0.06460978835821152,
-0.14663411676883698,
-0.08903335779905319,
-0.02076607570052147,
-0.06312578916549683,
-0.0023912766482681036,
0.12505510449409485,
-0.08252351731061935,
0.07435564696788788,
0.011187007650732994,
-0.02538801170885563,
0.037078529596328735,
0.031811341643333435,
-0.08921562135219574,
0.19811585545539856,
-0.06929830461740494,
-0.2691003382205963,
-0.05463424324989319,
-0.10765060037374496,
0.07009550929069519,
-0.005785337649285793,
0.04265303164720535,
-0.18133847415447235,
-0.024193018674850464,
0.0033463495783507824,
0.069574274122715,
-0.058600738644599915,
0.000124047786812298,
-0.12040779739618301,
-0.023876624181866646,
-0.10863114148378372,
-0.08899372071027756,
-0.011488193646073341,
-0.027537450194358826,
-0.04104658216238022,
0.08263552188873291,
-0.2093115597963333,
-0.07922474294900894,
0.18490703403949738,
-0.026786135509610176,
0.054222285747528076,
-0.007703444454818964,
0.14792418479919434,
-0.09250086545944214,
0.05934828892350197,
0.0506017804145813,
0.1243087649345398,
0.019334925338625908,
0.06462389975786209,
-0.01688523218035698,
-0.10752560198307037,
0.0036909026093780994,
0.06289631128311157,
-0.04159212484955788,
-0.21450960636138916,
-0.04257231578230858,
-0.07633208483457565,
0.04504053667187691,
0.07872479408979416,
0.008558705449104309,
0.029720637947320938,
0.07720836997032166,
0.018869200721383095,
0.12098947912454605,
0.06026232987642288,
0.022018183022737503,
-0.0450800396502018,
-0.0755455419421196,
0.10602705180644989,
-0.027526825666427612,
-0.024388276040554047,
0.07624053210020065,
0.09462012350559235,
0.19445201754570007,
0.009885871782898903,
0.0722401812672615,
0.06307198107242584,
0.1302708238363266,
0.09403684735298157,
-0.005352560896426439,
-0.06564002484083176,
-0.05667591094970703,
-0.048624925315380096,
0.019462771713733673,
-0.09070315957069397,
0.08085989207029343,
0.016058314591646194,
-0.08748979866504669,
-0.041034795343875885,
-0.015586597844958305,
0.008725116960704327,
0.10196351259946823,
0.10427683591842651,
-0.2087884396314621,
-0.09927959740161896,
0.0379309244453907,
-0.05222048982977867,
-0.1322728544473648,
0.1285720020532608,
0.05468462035059929,
-0.05905153602361679,
-0.06494878232479095,
-0.07525113224983215,
0.12142100185155869,
0.00006537343870149925,
-0.005711930803954601,
-0.08540774136781693,
-0.02004851959645748,
-0.04284392669796944,
0.10954533517360687,
-0.24355001747608185,
0.2718430757522583,
0.016451602801680565,
0.012687698006629944,
-0.040287408977746964,
-0.050459619611501694,
-0.023551419377326965,
0.08406605571508408,
0.04588378593325615,
0.017206497490406036,
-0.009336805902421474,
-0.05705360695719719,
-0.09129168093204498,
-0.010548164136707783,
0.012329166755080223,
0.016521673649549484,
0.031715113669633865,
0.005849805660545826,
0.07266438752412796,
0.011983868665993214,
-0.1372549831867218,
-0.0554497092962265,
-0.05301152542233467,
0.04298791289329529,
-0.005551962181925774,
-0.006867206655442715,
-0.07604551315307617,
-0.04894494637846947,
0.04493439570069313,
0.22577007114887238,
-0.00009195535676553845,
-0.09963245689868927,
-0.042170070111751556,
0.0412151925265789,
0.07700781524181366,
-0.06100902333855629,
0.04802186042070389,
0.04196738079190254,
0.00781523808836937,
0.01551219541579485,
0.060014884918928146,
0.1322796642780304,
0.00119979795999825,
0.007155663333833218,
-0.03733168542385101,
0.1018449068069458,
0.08269763737916946,
0.019478535279631615,
-0.012423822656273842,
-0.05036089941859245,
-0.09105385839939117,
-0.05993231013417244,
-0.04126076027750969,
-0.06922922283411026,
0.03585856407880783,
-0.02525201253592968,
-0.1266656070947647,
0.11168135702610016,
0.010003456845879555,
-0.15312717854976654,
0.1829691082239151,
0.15643636882305145,
-0.008521669544279575,
0.1809668093919754,
0.18353065848350525,
-0.02049851603806019,
-0.22004397213459015,
-0.06546668708324432,
-0.0005959677509963512,
0.11680259555578232,
-0.10506143420934677,
-0.16064804792404175,
-0.05245586484670639,
0.06738182157278061,
-0.0017777412431314588,
-0.05103236064314842,
-0.2871567904949188,
-0.11645694822072983,
0.11616776883602142,
0.052064571529626846,
0.25908470153808594,
-0.10850778967142105,
-0.008155356161296368,
-0.03785914182662964,
-0.08197836577892303,
0.08291619271039963,
-0.16557782888412476,
0.10122603178024292,
-0.047002241015434265,
0.19613616168498993,
0.0059779053553938866,
-0.05820342153310776,
0.11388304829597473,
0.002799707930535078,
-0.013190566562116146,
-0.08977613598108292,
-0.07895467430353165,
0.11856191605329514,
-0.03252342715859413,
0.07910247892141342,
-0.0950683057308197,
0.012604836374521255,
-0.24504244327545166,
-0.07302124053239822,
-0.041761673986911774,
-0.0981738492846489,
-0.015345870517194271,
-0.08684264868497849,
-0.03600100055336952,
0.09645118564367294,
0.008307491429150105,
-0.01833241805434227,
-0.036096155643463135,
-0.07971741259098053,
-0.00880725309252739,
0.020632708445191383,
0.13195039331912994,
-0.10327766090631485,
0.029512133449316025,
-0.06015396490693092,
0.02410268224775791,
0.08706508576869965,
-0.3469185531139374,
-0.040140047669410706,
0.011445019394159317,
-0.02752845734357834,
0.14426735043525696,
-0.004529304336756468,
0.009981906972825527,
0.0559764988720417,
0.1032174602150917,
-0.0004971915623173118,
0.0052138566970825195,
-0.0254853293299675,
-0.07042764872312546,
0.010634143836796284,
-0.0449480302631855,
0.06050390750169754,
-0.07629089057445526,
-0.04706982150673866,
-0.01603367179632187,
0.0474366620182991,
-0.13108156621456146,
0.19875791668891907,
0.05913424864411354,
0.06999366730451584,
-0.053641919046640396,
0.04483022913336754,
0.03520209342241287,
-0.025448178872466087,
0.04054795950651169,
0.1717255562543869,
-0.15534865856170654,
-0.09174463152885437,
0.015057869255542755,
0.04906146600842476,
-0.0750478208065033,
0.033597346395254135,
-0.0024454735685139894,
-0.04975711926817894,
0.09007028490304947,
-0.016093173995614052,
-0.015313875861465931,
-0.0005513638607226312,
-0.06249307096004486,
0.01381170004606247,
-0.05810363218188286,
0.051700640469789505,
0.11677522212266922,
-0.013666564598679543,
0.029023408889770508,
0.12300138920545578,
0.06756224483251572,
0.10475088655948639,
-0.07302838563919067,
-0.025294682011008263,
-0.12827226519584656,
0.08699001371860504,
-0.19012276828289032,
-0.03647838532924652,
-0.03920132294297218,
-0.041847798973321915,
-0.07832969725131989,
-0.01287052407860756,
-0.09355849772691727,
-0.04863710328936577,
-0.09043393284082413,
0.041318025439977646,
-0.045041684061288834,
0.060521200299263,
0.049591097980737686,
-0.0679263323545456,
0.07896220684051514,
0.0025883777998387814,
0.025571877136826515,
0.07170884311199188,
-0.049469005316495895,
0.07335647195577621,
0.07115073502063751,
0.0619850680232048,
-0.01871771737933159,
0.001569158397614956,
0.04526251554489136,
-0.06593257933855057,
0.01755853369832039,
0.01647358387708664,
0.11862028390169144,
0.0561436265707016,
0.028118252754211426,
-0.05386174097657204,
-0.0499182865023613,
-0.16201196610927582,
-0.0226543340831995,
-0.09312509745359421,
0.09440528601408005,
-0.04882286116480827,
0.0725497230887413,
0.020270997658371925,
-0.05670006945729256,
-0.02119111269712448,
-0.11037750542163849,
0.0037363674491643906,
-0.01374129205942154,
-0.10397384315729141,
0.040579210966825485,
-0.10815102607011795,
0.04275216907262802,
-0.07473917305469513,
0.29495471715927124,
0.026719138026237488,
-0.136147141456604,
0.001939041307196021,
-0.16427209973335266,
0.017303895205259323,
-0.02142929844558239,
0.286801815032959,
0.10896877944469452,
0.0238180048763752,
-0.04037622734904289,
0.1254802644252777,
0.006397258024662733,
0.15697525441646576,
-0.01509840041399002,
0.04858890548348427,
0.06491660326719284,
0.11842360347509384,
-0.12403451651334763,
-0.0860031321644783,
-0.023158028721809387,
0.13157600164413452,
-0.09019671380519867,
0.003990123979747295,
-0.03822580352425575,
0.1733875721693039,
0.2823697030544281,
-0.13332499563694,
0.05111463740468025,
0.020740969106554985,
-0.11493929475545883,
-0.11599413305521011,
-0.09257451444864273,
-0.07592997699975967,
-0.11219087243080139,
0.012370256707072258,
-0.13631701469421387,
0.030243031680583954,
0.006207731552422047,
0.10225830972194672,
0.00013955931353848428,
0.15471890568733215,
0.08767180144786835,
-0.10494302213191986,
0.08222305029630661,
-0.01881672628223896,
0.05474419519305229,
-0.13327454030513763,
-0.01092476211488247,
-0.01822555623948574,
-0.00549166789278388,
0.04642424359917641,
-0.019001955166459084,
-0.0662909597158432,
0.02040962688624859,
-0.006862854119390249,
-0.03328361362218857,
-0.08426622301340103,
0.07020029425621033,
0.0791010856628418,
0.04298460856080055,
0.041926223784685135,
-0.08825989067554474,
-0.0008145496249198914,
0.138058602809906,
-0.014451667666435242,
-0.1176970973610878,
-0.08980906009674072,
0.20532386004924774,
0.07471127808094025,
0.06285454332828522,
-0.0012754613999277353,
0.07603326439857483,
-0.050936922430992126,
0.2553631365299225,
0.23367975652217865,
-0.12553657591342926,
-0.02801644057035446,
0.021187204867601395,
-0.0021515407133847475,
-0.05029302090406418,
0.17166563868522644,
0.041465744376182556,
0.16389715671539307,
-0.07755663245916367,
-0.11032544076442719,
-0.013005395419895649,
-0.06768495589494705,
-0.09368947148323059,
0.1420275866985321,
0.05943204462528229,
-0.0012475252151489258,
-0.0164327509701252,
0.06857772916555405,
-0.04186025261878967,
0.04581533744931221,
-0.13145829737186432,
-0.004034670069813728,
-0.12677167356014252,
0.0013359019067138433,
-0.08428236097097397,
-0.009409102611243725,
0.0560990609228611,
0.0003224446263629943,
0.017360711470246315,
0.057935021817684174,
0.02879485674202442,
-0.12968093156814575,
-0.035525839775800705,
0.15407367050647736,
0.14628514647483826,
0.08359582722187042,
-0.03965899348258972,
0.10161168873310089,
0.08310001343488693,
-0.006930455099791288,
0.03564847260713577,
0.1031232550740242,
0.005372201558202505,
0.10675041377544403,
-0.006755469832569361,
0.06401588767766953,
-0.03338472917675972,
-0.08792112022638321,
0.08167070150375366,
-0.010323048569262028,
0.020250296220183372,
-0.10378792881965637,
-0.028970465064048767,
-0.0880887359380722,
0.013742578215897083,
-0.02203640714287758,
0.09625642746686935,
0.10721126198768616,
-0.034965552389621735,
-0.04313969239592552,
-0.11734211444854736,
0.05343310907483101,
-0.024245336651802063,
0.04737602919340134,
-0.03793395310640335,
-0.2024908810853958,
-0.07856030762195587,
0.1080411747097969,
0.020871317014098167,
-0.3299965560436249,
0.01941400021314621,
-0.05248206853866577,
0.009955449029803276,
-0.006396995857357979,
0.21720491349697113,
0.031946711242198944,
0.07878469675779343,
-0.025556553155183792,
-0.13269753754138947,
-0.053418878465890884,
0.0876322090625763,
-0.17126847803592682,
-0.1080494225025177
] |
null | null |
transformers
|
# Robi Kobi
### Created by [Ritobrata Ghosh](https://ghosh-r.github.io)
A model that writes Bengali poems in the style of Nobel Laureate poet Rabindranath Tagore.
This model is fine-tuned on 1,400+ poems written by Rabindranath Tagore. This model leverages the [Bangla GPT-2](https://huggingface.co/ghosh-r/bangla-gpt2) pretrained model, trained on mc4-Bengali dataset.
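A schematic sketch of how such a fine-tune is typically set up with the `transformers` Trainer API, assuming a plain-text poem corpus; the file name and every hyperparameter below are placeholders rather than the author's actual training setup:
```
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# start from the pretrained Bangla GPT-2 checkpoint
tokenizer = AutoTokenizer.from_pretrained("ritog/bangla-gpt2")
model = AutoModelForCausalLM.from_pretrained("ritog/bangla-gpt2")
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# hypothetical corpus: one poem (or stanza) per line in a text file
poems = load_dataset("text", data_files={"train": "tagore_poems.txt"})
poems = poems.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"],
)

# causal language modeling (mlm=False) with placeholder hyperparameters
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
args = TrainingArguments(output_dir="robi-kobi", num_train_epochs=3,
                         per_device_train_batch_size=4, learning_rate=5e-5)
Trainer(model=model, args=args, train_dataset=poems["train"],
        data_collator=collator).train()
```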
|
{"language": "bn", "tags": ["text-generation"], "widget": [{"text": "\u09a4\u09cb\u09ae\u09be\u0995\u09c7 \u09a6\u09c7\u0996\u09c7\u099b\u09bf \u0986\u09ae\u09be\u09b0 \u09b9\u09c3\u09a6\u09df \u09ae\u09be\u099d\u09c7"}]}
|
text-generation
|
ritog/robi-kobi
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"bn",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"bn"
] |
TAGS
#transformers #pytorch #gpt2 #text-generation #bn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Robi Kobi
### Created by Ritobrata Ghosh
A model that writes Bengali poems in the style of Nobel Laureate poet Rabindranath Tagore.
This model is fine-tuned on 1,400+ poems written by Rabindranath Tagore. This model leverages the Bangla GPT-2 pretrained model, trained on mc4-Bengali dataset.
|
[
"# Robi Kobi",
"### Created by Ritobrata Ghosh\n\nA model that writes Bengali poems in the style of Nobel Laureate poet Rabindranath Tagore.\n\nThis model is fine-tuned on 1,400+ poems written by Rabindranath Tagore. This model leverages the Bangla GPT-2 pretrained model, trained on mc4-Bengali dataset."
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #bn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Robi Kobi",
"### Created by Ritobrata Ghosh\n\nA model that writes Bengali poems in the style of Nobel Laureate poet Rabindranath Tagore.\n\nThis model is fine-tuned on 1,400+ poems written by Rabindranath Tagore. This model leverages the Bangla GPT-2 pretrained model, trained on mc4-Bengali dataset."
] |
[
49,
5,
81
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #bn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Robi Kobi### Created by Ritobrata Ghosh\n\nA model that writes Bengali poems in the style of Nobel Laureate poet Rabindranath Tagore.\n\nThis model is fine-tuned on 1,400+ poems written by Rabindranath Tagore. This model leverages the Bangla GPT-2 pretrained model, trained on mc4-Bengali dataset."
] |
[
0.02369925193488598,
-0.06570658832788467,
-0.003306929487735033,
0.09003010392189026,
0.058827802538871765,
0.007076971232891083,
0.07811751216650009,
0.07028292864561081,
0.023796692490577698,
-0.030551251024007797,
0.17501036822795868,
0.13769496977329254,
-0.02044961042702198,
0.09588848054409027,
0.05548254773020744,
-0.36768391728401184,
0.023822464048862457,
0.0699542909860611,
0.010038996115326881,
0.16874849796295166,
0.10039923340082169,
-0.0017523629358038306,
0.06594393402338028,
0.0464944951236248,
-0.12191077321767807,
-0.01443539373576641,
-0.036835599690675735,
-0.04779065400362015,
0.05371243134140968,
0.028391381725668907,
0.007682671770453453,
0.08619170635938644,
0.00912315584719181,
-0.05642879009246826,
0.03517568111419678,
-0.08078740537166595,
-0.06689231097698212,
0.0121125103905797,
0.020933926105499268,
-0.13874152302742004,
0.21199363470077515,
0.10109125822782516,
-0.01908092387020588,
-0.003897272050380707,
-0.13247522711753845,
-0.020805293694138527,
0.017475690692663193,
0.13858292996883392,
0.07986688613891602,
0.05464434251189232,
-0.051628850400447845,
-0.00771729601547122,
-0.07492310553789139,
0.05914401262998581,
0.12502971291542053,
-0.2932250499725342,
-0.04923912510275841,
0.1888953149318695,
0.11154109984636307,
0.0748688280582428,
-0.09857987612485886,
0.0500996969640255,
0.01511694397777319,
0.03016633912920952,
0.08341419696807861,
-0.1284913867712021,
0.07073231041431427,
0.045503243803977966,
-0.0982508659362793,
0.0895446240901947,
0.13785934448242188,
-0.08756579458713531,
0.0033274937886744738,
-0.03840385377407074,
-0.08838874846696854,
-0.07189815491437912,
-0.011334880255162716,
-0.04792378842830658,
-0.08305378258228302,
0.0628071203827858,
0.01943519525229931,
-0.044767193496227264,
-0.09752365946769714,
-0.05599888786673546,
-0.04863818362355232,
0.07892381399869919,
-0.00038069108268246055,
-0.005305768921971321,
-0.14444349706172943,
0.11728964000940323,
-0.0937977135181427,
-0.06256388872861862,
-0.0093722864985466,
-0.13445478677749634,
0.05453438311815262,
0.05293305590748787,
-0.04321262985467911,
-0.06906688213348389,
-0.030348362401127815,
0.20290806889533997,
0.07271957397460938,
0.03342888504266739,
0.1298726201057434,
0.022685278207063675,
0.0661066398024559,
0.06203799322247505,
-0.10027264058589935,
-0.05009153485298157,
0.023142967373132706,
-0.02841801941394806,
0.011689482256770134,
-0.0505632720887661,
-0.11037696897983551,
-0.06218618527054787,
0.04218665137887001,
-0.033158499747514725,
-0.12466034293174744,
0.1144319400191307,
-0.009910054504871368,
-0.055577341467142105,
-0.00033037428511306643,
0.004436551593244076,
-0.032081156969070435,
-0.024186182767152786,
0.013171170838177204,
-0.04544539377093315,
0.062022458761930466,
0.017776107415556908,
-0.03918778896331787,
0.025977768003940582,
-0.1103760302066803,
-0.05193771421909332,
-0.022588558495044708,
-0.027998125180602074,
-0.03588675335049629,
-0.12901954352855682,
0.05932636559009552,
-0.1338852494955063,
-0.1925283521413803,
-0.0017954404465854168,
0.058610618114471436,
-0.12871643900871277,
-0.046097155660390854,
-0.0659923255443573,
0.021530896425247192,
0.05857614055275917,
-0.05911537632346153,
0.02468482218682766,
-0.0402420274913311,
0.0709245428442955,
-0.037483613938093185,
0.11920227855443954,
-0.06479965150356293,
0.06686673313379288,
-0.07371820509433746,
0.02870023064315319,
-0.052414774894714355,
0.04482683539390564,
0.028038114309310913,
-0.014429713599383831,
-0.04015015438199043,
-0.024782242253422737,
-0.03193582221865654,
0.02836553193628788,
0.01379292644560337,
0.16407351195812225,
-0.002017347374930978,
-0.056131672114133835,
0.23033377528190613,
-0.03926756978034973,
-0.10612323880195618,
0.06340000778436661,
-0.03850515931844711,
0.2653556764125824,
0.06465502083301544,
0.19937337934970856,
0.01846606656908989,
0.1298602968454361,
0.09421546012163162,
0.023261889815330505,
0.046747926622629166,
0.01250249519944191,
0.08553776890039444,
0.06694130599498749,
-0.0658593401312828,
0.04561338201165199,
0.06686591356992722,
0.0009377573733218014,
-0.0532572977244854,
-0.019826460629701614,
0.01000805664807558,
-0.006622300948947668,
0.11609815806150436,
-0.0668438971042633,
0.09112391620874405,
-0.055056240409612656,
-0.0894661396741867,
-0.03997514396905899,
0.05608354136347771,
-0.04714300483465195,
0.04657982289791107,
-0.1002054363489151,
0.07217767834663391,
-0.0314045213162899,
0.05327295884490013,
-0.0903707817196846,
0.1423802524805069,
-0.030904075130820274,
0.07321151345968246,
0.025889523327350616,
0.06641344726085663,
0.05936442315578461,
-0.0097156697884202,
-0.035504549741744995,
0.09102229028940201,
0.10212012380361557,
-0.007822382263839245,
-0.06617964804172516,
-0.07185940444469452,
0.04030127078294754,
-0.04204598441720009,
0.1335306018590927,
-0.13897040486335754,
0.023226389661431313,
0.044440481811761856,
0.02845456637442112,
-0.02065262943506241,
0.016905535012483597,
0.08094563335180283,
-0.025898845866322517,
-0.06132825091481209,
0.04113351181149483,
0.0748697891831398,
0.008687957189977169,
-0.13618572056293488,
0.22702746093273163,
-0.12066327035427094,
0.06824206560850143,
0.07253360748291016,
-0.1424819827079773,
0.0012055617989972234,
0.002140467520803213,
-0.02969975769519806,
-0.031179213896393776,
0.06950078904628754,
0.04190518707036972,
0.094529889523983,
-0.020720452070236206,
0.1489764153957367,
-0.07480726391077042,
-0.011438162066042423,
0.028877275064587593,
-0.09517200291156769,
0.06964481621980667,
0.1469455510377884,
0.047863107174634933,
-0.10182373970746994,
0.1363261640071869,
0.10775411128997803,
0.039474017918109894,
0.18629150092601776,
0.08964220434427261,
0.006666847039014101,
-0.05099262669682503,
-0.06849931925535202,
-0.06732065975666046,
-0.019373398274183273,
-0.25548338890075684,
-0.0475127175450325,
0.05968857556581497,
-0.014461282640695572,
0.05044833570718765,
-0.08011370152235031,
-0.09436384588479996,
-0.0031364234164357185,
0.02839691750705242,
-0.016180066391825676,
0.17748777568340302,
-0.0900304988026619,
0.061487771570682526,
-0.004098031669855118,
-0.014698273502290249,
0.0492401160299778,
0.053357478231191635,
-0.09066332876682281,
0.20165753364562988,
-0.0396554134786129,
-0.2403952032327652,
-0.08141570538282394,
-0.19461160898208618,
-0.0203968845307827,
0.043497808277606964,
0.090812548995018,
-0.11535420268774033,
-0.025386616587638855,
-0.03776905685663223,
0.12544606626033783,
-0.05553865805268288,
-0.01217226404696703,
-0.11949530988931656,
-0.052112579345703125,
-0.1418878734111786,
-0.07098407298326492,
-0.04788494110107422,
-0.01888289488852024,
-0.06115829199552536,
0.1666172444820404,
-0.18911464512348175,
0.018601737916469574,
0.18225044012069702,
0.03213411197066307,
0.006555231288075447,
0.02364703267812729,
0.20361515879631042,
-0.07959787547588348,
0.08615192025899887,
0.09480681270360947,
-0.013017614372074604,
0.06029171869158745,
0.11674124747514725,
-0.011385517194867134,
-0.06022646650671959,
0.0458345003426075,
-0.019092800095677376,
-0.09459679573774338,
-0.1704247146844864,
-0.03630119562149048,
-0.05869262292981148,
0.1106073409318924,
0.04511983320116997,
0.037038303911685944,
0.08597874641418457,
0.20664727687835693,
0.008200405165553093,
0.046831369400024414,
0.03475821018218994,
0.08673249930143356,
0.10559789836406708,
-0.03213518112897873,
0.14575839042663574,
-0.0007931568543426692,
-0.07897283881902695,
0.05979347229003906,
0.010501863434910774,
0.1318395733833313,
0.052952807396650314,
0.10831649601459503,
0.09562251716852188,
0.09300559014081955,
0.11647510528564453,
0.07432684302330017,
-0.05987068638205528,
-0.02314208447933197,
-0.07003340870141983,
-0.04993884637951851,
-0.05205858126282692,
0.11466103047132492,
-0.15137404203414917,
-0.06521394848823547,
0.022197416052222252,
-0.07233032584190369,
0.011519533582031727,
0.03210275620222092,
0.11122800409793854,
-0.14926433563232422,
-0.04463252052664757,
0.07709159702062607,
-0.007794912438839674,
-0.07745519280433655,
0.10061511397361755,
-0.04177580028772354,
-0.1189361959695816,
0.014210419729351997,
-0.046506259590387344,
0.09571283310651779,
-0.04678618162870407,
0.0554790273308754,
-0.1278582066297531,
-0.17303122580051422,
-0.012100878171622753,
0.10080787539482117,
-0.18334920704364777,
0.17874681949615479,
-0.015530913136899471,
-0.07425423711538315,
-0.050934236496686935,
-0.07416197657585144,
0.0017223755130544305,
0.13966885209083557,
0.0984252318739891,
0.008559400215744972,
0.14310522377490997,
-0.015519091859459877,
-0.04550128057599068,
0.02136101946234703,
0.0657249242067337,
-0.1355801224708557,
0.05809146910905838,
-0.01147623360157013,
0.032641079276800156,
-0.0455651693046093,
0.05110665783286095,
-0.021123630926012993,
-0.1021142527461052,
0.09557200223207474,
-0.021196775138378143,
0.08804240822792053,
0.0019038028549402952,
-0.07967040687799454,
0.11319432407617569,
0.10359612107276917,
-0.006541618146002293,
-0.16068944334983826,
-0.03128726780414581,
-0.028864752501249313,
0.061573851853609085,
-0.05957464501261711,
0.024258382618427277,
-0.07517523318529129,
-0.04018627107143402,
0.003677167696878314,
-0.06407598406076431,
0.07277917861938477,
-0.011151080019772053,
-0.07286762446165085,
-0.03683768957853317,
0.10755298286676407,
0.09265544265508652,
0.002568653551861644,
0.03305699676275253,
0.0093366215005517,
-0.1374790519475937,
-0.10325878858566284,
-0.05118301510810852,
-0.040990713983774185,
-0.05848553031682968,
-0.05095754191279411,
-0.0883740782737732,
0.08009789884090424,
-0.026830347254872322,
-0.11519013345241547,
0.11577405780553818,
0.10184338688850403,
-0.014177853241562843,
0.15298035740852356,
0.21904253959655762,
-0.07544714212417603,
-0.23482997715473175,
-0.1870395839214325,
-0.0954015925526619,
0.031291794031858444,
-0.041702501475811005,
-0.1738215833902359,
0.06763904541730881,
-0.007508223410695791,
0.010987414047122002,
0.042918018996715546,
-0.2589181959629059,
-0.11893653869628906,
0.13095378875732422,
0.025390639901161194,
0.4631965160369873,
-0.14501596987247467,
-0.06272494047880173,
-0.04232387989759445,
-0.2193911224603653,
0.07258371263742447,
-0.10288495570421219,
0.1523628979921341,
-0.09639143943786621,
0.14773370325565338,
-0.013497545383870602,
0.023680297657847404,
0.08120933920145035,
0.06489752978086472,
-0.047274019569158554,
-0.1405453383922577,
-0.09307199716567993,
0.09940381348133087,
0.02612898126244545,
0.02117876149713993,
-0.04918627813458443,
0.004817818757146597,
-0.2155531942844391,
-0.10879849642515182,
-0.07993309199810028,
-0.06454359740018845,
-0.015528388321399689,
-0.12583006918430328,
-0.04690789431333542,
0.10134148597717285,
-0.009853041730821133,
-0.02399759739637375,
0.025860819965600967,
-0.10611481219530106,
0.08496903628110886,
0.034105706959962845,
0.06863082945346832,
-0.11313816905021667,
0.02342129871249199,
-0.045715223997831345,
-0.0556473582983017,
0.04413093253970146,
-0.21147799491882324,
-0.03451485559344292,
0.08103271573781967,
0.013556212186813354,
0.047839175909757614,
0.06557412445545197,
-0.05917961150407791,
0.04612206295132637,
0.1237260028719902,
-0.1786240190267563,
-0.06123547628521919,
-0.035567644983530045,
-0.10320816934108734,
0.09675514698028564,
0.03693534806370735,
0.16878022253513336,
-0.06732252985239029,
-0.073743537068367,
0.025190703570842743,
-0.0044601005502045155,
-0.06983275711536407,
0.05566070228815079,
0.017950480803847313,
0.0068177832290530205,
-0.054298367351293564,
0.0027649118565022945,
0.027923623099923134,
0.015889156609773636,
-0.007920436561107635,
0.1825331598520279,
-0.1673397570848465,
-0.08559238910675049,
-0.14863504469394684,
0.012494731694459915,
-0.20664605498313904,
0.06848911941051483,
0.0047361948527395725,
-0.09220504015684128,
0.057098034769296646,
0.16687782108783722,
0.06987728178501129,
0.044122934341430664,
-0.05358235910534859,
-0.0037843843456357718,
0.009142893366515636,
0.019634846597909927,
0.1294184774160385,
-0.02407083846628666,
-0.003185020061209798,
-0.08800248056650162,
-0.007124795578420162,
0.1401083916425705,
-0.09249330312013626,
-0.07297990471124649,
-0.11790242046117783,
0.039243828505277634,
-0.2727368474006653,
-0.019955575466156006,
-0.11519671231508255,
-0.08096829056739807,
-0.0829610675573349,
-0.027324888855218887,
-0.0728357806801796,
-0.0522814579308033,
-0.06537853926420212,
0.010604176670312881,
-0.07656063884496689,
0.032745469361543655,
-0.013438018038868904,
-0.027575364336371422,
0.07787349820137024,
-0.02305312640964985,
0.08734014630317688,
0.07482731342315674,
-0.07003846019506454,
0.05380392074584961,
-0.05928609520196915,
0.07585294544696808,
-0.05581493675708771,
-0.032779477536678314,
-0.0062063904479146,
-0.052042730152606964,
-0.021267596632242203,
-0.015727728605270386,
0.08340498059988022,
0.07296712696552277,
0.07534188032150269,
-0.07761812955141068,
-0.0069385068491101265,
-0.06069059669971466,
-0.04021801799535751,
-0.0694100633263588,
0.023426001891493797,
-0.018912391737103462,
0.08395811915397644,
0.07744444906711578,
-0.03284776210784912,
0.021310696378350258,
-0.04076134413480759,
0.06237475946545601,
0.0035849434789270163,
-0.1053411141037941,
-0.027756687253713608,
-0.12380275130271912,
-0.062169525772333145,
-0.03518364578485489,
0.20819616317749023,
-0.0303928405046463,
-0.13803358376026154,
0.010291769169270992,
0.020898055285215378,
0.06843316555023193,
-0.03486119210720062,
0.18060263991355896,
0.13519887626171112,
-0.051409490406513214,
-0.09965495020151138,
0.12556104362010956,
0.004404447972774506,
0.15934953093528748,
0.023683106526732445,
-0.023942993953824043,
0.11407521367073059,
0.08364357799291611,
-0.0935535877943039,
0.07357227057218552,
0.001009885803796351,
-0.05737578496336937,
-0.022061586380004883,
0.045271459966897964,
-0.055662598460912704,
0.18300263583660126,
0.24397610127925873,
0.0032604567240923643,
0.03432105854153633,
-0.000054659834859194234,
-0.09516582638025284,
-0.11900267750024796,
-0.21026849746704102,
-0.08442816883325577,
-0.08672550320625305,
0.011752691119909286,
-0.09600972384214401,
-0.020967597141861916,
0.13396571576595306,
0.09969013184309006,
-0.03654376417398453,
0.1130046546459198,
0.08278170228004456,
-0.11320041865110397,
0.11443504691123962,
-0.02320808358490467,
0.0334930419921875,
0.025208257138729095,
0.04061093181371689,
-0.08077435195446014,
0.010265265591442585,
-0.00753646669909358,
0.0533149279654026,
-0.015682125464081764,
-0.05511636659502983,
-0.05853559076786041,
-0.0685126781463623,
-0.10542143881320953,
0.09075946360826492,
0.05274847149848938,
0.12456297129392624,
-0.04483402520418167,
-0.06217559054493904,
0.001016362220980227,
0.1856347620487213,
0.05032983049750328,
-0.008616524748504162,
-0.03803727775812149,
0.1341652125120163,
0.04150846228003502,
0.07513027638196945,
0.011592067778110504,
0.025281578302383423,
-0.06397734582424164,
0.2868947684764862,
0.25234225392341614,
-0.0855894535779953,
0.0040474445559084415,
0.06903700530529022,
0.046185728162527084,
0.06488817185163498,
0.10701403766870499,
0.03621228411793709,
0.1995561122894287,
-0.12015746533870697,
-0.009895842522382736,
-0.07162361592054367,
-0.052173979580402374,
-0.013616049662232399,
0.09749041497707367,
0.11009382456541061,
0.0079879155382514,
-0.08834758400917053,
0.08686134964227676,
-0.24260783195495605,
0.05843370035290718,
-0.10463930666446686,
-0.11398083716630936,
-0.09453266859054565,
-0.02022322453558445,
-0.1422724723815918,
-0.0037621152587234974,
0.12262582033872604,
0.0045987400226294994,
-0.0006054672412574291,
0.026827000081539154,
0.05804683268070221,
-0.14187955856323242,
0.028913704678416252,
0.15658807754516602,
0.08086401224136353,
0.06664227694272995,
-0.04225529730319977,
0.018128223717212677,
0.09307762235403061,
0.006316042505204678,
-0.006163651589304209,
0.07639560103416443,
0.01112733967602253,
0.10946977138519287,
-0.03729129955172539,
0.05216794088482857,
0.02090531215071678,
-0.12265900522470474,
0.0624367892742157,
-0.01483123004436493,
0.05244855210185051,
0.06401512771844864,
-0.004447635728865862,
-0.08746646344661713,
0.055468518286943436,
-0.10035731643438339,
0.1019255593419075,
0.11698572337627411,
-0.04441468045115471,
-0.04755851998925209,
-0.057672057300806046,
0.03133508563041687,
-0.058620743453502655,
-0.035506971180438995,
-0.0068591986782848835,
-0.12217343598604202,
-0.04725135490298271,
0.10641264170408249,
0.04868600144982338,
-0.3727516829967499,
-0.0032617172691971064,
-0.10793273150920868,
0.06640876084566116,
-0.050234463065862656,
0.14754627645015717,
-0.015298902057111263,
0.058646462857723236,
0.010945189744234085,
-0.07661017775535583,
-0.01400529034435749,
0.017592761665582657,
-0.14361825585365295,
-0.16502641141414642
] |
null | null |
transformers
|
You can test this model online with the [**Space for Romanian Speech Recognition**](https://huggingface.co/spaces/gigant/romanian-speech-recognition)
The model ranked **TOP-1** on Romanian Speech Recognition during HuggingFace's Robust Speech Challenge :
* [**The 🤗 Speech Bench**](https://huggingface.co/spaces/huggingface/hf-speech-bench)
* [**Speech Challenge Leaderboard**](https://huggingface.co/spaces/speech-recognition-community-v2/FinalLeaderboard)
# Romanian Wav2Vec2
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the [Common Voice 8.0 - Romanian subset](https://huggingface.co/datasets/mozilla-foundation/common_voice_8_0) dataset, with extra training data from [Romanian Speech Synthesis](https://huggingface.co/datasets/gigant/romanian_speech_synthesis_0_8_1) dataset.
Without the 5-gram Language Model optimization, it achieves the following results on the evaluation set (Common Voice 8.0, Romanian subset, test split):
- Loss: 0.1553
- Wer: 0.1174
- Cer: 0.0294
## Model description
The architecture is based on [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) with a speech recognition CTC head and an added 5-gram language model (using [pyctcdecode](https://github.com/kensho-technologies/pyctcdecode) and [kenlm](https://github.com/kpu/kenlm)) trained on the [Romanian Corpora Parliament](gigant/ro_corpora_parliament_processed) dataset. Those libraries are needed in order for the language model-boosted decoder to work.
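A schematic sketch of how such a language-model-boosted processor is typically assembled with `pyctcdecode` and `kenlm` (the repository already ships the assembled processor; the `.arpa` path below is only a placeholder):
```
from pyctcdecode import build_ctcdecoder
from transformers import AutoProcessor, Wav2Vec2ProcessorWithLM

# tokenizer + feature extractor from the fine-tuned CTC checkpoint
processor = AutoProcessor.from_pretrained("gigant/romanian-wav2vec2")
vocab = processor.tokenizer.get_vocab()
# pyctcdecode expects the labels sorted by their vocabulary index
labels = [token for token, _ in sorted(vocab.items(), key=lambda kv: kv[1])]

# attach a kenlm 5-gram model (placeholder path) to the CTC beam-search decoder
decoder = build_ctcdecoder(labels, kenlm_model_path="ro_parliament_5gram.arpa")
processor_with_lm = Wav2Vec2ProcessorWithLM(
    feature_extractor=processor.feature_extractor,
    tokenizer=processor.tokenizer,
    decoder=decoder,
)
```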
## Intended uses & limitations
The model is made for speech recognition in Romanian from audio clips sampled at **16kHz**. The predicted text is lowercased and does not contain any punctuation.
## How to use
Make sure you have installed the correct dependencies for the language model-boosted version to work. You can just run this command to install the `kenlm` and `pyctcdecode` libraries :
```pip install https://github.com/kpu/kenlm/archive/master.zip pyctcdecode```
With the `transformers` framework, you can load the model with the following code :
```
from transformers import AutoProcessor, AutoModelForCTC
processor = AutoProcessor.from_pretrained("gigant/romanian-wav2vec2")
model = AutoModelForCTC.from_pretrained("gigant/romanian-wav2vec2")
```
Or, if you want to test the model, you can load the automatic speech recognition pipeline from `transformers` with :
```
from transformers import pipeline
asr = pipeline("automatic-speech-recognition", model="gigant/romanian-wav2vec2")
```
## Example use with the `datasets` library
First, you need to load your data.
We will use the [Romanian Speech Synthesis](https://huggingface.co/datasets/gigant/romanian_speech_synthesis_0_8_1) dataset in this example.
```
from datasets import load_dataset
dataset = load_dataset("gigant/romanian_speech_synthesis_0_8_1")
```
You can listen to the samples with the `IPython.display` library :
```
from IPython.display import Audio
i = 0
sample = dataset["train"][i]
Audio(sample["audio"]["array"], rate = sample["audio"]["sampling_rate"])
```
The model is trained to work with audio sampled at 16kHz, so if the sampling rate of the audio in the dataset is different, we will have to resample it.
In the example, the audio is sampled at 48kHz. We can see this by checking `dataset["train"][0]["audio"]["sampling_rate"]`
The following code resamples the audio using the `torchaudio` library :
```
import torchaudio
import torch
i = 0
audio = sample["audio"]["array"]
rate = sample["audio"]["sampling_rate"]
resampler = torchaudio.transforms.Resample(rate, 16_000)
audio_16 = resampler(torch.Tensor(audio)).numpy()
```
To listen to the resampled sample :
```
Audio(audio_16, rate=16000)
```
Now you can get the model prediction by running :
```
predicted_text = asr(audio_16)["text"]  # the pipeline returns a dict with a "text" key
ground_truth = dataset["train"][i]["sentence"]
print(f"Predicted text : {predicted_text}")
print(f"Ground truth : {ground_truth}")
```
## Training and evaluation data
Training data :
- [Common Voice 8.0 - Romanian subset](https://huggingface.co/datasets/mozilla-foundation/common_voice_8_0) : train + validation + other splits
- [Romanian Speech Synthesis](https://huggingface.co/datasets/gigant/romanian_speech_synthesis_0_8_1) : train + test splits
Evaluation data :
- [Common Voice 8.0 - Romanian subset](https://huggingface.co/datasets/mozilla-foundation/common_voice_8_0) : test split
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 3
- total_train_batch_size: 48
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50.0
- mixed_precision_training: Native AMP
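For reference, a schematic `TrainingArguments` reconstruction of the values listed above (the output directory is a placeholder; the Adam betas and epsilon are the library defaults):
```
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="wav2vec2-ro-300m",    # placeholder path
    learning_rate=3e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=3,    # 16 x 3 = 48 effective train batch size
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50.0,
    fp16=True,                        # native AMP mixed precision
)
```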
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 2.9272 | 0.78 | 500 | 0.7603 | 0.7734 | 0.2355 |
| 0.6157 | 1.55 | 1000 | 0.4003 | 0.4866 | 0.1247 |
| 0.4452 | 2.33 | 1500 | 0.2960 | 0.3689 | 0.0910 |
| 0.3631 | 3.11 | 2000 | 0.2580 | 0.3205 | 0.0796 |
| 0.3153 | 3.88 | 2500 | 0.2465 | 0.2977 | 0.0747 |
| 0.2795 | 4.66 | 3000 | 0.2274 | 0.2789 | 0.0694 |
| 0.2615 | 5.43 | 3500 | 0.2277 | 0.2685 | 0.0675 |
| 0.2389 | 6.21 | 4000 | 0.2135 | 0.2518 | 0.0627 |
| 0.2229 | 6.99 | 4500 | 0.2054 | 0.2449 | 0.0614 |
| 0.2067 | 7.76 | 5000 | 0.2096 | 0.2378 | 0.0597 |
| 0.1977 | 8.54 | 5500 | 0.2042 | 0.2387 | 0.0600 |
| 0.1896 | 9.32 | 6000 | 0.2110 | 0.2383 | 0.0595 |
| 0.1801 | 10.09 | 6500 | 0.1909 | 0.2165 | 0.0548 |
| 0.174 | 10.87 | 7000 | 0.1883 | 0.2206 | 0.0559 |
| 0.1685 | 11.65 | 7500 | 0.1848 | 0.2097 | 0.0528 |
| 0.1591 | 12.42 | 8000 | 0.1851 | 0.2039 | 0.0514 |
| 0.1537 | 13.2 | 8500 | 0.1881 | 0.2065 | 0.0518 |
| 0.1504 | 13.97 | 9000 | 0.1840 | 0.1972 | 0.0499 |
| 0.145 | 14.75 | 9500 | 0.1845 | 0.2029 | 0.0517 |
| 0.1417 | 15.53 | 10000 | 0.1884 | 0.2003 | 0.0507 |
| 0.1364 | 16.3 | 10500 | 0.2010 | 0.2037 | 0.0517 |
| 0.1331 | 17.08 | 11000 | 0.1838 | 0.1923 | 0.0483 |
| 0.129 | 17.86 | 11500 | 0.1818 | 0.1922 | 0.0489 |
| 0.1198 | 18.63 | 12000 | 0.1760 | 0.1861 | 0.0465 |
| 0.1203 | 19.41 | 12500 | 0.1686 | 0.1839 | 0.0465 |
| 0.1225 | 20.19 | 13000 | 0.1828 | 0.1920 | 0.0479 |
| 0.1145 | 20.96 | 13500 | 0.1673 | 0.1784 | 0.0446 |
| 0.1053 | 21.74 | 14000 | 0.1802 | 0.1810 | 0.0456 |
| 0.1071 | 22.51 | 14500 | 0.1769 | 0.1775 | 0.0444 |
| 0.1053 | 23.29 | 15000 | 0.1920 | 0.1783 | 0.0457 |
| 0.1024 | 24.07 | 15500 | 0.1904 | 0.1775 | 0.0446 |
| 0.0987 | 24.84 | 16000 | 0.1793 | 0.1762 | 0.0446 |
| 0.0949 | 25.62 | 16500 | 0.1801 | 0.1766 | 0.0443 |
| 0.0942 | 26.4 | 17000 | 0.1731 | 0.1659 | 0.0423 |
| 0.0906 | 27.17 | 17500 | 0.1776 | 0.1698 | 0.0424 |
| 0.0861 | 27.95 | 18000 | 0.1716 | 0.1600 | 0.0406 |
| 0.0851 | 28.73 | 18500 | 0.1662 | 0.1630 | 0.0410 |
| 0.0844 | 29.5 | 19000 | 0.1671 | 0.1572 | 0.0393 |
| 0.0792 | 30.28 | 19500 | 0.1768 | 0.1599 | 0.0407 |
| 0.0798 | 31.06 | 20000 | 0.1732 | 0.1558 | 0.0394 |
| 0.0779 | 31.83 | 20500 | 0.1694 | 0.1544 | 0.0388 |
| 0.0718 | 32.61 | 21000 | 0.1709 | 0.1578 | 0.0399 |
| 0.0732 | 33.38 | 21500 | 0.1697 | 0.1523 | 0.0391 |
| 0.0708 | 34.16 | 22000 | 0.1616 | 0.1474 | 0.0375 |
| 0.0678 | 34.94 | 22500 | 0.1698 | 0.1474 | 0.0375 |
| 0.0642 | 35.71 | 23000 | 0.1681 | 0.1459 | 0.0369 |
| 0.0661 | 36.49 | 23500 | 0.1612 | 0.1411 | 0.0357 |
| 0.0629 | 37.27 | 24000 | 0.1662 | 0.1414 | 0.0355 |
| 0.0587 | 38.04 | 24500 | 0.1659 | 0.1408 | 0.0351 |
| 0.0581 | 38.82 | 25000 | 0.1612 | 0.1382 | 0.0352 |
| 0.0556 | 39.6 | 25500 | 0.1647 | 0.1376 | 0.0345 |
| 0.0543 | 40.37 | 26000 | 0.1658 | 0.1335 | 0.0337 |
| 0.052 | 41.15 | 26500 | 0.1716 | 0.1369 | 0.0343 |
| 0.0513 | 41.92 | 27000 | 0.1600 | 0.1317 | 0.0330 |
| 0.0491 | 42.7 | 27500 | 0.1671 | 0.1311 | 0.0328 |
| 0.0463 | 43.48 | 28000 | 0.1613 | 0.1289 | 0.0324 |
| 0.0468 | 44.25 | 28500 | 0.1599 | 0.1260 | 0.0315 |
| 0.0435 | 45.03 | 29000 | 0.1556 | 0.1232 | 0.0308 |
| 0.043 | 45.81 | 29500 | 0.1588 | 0.1240 | 0.0309 |
| 0.0421 | 46.58 | 30000 | 0.1567 | 0.1217 | 0.0308 |
| 0.04 | 47.36 | 30500 | 0.1533 | 0.1198 | 0.0302 |
| 0.0389 | 48.14 | 31000 | 0.1582 | 0.1185 | 0.0297 |
| 0.0387 | 48.91 | 31500 | 0.1576 | 0.1187 | 0.0297 |
| 0.0376 | 49.69 | 32000 | 0.1560 | 0.1182 | 0.0295 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Tokenizers 0.11.0
- pyctcdecode 0.3.0
- kenlm
|
{"language": ["ro"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "hf-asr-leaderboard", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0", "gigant/romanian_speech_synthesis_0_8_1"], "base_model": "facebook/wav2vec2-xls-r-300m", "model-index": [{"name": "wav2vec2-ro-300m_01", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event", "type": "speech-recognition-community-v2/dev_data", "args": "ro"}, "metrics": [{"type": "wer", "value": 46.99, "name": "Dev WER (without LM)"}, {"type": "cer", "value": 16.04, "name": "Dev CER (without LM)"}, {"type": "wer", "value": 38.63, "name": "Dev WER (with LM)"}, {"type": "cer", "value": 14.52, "name": "Dev CER (with LM)"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice", "type": "mozilla-foundation/common_voice_8_0", "args": "ro"}, "metrics": [{"type": "wer", "value": 11.73, "name": "Test WER (without LM)"}, {"type": "cer", "value": 2.93, "name": "Test CER (without LM)"}, {"type": "wer", "value": 7.31, "name": "Test WER (with LM)"}, {"type": "cer", "value": 2.17, "name": "Test CER (with LM)"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "ro"}, "metrics": [{"type": "wer", "value": 43.23, "name": "Test WER"}]}]}]}
|
automatic-speech-recognition
|
gigant/romanian-wav2vec2
|
[
"transformers",
"pytorch",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"hf-asr-leaderboard",
"robust-speech-event",
"ro",
"dataset:mozilla-foundation/common_voice_8_0",
"dataset:gigant/romanian_speech_synthesis_0_8_1",
"base_model:facebook/wav2vec2-xls-r-300m",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ro"
] |
TAGS
#transformers #pytorch #safetensors #wav2vec2 #automatic-speech-recognition #hf-asr-leaderboard #robust-speech-event #ro #dataset-mozilla-foundation/common_voice_8_0 #dataset-gigant/romanian_speech_synthesis_0_8_1 #base_model-facebook/wav2vec2-xls-r-300m #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us
|
You can test this model online with the Space for Romanian Speech Recognition
The model ranked TOP-1 on Romanian Speech Recognition during HuggingFace's Robust Speech Challenge :
* The Speech Bench
* Speech Challenge Leaderboard
Romanian Wav2Vec2
=================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the Common Voice 8.0 - Romanian subset dataset, with extra training data from Romanian Speech Synthesis dataset.
Without the 5-gram Language Model optimization, it achieves the following results on the evaluation set (Common Voice 8.0, Romanian subset, test split):
* Loss: 0.1553
* Wer: 0.1174
* Cer: 0.0294
Model description
-----------------
The architecture is based on facebook/wav2vec2-xls-r-300m with a speech recognition CTC head and an added 5-gram language model (using pyctcdecode and kenlm) trained on the Romanian Corpora Parliament dataset. Those libraries are needed in order for the language model-boosted decoder to work.
Intended uses & limitations
---------------------------
The model is made for speech recognition in Romanian from audio clips sampled at 16kHz. The predicted text is lowercased and does not contain any punctuation.
How to use
----------
Make sure you have installed the correct dependencies for the language model-boosted version to work. You can just run this command to install the 'kenlm' and 'pyctcdecode' libraries :
With the 'transformers' framework, you can load the model with the following code :
Or, if you want to test the model, you can load the automatic speech recognition pipeline from 'transformers' with :
Example use with the 'datasets' library
---------------------------------------
First, you need to load your data.
We will use the Romanian Speech Synthesis dataset in this example.
You can listen to the samples with the 'IPython.display' library :
The model is trained to work with audio sampled at 16kHz, so if the sampling rate of the audio in the dataset is different, we will have to resample it.
In the example, the audio is sampled at 48kHz. We can see this by checking 'dataset["train"][0]["audio"]["sampling\_rate"]'
The following code resamples the audio using the 'torchaudio' library :
To listen to the resampled sample :
Now you can get the model prediction by running
Training and evaluation data
----------------------------
Training data :
* Common Voice 8.0 - Romanian subset : train + validation + other splits
* Romanian Speech Synthesis : train + test splits
Evaluation data :
* Common Voice 8.0 - Romanian subset : test split
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.003
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 3
* total\_train\_batch\_size: 48
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 50.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.2
* Pytorch 1.10.0+cu111
* Tokenizers 0.11.0
* pyctcdecode 0.3.0
* kenlm
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 3\n* total\\_train\\_batch\\_size: 48\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Tokenizers 0.11.0\n* pyctcdecode 0.3.0\n* kenlm"
] |
[
"TAGS\n#transformers #pytorch #safetensors #wav2vec2 #automatic-speech-recognition #hf-asr-leaderboard #robust-speech-event #ro #dataset-mozilla-foundation/common_voice_8_0 #dataset-gigant/romanian_speech_synthesis_0_8_1 #base_model-facebook/wav2vec2-xls-r-300m #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 3\n* total\\_train\\_batch\\_size: 48\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Tokenizers 0.11.0\n* pyctcdecode 0.3.0\n* kenlm"
] |
[
136,
159,
4,
39
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #wav2vec2 #automatic-speech-recognition #hf-asr-leaderboard #robust-speech-event #ro #dataset-mozilla-foundation/common_voice_8_0 #dataset-gigant/romanian_speech_synthesis_0_8_1 #base_model-facebook/wav2vec2-xls-r-300m #license-apache-2.0 #model-index #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 3\n* total\\_train\\_batch\\_size: 48\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 50.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.0+cu111\n* Tokenizers 0.11.0\n* pyctcdecode 0.3.0\n* kenlm"
] |
[
-0.08998565375804901,
0.10385368764400482,
-0.0038046150002628565,
0.0593067891895771,
0.1219780445098877,
0.02218812145292759,
0.1303509920835495,
0.11532796919345856,
-0.07923534512519836,
0.09462378174066544,
0.0891098901629448,
0.07306123524904251,
0.06547548621892929,
0.08896151930093765,
-0.011814907193183899,
-0.26663511991500854,
0.0320824570953846,
-0.026463357731699944,
-0.11757393181324005,
0.10233865678310394,
0.06958502531051636,
-0.1210399866104126,
0.024873556569218636,
0.011726442724466324,
-0.09611770510673523,
0.017110159620642662,
-0.043672703206539154,
-0.027463359758257866,
0.11448343098163605,
0.03650720790028572,
0.0625312551856041,
0.03910793364048004,
0.10793018341064453,
-0.29040274024009705,
0.007417554501444101,
0.06046077609062195,
0.022083157673478127,
0.04058550298213959,
0.09793782234191895,
0.01134246401488781,
0.14354164898395538,
-0.08949442952871323,
0.07417719066143036,
0.019259728491306305,
-0.10897299647331238,
-0.24391582608222961,
-0.06526295095682144,
0.03483022749423981,
0.12007158994674683,
0.12394007295370102,
-0.04973869398236275,
0.11717663705348969,
-0.10853450745344162,
0.08122765272855759,
0.1690405160188675,
-0.23865091800689697,
-0.07537688314914703,
-0.025538034737110138,
0.031034240499138832,
0.0642528384923935,
-0.1061546579003334,
-0.016816051676869392,
0.0028393594548106194,
0.021434849128127098,
0.0972588062286377,
0.01800922118127346,
0.03926282376050949,
-0.01586730033159256,
-0.14149752259254456,
-0.062121570110321045,
0.08908580988645554,
0.10950308293104172,
0.0021752133034169674,
-0.1016853079199791,
-0.020748406648635864,
-0.146214097738266,
-0.04944228008389473,
-0.00998188741505146,
0.02406565472483635,
-0.040721595287323,
-0.06565867364406586,
0.046777915209531784,
-0.05890049785375595,
-0.044142916798591614,
0.06530870497226715,
0.15347683429718018,
0.04974592849612236,
-0.04802119359374046,
0.013127725571393967,
0.07956327497959137,
0.06845425069332123,
-0.16924743354320526,
0.0195707269012928,
0.05301915481686592,
-0.09282730519771576,
-0.04507513344287872,
-0.012804944068193436,
-0.01792803592979908,
0.03729667142033577,
0.12846910953521729,
-0.021922755986452103,
0.0893385261297226,
0.03054763935506344,
-0.0006192059954628348,
-0.08727297186851501,
0.15863728523254395,
-0.07012961059808731,
-0.09257780760526657,
-0.05287926644086838,
0.15191172063350677,
-0.0012078714789822698,
-0.005022367928177118,
-0.06440354883670807,
0.039987485855817795,
0.09513325244188309,
0.041439685970544815,
-0.01910584419965744,
0.03679965063929558,
-0.05105310305953026,
-0.01863156072795391,
0.019736917689442635,
-0.1010298877954483,
0.04680461436510086,
0.04226408153772354,
-0.053917232900857925,
-0.0349612720310688,
-0.02955976128578186,
0.037022609263658524,
-0.028854336589574814,
0.13716168701648712,
-0.05756070837378502,
0.0003679863584693521,
-0.06891972571611404,
-0.11446770280599594,
0.01739196851849556,
-0.0029815800953656435,
0.014221373945474625,
-0.0637257993221283,
-0.05080077797174454,
-0.05781048536300659,
0.06501440703868866,
-0.04087439179420471,
-0.05960618332028389,
-0.08594976365566254,
-0.06453055143356323,
0.03852389380335808,
-0.0061539844609797,
0.13463231921195984,
-0.07132767140865326,
0.07839018106460571,
0.012560226954519749,
0.031218428164720535,
0.06961686909198761,
0.05385182425379753,
-0.06414677947759628,
0.040784917771816254,
-0.11835358291864395,
0.07245181500911713,
-0.07947748154401779,
0.06522469222545624,
-0.1224147379398346,
-0.11132806539535522,
-0.04112378880381584,
-0.004768698010593653,
0.09503315389156342,
0.11503969877958298,
-0.16374002397060394,
-0.10259794443845749,
0.18328115344047546,
-0.06086117774248123,
-0.10422952473163605,
0.15605683624744415,
-0.023043248802423477,
-0.041751161217689514,
0.04388764128088951,
0.17030547559261322,
0.10824023932218552,
-0.09469575434923172,
-0.021197807043790817,
-0.09450282156467438,
0.11681942641735077,
0.06936853379011154,
0.08249985426664352,
-0.036843523383140564,
0.06801248341798782,
-0.010262605734169483,
-0.0451086089015007,
0.0882788747549057,
-0.0696190670132637,
-0.09260791540145874,
-0.012991057708859444,
-0.08224207162857056,
0.008064961060881615,
0.07514509558677673,
0.017866067588329315,
-0.09181267023086548,
-0.13924595713615417,
0.02655158005654812,
0.08842117339372635,
-0.11146849393844604,
0.00867529958486557,
-0.08021042495965958,
0.04692817106842995,
-0.008899467997252941,
-0.0011826659319922328,
-0.14140290021896362,
-0.031764425337314606,
0.028709793463349342,
-0.056398116052150726,
0.02463362365961075,
-0.02491382136940956,
0.08600159734487534,
0.07557127624750137,
-0.0609506219625473,
-0.06671341508626938,
-0.06906096637248993,
0.028268281370401382,
-0.06429698318243027,
-0.2329917699098587,
-0.04798451438546181,
-0.01396613847464323,
0.22261522710323334,
-0.23846280574798584,
0.0029114673379808664,
0.008191116154193878,
0.12807317078113556,
0.007389704696834087,
-0.04835199937224388,
-0.03334198147058487,
0.07013387978076935,
-0.012287814170122147,
-0.07718441635370255,
0.028511645272374153,
-0.013657689094543457,
-0.09985668957233429,
-0.04500385373830795,
-0.16512364149093628,
0.10811875015497208,
0.10757207125425339,
-0.005192826967686415,
-0.08313555270433426,
-0.0017551153432577848,
-0.05459590256214142,
-0.051813334226608276,
-0.0264324638992548,
0.022932125255465508,
0.19684834778308868,
0.013085572980344296,
0.10235094279050827,
-0.04993155971169472,
-0.028438854962587357,
0.034956157207489014,
0.02514546923339367,
-0.02260035090148449,
0.1598108857870102,
0.09198591858148575,
-0.13704460859298706,
0.098692387342453,
0.08829004317522049,
-0.062297895550727844,
0.14975382387638092,
-0.050734687596559525,
-0.08617847412824631,
-0.0346437506377697,
0.019812017679214478,
0.029818566516041756,
0.11451779305934906,
-0.15124140679836273,
0.007445861119776964,
-0.0021292532328516245,
0.008369619958102703,
0.011389690451323986,
-0.17720633745193481,
-0.022977164015173912,
0.05483856052160263,
-0.062692791223526,
-0.026590658351778984,
-0.03517220541834831,
-0.008914800360798836,
0.08860106021165848,
0.014057700522243977,
-0.08311059325933456,
-0.03524718061089516,
-0.024411551654338837,
-0.09034305065870285,
0.16789743304252625,
-0.07467112690210342,
-0.1361931562423706,
-0.07411490380764008,
0.0009429440833628178,
-0.023701732978224754,
-0.02183908224105835,
0.03761522099375725,
-0.1280239373445511,
-0.05125315114855766,
-0.09009300172328949,
-0.00583515502512455,
-0.0342179536819458,
0.011875052005052567,
0.028115639463067055,
0.005230388138443232,
0.06563322246074677,
-0.07364746183156967,
0.00459224684163928,
-0.019595423713326454,
-0.008072848431766033,
0.012873639352619648,
0.02188056707382202,
0.10637159645557404,
0.1528702676296234,
0.028552304953336716,
0.033747509121894836,
-0.008704678155481815,
0.22699493169784546,
-0.12748484313488007,
-0.01608092151582241,
0.08897513896226883,
-0.01626051776111126,
0.031156757846474648,
0.1787858009338379,
0.05979360640048981,
-0.0763121098279953,
0.0202801413834095,
0.046839334070682526,
-0.00894066970795393,
-0.17562992870807648,
-0.04779199883341789,
-0.06475462019443512,
-0.07908516377210617,
0.12545564770698547,
0.028884628787636757,
-0.02069309540092945,
0.013117484748363495,
-0.025737397372722626,
-0.031020473688840866,
0.04776895046234131,
0.05273307487368584,
0.08140334486961365,
0.06730563193559647,
0.10607311129570007,
-0.009743810631334782,
-0.07410869747400284,
0.012633917853236198,
-0.034850113093853,
0.20800679922103882,
0.018846232444047928,
0.13231374323368073,
0.03724969923496246,
0.16200734674930573,
-0.008221282623708248,
0.030448470264673233,
0.029170019552111626,
-0.042527783662080765,
0.027352437376976013,
-0.05768744647502899,
0.00282535282894969,
0.02949584275484085,
0.1129378229379654,
0.01319499034434557,
-0.08734774589538574,
-0.001788396737538278,
0.041899923235177994,
0.29531022906303406,
0.08926807343959808,
-0.3080447316169739,
-0.0748719573020935,
-0.0037140154745429754,
-0.05283980071544647,
-0.015085372142493725,
-0.0030940985307097435,
0.09849780797958374,
-0.07842657715082169,
0.08357367664575577,
-0.0583660788834095,
0.07241899520158768,
-0.0506431870162487,
0.028925612568855286,
0.09183903783559799,
0.07472044229507446,
-0.005729102995246649,
0.05746391415596008,
-0.23471662402153015,
0.280979186296463,
-0.012483252212405205,
0.09136659651994705,
-0.050038158893585205,
0.02052290365099907,
0.03557609021663666,
-0.013833901844918728,
0.06280559301376343,
-0.0041779931634664536,
-0.07945995032787323,
-0.2010873407125473,
-0.08406894654035568,
0.037961699068546295,
0.11304159462451935,
-0.05017559975385666,
0.13082356750965118,
-0.03422039747238159,
-0.03568049520254135,
0.05777061730623245,
-0.022822396829724312,
-0.1377459317445755,
-0.1307242065668106,
0.03237919136881828,
0.01600290834903717,
0.038886766880750656,
-0.07862193137407303,
-0.11594530940055847,
-0.09961680322885513,
0.1878078430891037,
-0.019412118941545486,
0.0008228867081925273,
-0.12310824543237686,
0.08134224265813828,
0.14306017756462097,
-0.07082436233758926,
0.034935787320137024,
0.028301388025283813,
0.10057134926319122,
0.014870790764689445,
-0.011930529959499836,
0.10351241379976273,
-0.07711236923933029,
-0.17736110091209412,
-0.042192403227090836,
0.15664628148078918,
0.05823548138141632,
0.0718681737780571,
0.004110563546419144,
0.05226244032382965,
-0.0012296310160309076,
-0.07587487995624542,
0.06081019714474678,
-0.012222531251609325,
0.01652596890926361,
0.02841545082628727,
-0.013607071712613106,
-0.03443969041109085,
-0.10034911334514618,
-0.05321841686964035,
0.12652726471424103,
0.3273453414440155,
-0.08063904196023941,
0.010271002538502216,
0.02172224596142769,
-0.047899287194013596,
-0.1239984855055809,
0.005175544880330563,
0.11603955924510956,
0.029522303491830826,
0.05982240289449692,
-0.1702813059091568,
0.03895995393395424,
0.06830407679080963,
-0.012765721417963505,
0.04576215147972107,
-0.3227475881576538,
-0.13851530849933624,
0.10687080770730972,
0.11529011279344559,
-0.04593481868505478,
-0.13408581912517548,
-0.030902262777090073,
-0.03562845289707184,
-0.09232612699270248,
0.07559961825609207,
-0.0075213233940303326,
0.10742177069187164,
0.017094671726226807,
0.0067451512441039085,
0.03127693012356758,
-0.05779515951871872,
0.15352723002433777,
-0.027796510607004166,
0.041362129151821136,
-0.020516768097877502,
0.020776525139808655,
-0.0054862345568835735,
-0.055104974657297134,
-0.0074340784922242165,
-0.09545798599720001,
0.004848139360547066,
-0.08554357290267944,
-0.04178799316287041,
-0.09077107161283493,
-0.00043093098793178797,
-0.05031060054898262,
-0.027855761349201202,
-0.014424138702452183,
0.057300254702568054,
0.1082964763045311,
0.0136245246976614,
0.08362530171871185,
-0.047059401869773865,
0.13487155735492706,
0.06529877334833145,
0.11368751525878906,
0.0064156390726566315,
-0.06481079012155533,
-0.003560116747394204,
0.005180191248655319,
0.020136484876275063,
-0.04825420305132866,
0.04511412978172302,
0.15648077428340912,
0.027622703462839127,
0.1583653837442398,
0.047098059207201004,
-0.07583330571651459,
-0.008591130375862122,
0.06662110984325409,
-0.09754269570112228,
-0.16828520596027374,
-0.011564548127353191,
-0.03244587406516075,
-0.13127343356609344,
-0.04791455715894699,
0.10398875921964645,
-0.07174180448055267,
0.001677963649854064,
0.01120974961668253,
0.032117810100317,
-0.04321761056780815,
0.2156023532152176,
0.006915046833455563,
0.07606387883424759,
-0.07459787279367447,
0.05256170406937599,
0.06312327831983566,
-0.13653329014778137,
0.033959124237298965,
0.07578912377357483,
-0.06000247970223427,
-0.02775842696428299,
0.04521401971578598,
0.0868271067738533,
0.04086954891681671,
-0.039064209908246994,
-0.14015614986419678,
-0.14751967787742615,
0.06907166540622711,
0.04837360233068466,
0.025385262444615364,
-0.0030458492692559958,
-0.03648443892598152,
0.03853502869606018,
-0.11672686040401459,
0.10764281451702118,
0.08325950056314468,
0.04933503270149231,
-0.11471088975667953,
0.12029818445444107,
0.020125525072216988,
-0.03599041327834129,
0.007146481890231371,
0.010423075407743454,
-0.113575778901577,
0.026386577636003494,
-0.09842471033334732,
-0.045849986374378204,
-0.04667579382658005,
0.0025930784177035093,
-0.00877288542687893,
-0.059742894023656845,
-0.061935581266880035,
0.003648187732324004,
-0.10712778568267822,
-0.059475358575582504,
0.0007452919380739331,
0.05853595212101936,
-0.11140439659357071,
-0.01293029822409153,
0.042173199355602264,
-0.12299437820911407,
0.08023323863744736,
0.02931324951350689,
0.020619504153728485,
0.037283483892679214,
-0.12854807078838348,
-0.014397814869880676,
0.03121829591691494,
0.008538506925106049,
0.039445359259843826,
-0.17023757100105286,
-0.008763874880969524,
-0.010933995246887207,
0.019949916750192642,
0.013873199932277203,
0.029935311526060104,
-0.09128488600254059,
-0.003978223539888859,
-0.04397144913673401,
-0.08689984679222107,
-0.033459920436143875,
0.05454032123088837,
0.04936918988823891,
7.891041491348005e-7,
0.17982180416584015,
-0.0920754075050354,
0.05868195369839668,
-0.2432553917169571,
-0.02081788331270218,
-0.013157747685909271,
-0.07052358239889145,
-0.05751451477408409,
-0.028509318828582764,
0.09976436197757721,
-0.05935871973633766,
0.07706762105226517,
-0.022277045994997025,
0.06656515598297119,
0.043522704392671585,
-0.08781535178422928,
0.019993534311652184,
0.051752720028162,
0.18464817106723785,
0.05841849744319916,
-0.03100300393998623,
0.06668954342603683,
-0.015876848250627518,
0.04933285340666771,
0.13901984691619873,
0.16548216342926025,
0.1757226139307022,
0.04732748493552208,
0.055157873779535294,
0.06558272242546082,
-0.13111503422260284,
-0.12692385911941528,
0.1141713559627533,
-0.0701935738325119,
0.15710517764091492,
-0.01745179481804371,
0.21388261020183563,
0.09826744347810745,
-0.1954547017812729,
0.04386206716299057,
-0.0757773220539093,
-0.09448899328708649,
-0.08498457819223404,
-0.0725867971777916,
-0.08420863747596741,
-0.15591713786125183,
0.025685787200927734,
-0.09593617916107178,
0.04535418003797531,
0.07040737569332123,
0.04595965892076492,
0.016565779224038124,
0.13215185701847076,
0.06999985128641129,
0.010571605525910854,
0.09215576201677322,
0.005539516918361187,
-0.025923719629645348,
-0.04671670123934746,
-0.07304687798023224,
0.031158607453107834,
-0.031084686517715454,
0.060487378388643265,
-0.0528721809387207,
-0.12474548071622849,
0.0634632334113121,
0.017683781683444977,
-0.10519225150346756,
0.03765286132693291,
-0.024328188970685005,
0.07100342959165573,
0.06843820214271545,
0.030553678050637245,
0.000009600993507774547,
-0.02554664947092533,
0.20735709369182587,
-0.05707995966076851,
-0.07703935354948044,
-0.10387041419744492,
0.1949327141046524,
0.01047338917851448,
0.0041468278504908085,
0.02739594876766205,
-0.06943564862012863,
0.015994898974895477,
0.14624148607254028,
0.15970250964164734,
-0.05457377806305885,
-0.009131032973527908,
0.003496985649690032,
-0.013114995323121548,
-0.024315590038895607,
0.0750979334115982,
0.11333594471216202,
0.027510885149240494,
-0.06460072845220566,
-0.006239502690732479,
-0.057472195476293564,
-0.05755609646439552,
-0.020290562883019447,
0.04823661223053932,
0.06390851736068726,
0.012195387855172157,
-0.03694244846701622,
0.1212608814239502,
-0.031868986785411835,
-0.12570007145404816,
0.032481323927640915,
-0.20921877026557922,
-0.20173247158527374,
-0.0280933640897274,
0.07030922919511795,
0.018991898745298386,
0.05127968639135361,
-0.012373463250696659,
-0.03713444247841835,
0.0614030547440052,
0.0036495118401944637,
-0.013291014358401299,
-0.11687397956848145,
0.07357286661863327,
-0.09740246087312698,
0.20231840014457703,
-0.04167217016220093,
0.03771810233592987,
0.12451831251382828,
0.05860866233706474,
-0.07268931716680527,
0.055953167378902435,
0.0813177078962326,
-0.12468872219324112,
0.053800806403160095,
0.15266656875610352,
-0.050235141068696976,
0.13189636170864105,
0.06538160890340805,
-0.1075911596417427,
0.04865960404276848,
-0.11329113692045212,
-0.046349555253982544,
-0.031901318579912186,
0.011711143888533115,
-0.03770000860095024,
0.14746958017349243,
0.22929736971855164,
-0.05840502306818962,
0.004742311779409647,
-0.05369993671774864,
0.01398913748562336,
0.03935907408595085,
0.16680951416492462,
-0.06283482164144516,
-0.2837347686290741,
0.02921247109770775,
-0.01645324006676674,
0.008478682488203049,
-0.21621300280094147,
-0.11070211976766586,
0.024705560877919197,
-0.05948380380868912,
-0.06086082383990288,
0.12109380960464478,
0.08807199448347092,
0.04943902790546417,
-0.07378589361906052,
-0.13025450706481934,
-0.03721686080098152,
0.17735238373279572,
-0.1563905030488968,
-0.05740373209118843
] |
null | null |
transformers
|
# StackOBERTflow-comments-small
StackOBERTflow is a RoBERTa model trained on StackOverflow comments.
A Byte-level BPE tokenizer with dropout was used (using the `tokenizers` package).
The model is *small*, i.e. it has only 6 layers, and the maximum sequence length was restricted to 256 tokens.
The model was trained for 6 epochs on several GBs of comments from the StackOverflow corpus.
## Quick start: masked language modeling prediction
```python
from transformers import pipeline
from pprint import pprint
COMMENT = "You really should not do it this way, I would use <mask> instead."
fill_mask = pipeline(
"fill-mask",
model="giganticode/StackOBERTflow-comments-small-v1",
tokenizer="giganticode/StackOBERTflow-comments-small-v1"
)
pprint(fill_mask(COMMENT))
# [{'score': 0.019997311756014824,
# 'sequence': '<s> You really should not do it this way, I would use jQuery instead.</s>',
# 'token': 1738},
# {'score': 0.01693696901202202,
# 'sequence': '<s> You really should not do it this way, I would use arrays instead.</s>',
# 'token': 2844},
# {'score': 0.013411642983555794,
# 'sequence': '<s> You really should not do it this way, I would use CSS instead.</s>',
# 'token': 2254},
# {'score': 0.013224546797573566,
# 'sequence': '<s> You really should not do it this way, I would use it instead.</s>',
# 'token': 300},
# {'score': 0.011984303593635559,
# 'sequence': '<s> You really should not do it this way, I would use classes instead.</s>',
# 'token': 1779}]
```
|
{}
|
fill-mask
|
giganticode/StackOBERTflow-comments-small-v1
|
[
"transformers",
"pytorch",
"jax",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us
|
# StackOBERTflow-comments-small
StackOBERTflow is a RoBERTa model trained on StackOverflow comments.
A Byte-level BPE tokenizer with dropout was used (using the 'tokenizers' package).
The model is *small*, i.e. it has only 6 layers, and the maximum sequence length was restricted to 256 tokens.
The model was trained for 6 epochs on several GBs of comments from the StackOverflow corpus.
## Quick start: masked language modeling prediction
|
[
"# StackOBERTflow-comments-small\n\nStackOBERTflow is a RoBERTa model trained on StackOverflow comments.\nA Byte-level BPE tokenizer with dropout was used (using the 'tokenizers' package).\n\nThe model is *small*, i.e. has only 6-layers and the maximum sequence length was restricted to 256 tokens. \nThe model was trained for 6 epochs on several GBs of comments from the StackOverflow corpus.",
"## Quick start: masked language modeling prediction"
] |
[
"TAGS\n#transformers #pytorch #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n",
"# StackOBERTflow-comments-small\n\nStackOBERTflow is a RoBERTa model trained on StackOverflow comments.\nA Byte-level BPE tokenizer with dropout was used (using the 'tokenizers' package).\n\nThe model is *small*, i.e. has only 6-layers and the maximum sequence length was restricted to 256 tokens. \nThe model was trained for 6 epochs on several GBs of comments from the StackOverflow corpus.",
"## Quick start: masked language modeling prediction"
] |
[
40,
116,
11
] |
[
"passage: TAGS\n#transformers #pytorch #jax #roberta #fill-mask #autotrain_compatible #endpoints_compatible #region-us \n# StackOBERTflow-comments-small\n\nStackOBERTflow is a RoBERTa model trained on StackOverflow comments.\nA Byte-level BPE tokenizer with dropout was used (using the 'tokenizers' package).\n\nThe model is *small*, i.e. has only 6-layers and the maximum sequence length was restricted to 256 tokens. \nThe model was trained for 6 epochs on several GBs of comments from the StackOverflow corpus.## Quick start: masked language modeling prediction"
] |
[
-0.06955541670322418,
-0.09903471916913986,
-0.003654233878478408,
0.07267274707555771,
0.09424722194671631,
-0.04576202854514122,
0.09613797068595886,
0.09959447383880615,
0.025114858523011208,
0.06762509793043137,
0.18578603863716125,
0.004256033338606358,
-0.010015827603638172,
0.29722148180007935,
-0.06725448369979858,
-0.24375338852405548,
0.07863219082355499,
0.042223114520311356,
0.10103970766067505,
0.07253453135490417,
0.1147482693195343,
-0.12535004317760468,
0.07315658777952194,
-0.039175551384687424,
-0.09940898418426514,
0.014375082217156887,
-0.02191472053527832,
-0.058370091021060944,
0.02048303931951523,
0.024013781920075417,
0.0006469847867265344,
0.009586538188159466,
-0.055634792894124985,
0.007449616678059101,
0.04688232019543648,
0.06526681780815125,
0.026947777718305588,
0.07039245218038559,
-0.016746442764997482,
0.03141552209854126,
0.06745435297489166,
-0.09565592557191849,
-0.011638947762548923,
-0.0038282324094325304,
-0.01345770712941885,
-0.15292370319366455,
0.010598571039736271,
0.03887125477194786,
0.09689167886972427,
0.022380031645298004,
-0.002600292908027768,
0.2633189558982849,
-0.07843434810638428,
0.09778862446546555,
0.19527441263198853,
-0.26120346784591675,
-0.04529312625527382,
0.02795797772705555,
0.09221602976322174,
0.053314995020627975,
-0.04691781476140022,
-0.037531252950429916,
0.03166595846414566,
0.05591665208339691,
0.07918521016836166,
-0.06595373153686523,
0.012084460817277431,
-0.0692269504070282,
-0.09349654614925385,
-0.03991212695837021,
0.19426009058952332,
0.0031443736515939236,
-0.04822298139333725,
-0.15306760370731354,
-0.09071571379899979,
-0.0587618313729763,
-0.04249117523431778,
0.0011442428221926093,
-0.004831716418266296,
-0.020027292892336845,
-0.013614263385534286,
0.012907955795526505,
-0.050847601145505905,
-0.04064479470252991,
-0.07722076773643494,
0.16208647191524506,
0.012073579244315624,
0.07210728526115417,
-0.18611107766628265,
0.02259180136024952,
-0.023229137063026428,
-0.09754572063684464,
-0.017710646614432335,
-0.08049226552248001,
0.03786115348339081,
0.05643029510974884,
-0.009016970172524452,
-0.07699667662382126,
0.07384490221738815,
0.06449595838785172,
0.061120420694351196,
0.05466992408037186,
0.10390938073396683,
0.06822165846824646,
0.06703144311904907,
0.04084569960832596,
-0.01704513654112816,
-0.10592715442180634,
0.12380045652389526,
0.0008228073129430413,
0.052804019302129745,
-0.026851598173379898,
-0.1326943039894104,
0.02018432505428791,
0.07452575862407684,
0.050467751920223236,
-0.013922548852860928,
0.033982716500759125,
0.021703992038965225,
-0.0348292700946331,
0.13880302011966705,
-0.11202632635831833,
-0.008173328824341297,
0.01202545315027237,
-0.04827077314257622,
0.10507810860872269,
-0.06588321179151535,
-0.09855248034000397,
-0.10188756883144379,
-0.004497011192142963,
-0.09992498904466629,
-0.0881858840584755,
-0.10386403650045395,
-0.1585572510957718,
-0.01616763509809971,
-0.15174302458763123,
0.04128586873412132,
-0.159486785531044,
-0.22930677235126495,
0.006245845928788185,
-0.002893694443628192,
-0.03997195512056351,
-0.06566416472196579,
-0.025377683341503143,
-0.10100419074296951,
0.038088321685791016,
-0.031376879662275314,
0.1685047149658203,
-0.06315016001462936,
0.022603336721658707,
-0.026968149468302727,
0.12403549998998642,
-0.13690456748008728,
0.004993854556232691,
-0.10692337155342102,
0.010135600343346596,
-0.17608410120010376,
0.05759686231613159,
-0.0013879424659535289,
0.05228293314576149,
-0.062341030687093735,
-0.0647321492433548,
-0.019223330542445183,
0.020249467343091965,
0.023460371419787407,
0.10461851954460144,
-0.26381513476371765,
-0.061429060995578766,
0.14408712089061737,
-0.10851947963237762,
-0.0758003368973732,
0.14358820021152496,
-0.045034609735012054,
0.03790167719125748,
0.07366937398910522,
0.13460728526115417,
0.06477215886116028,
0.034571241587400436,
-0.004715389106422663,
0.09326618909835815,
-0.026241125538945198,
-0.14228662848472595,
0.022871289402246475,
0.08605151623487473,
0.023072272539138794,
0.033428166061639786,
0.04812173545360565,
0.017777545377612114,
-0.06039407104253769,
0.05358516424894333,
-0.0012124758213758469,
-0.013921800069510937,
-0.08732439577579498,
-0.028870118781924248,
0.07283003628253937,
-0.1283947080373764,
-0.07345227897167206,
-0.0006284392438828945,
0.08630701154470444,
-0.0393010713160038,
0.009044320322573185,
-0.07316091656684875,
0.11971262097358704,
-0.09807043522596359,
-0.0016411702381446958,
-0.1438230276107788,
-0.002277885563671589,
0.058162588626146317,
0.06471819430589676,
-0.06394459307193756,
0.16100598871707916,
0.05843910947442055,
-0.03168652951717377,
0.05213480070233345,
0.037732433527708054,
0.08649969846010208,
-0.022505227476358414,
-0.14606890082359314,
-0.08973579853773117,
-0.056989286094903946,
-0.11228620260953903,
-0.13910417258739471,
-0.016887737438082695,
-0.025153761729598045,
0.027196349576115608,
0.03177883103489876,
0.060239408165216446,
-0.00515162805095315,
0.031301308423280716,
-0.012711544521152973,
-0.05017678812146187,
-0.05799650028347969,
0.042826175689697266,
0.055254898965358734,
-0.004912438802421093,
0.08750366419553757,
-0.1694241315126419,
0.1031387448310852,
0.11987587809562683,
0.15214595198631287,
0.01772594265639782,
-0.03415140509605408,
-0.05838596075773239,
-0.01563890650868416,
0.0180854182690382,
-0.009297642856836319,
0.22130955755710602,
-0.011154673993587494,
0.13833209872245789,
-0.11715246737003326,
-0.04603990912437439,
0.053225744515657425,
-0.01671494171023369,
0.006283137481659651,
0.12455599009990692,
-0.05857676640152931,
-0.058861665427684784,
0.06701142340898514,
-0.02006364054977894,
0.0033101721201092005,
0.16541138291358948,
0.002499991562217474,
-0.06840405613183975,
0.05808693915605545,
-0.041140202432870865,
-0.006784411147236824,
0.05218914896249771,
-0.16706699132919312,
-0.06438811123371124,
0.09142503142356873,
0.04646420478820801,
0.07538864016532898,
-0.02441275306046009,
0.05760324373841286,
-0.05623267963528633,
-0.007962440140545368,
0.08716340363025665,
0.05503586307168007,
0.005286663770675659,
0.08540929108858109,
0.019422084093093872,
-0.13754020631313324,
0.04363987222313881,
-0.01569521240890026,
-0.022663837298750877,
0.16305647790431976,
-0.014197818003594875,
-0.29961347579956055,
-0.06455739587545395,
-0.085563525557518,
0.015565917827188969,
0.012502997182309628,
0.06718264520168304,
0.009629927575588226,
-0.07662954181432724,
-0.10826004296541214,
0.0018292984459549189,
0.002867708448320627,
0.08783603459596634,
0.0651126578450203,
-0.035764213651418686,
-0.010323663242161274,
-0.1114177405834198,
-0.031249340623617172,
-0.06195387616753578,
0.05071212351322174,
0.08096906542778015,
-0.018292739987373352,
-0.004187839571386576,
0.10786758363246918,
-0.044571686536073685,
0.04409267380833626,
-0.036159172654151917,
0.16581116616725922,
0.030970608815550804,
0.08027258515357971,
0.20332959294319153,
-0.002804514253512025,
0.023659927770495415,
0.02101915329694748,
-0.05220479890704155,
-0.07261641323566437,
0.07220626622438431,
-0.04987923428416252,
-0.1052444726228714,
-0.09929367899894714,
-0.07322733104228973,
-0.09706313163042068,
0.009455360472202301,
0.054721128195524216,
0.06220770254731178,
-0.15375299751758575,
0.07670991867780685,
0.0023771633859723806,
0.10002787411212921,
-0.0963304340839386,
0.09231780469417572,
-0.060434553772211075,
0.01172710582613945,
0.11974665522575378,
-0.034650906920433044,
-0.00932846125215292,
0.12553338706493378,
-0.06246001645922661,
0.14035794138908386,
-0.08727903664112091,
0.1494196355342865,
0.019001569598913193,
-0.04381745681166649,
0.12884284555912018,
0.1594037264585495,
-0.08782527595758438,
-0.007038358133286238,
-0.09732956439256668,
-0.02753702737390995,
-0.08253400027751923,
0.010553611442446709,
-0.07376907765865326,
0.06100517511367798,
-0.12661105394363403,
0.18693189322948456,
0.04495229572057724,
0.2568187713623047,
0.10699939727783203,
-0.2688659429550171,
-0.12697307765483856,
0.057069048285484314,
-0.07559753209352493,
-0.09543082863092422,
0.08315405994653702,
0.18424543738365173,
-0.037390679121017456,
-0.10545437037944794,
0.0023423696402460337,
0.09725561738014221,
-0.06891892105340958,
0.0690382644534111,
-0.09686259180307388,
0.1543688178062439,
0.0019312138902023435,
0.07535608857870102,
-0.20828206837177277,
0.0842844620347023,
-0.040209803730249405,
0.04404410719871521,
-0.14780519902706146,
0.002737092785537243,
-0.020190918818116188,
-0.09665635228157043,
0.08080646395683289,
-0.0006057063583284616,
-0.07216205447912216,
0.09284395724534988,
-0.19188636541366577,
0.05675188824534416,
0.007097750436514616,
0.040344297885894775,
0.03631560876965523,
-0.02390304021537304,
0.017179129645228386,
0.012852370738983154,
-0.004845777060836554,
0.07996482402086258,
-0.06671839207410812,
-0.015662802383303642,
0.12479697912931442,
-0.15709953010082245,
-0.0628676638007164,
-0.06089863181114197,
0.06352793425321579,
0.2263583093881607,
0.017123006284236908,
-0.10990770161151886,
-0.07293803989887238,
-0.0030708680860698223,
0.09361231327056885,
-0.07918071746826172,
0.02085138112306595,
-0.048781782388687134,
0.12986686825752258,
-0.004580232780426741,
-0.09564326703548431,
0.13615091145038605,
-0.14799489080905914,
-0.1129782497882843,
-0.037095822393894196,
0.0967632532119751,
0.0021291770972311497,
0.006665454246103764,
0.04091882333159447,
-0.05382484570145607,
-0.11284057796001434,
-0.07887224853038788,
-0.04222702980041504,
0.015813738107681274,
0.07346684485673904,
0.030235381796956062,
0.018590888008475304,
-0.05362769216299057,
0.0632060319185257,
0.08587324619293213,
0.14631561934947968,
0.2011236846446991,
-0.055857133120298386,
0.10063997656106949,
0.1969137340784073,
0.007620951626449823,
-0.2202179878950119,
-0.04662657901644707,
0.0013115168549120426,
0.05435468256473541,
-0.17621512711048126,
-0.0316113717854023,
0.1262570172548294,
0.028071898967027664,
0.008596559055149555,
0.15809573233127594,
-0.2347550243139267,
-0.06212978437542915,
0.21430398523807526,
0.005750727374106646,
0.45257803797721863,
-0.1158437728881836,
-0.008808627724647522,
-0.053557634353637695,
0.035466644912958145,
0.08295034617185593,
-0.08395352214574814,
0.11131796985864639,
-0.08667290955781937,
-0.014635392464697361,
-0.004055149853229523,
-0.09788219630718231,
0.04874832555651665,
0.02898668684065342,
0.07191977649927139,
-0.04678932577371597,
-0.015180777758359909,
0.04753118380904198,
-0.06572069972753525,
0.11316462606191635,
-0.0915084183216095,
0.08001423627138138,
-0.1814086139202118,
-0.03033515438437462,
-0.06874096393585205,
0.046178270131349564,
0.04146377742290497,
-0.07273706048727036,
-0.025467276573181152,
-0.0016859783791005611,
-0.02869650349020958,
0.0336553156375885,
0.092989981174469,
-0.0178131852298975,
-0.0020571467466652393,
0.18136997520923615,
0.06913706660270691,
-0.2528837025165558,
-0.05648506060242653,
0.030818307772278786,
-0.00956786796450615,
0.07945826649665833,
-0.07118392735719681,
0.02075691893696785,
0.054382048547267914,
-0.0010813252301886678,
0.04582929238677025,
0.06580702215433121,
-0.04706774279475212,
0.05584035441279411,
0.03701948747038841,
-0.1880902647972107,
0.05056888610124588,
0.031011898070573807,
0.061900924891233444,
-0.09360519796609879,
0.010588550940155983,
0.12419681996107101,
-0.05755459889769554,
-0.025914140045642853,
-0.047366004437208176,
0.043136242777109146,
0.02235223352909088,
0.10937407612800598,
0.023185675963759422,
0.07505516707897186,
-0.12674234807491302,
0.04947696253657341,
-0.02659570425748825,
-0.033849846571683884,
0.05312870442867279,
0.13450029492378235,
-0.13014551997184753,
-0.06439320743083954,
0.018205972388386726,
0.08671288937330246,
0.15088072419166565,
-0.02074653096497059,
-0.09146260470151901,
-0.12074602395296097,
0.09363728761672974,
0.09782171994447708,
0.09488659352064133,
0.029478449374437332,
-0.051834091544151306,
-0.02518138848245144,
-0.11006961762905121,
0.046759165823459625,
0.06370522081851959,
0.0356253907084465,
-0.06270312517881393,
0.11077738553285599,
-0.03284277021884918,
0.060185082256793976,
-0.07436125725507736,
-0.013117160648107529,
-0.1679186224937439,
-0.0031866778153926134,
-0.15803270041942596,
0.11044295877218246,
-0.04831993207335472,
-0.033014364540576935,
-0.01720486767590046,
0.05463415011763573,
-0.06793367862701416,
0.023490969091653824,
-0.05766009911894798,
0.030067814514040947,
0.0027171296533197165,
0.011824549175798893,
0.03536885976791382,
-0.012476911768317223,
-0.017177456989884377,
-0.06539987772703171,
0.14133816957473755,
0.07894908636808395,
-0.0738542303442955,
-0.015744639560580254,
-0.02802187018096447,
-0.04156956449151039,
0.11096656322479248,
0.0327778235077858,
0.09429711848497391,
-0.02287992276251316,
0.022041432559490204,
0.07165573537349701,
0.07449840754270554,
-0.016301661729812622,
0.05640501156449318,
-0.09903908520936966,
0.06047253683209419,
0.01713787205517292,
0.0033888535108417273,
-0.030069805681705475,
-0.03481471166014671,
0.06572414189577103,
0.1307985633611679,
0.1131797507405281,
-0.05394948273897171,
0.021809406578540802,
-0.09847207367420197,
0.002655657473951578,
-0.0064911385998129845,
-0.11050735414028168,
0.036361999809741974,
0.034966468811035156,
0.07414234429597855,
-0.029239626601338387,
0.21568629145622253,
0.06986469030380249,
-0.0747101679444313,
0.008311259560286999,
-0.06025530397891998,
-0.016476450487971306,
0.020020082592964172,
0.1348227858543396,
0.015062653459608555,
-0.02323140576481819,
0.0041349357925355434,
0.045905839651823044,
0.12803404033184052,
0.13863115012645721,
0.2310764640569687,
0.07461751252412796,
0.06639517098665237,
0.17578670382499695,
-0.08704807609319687,
0.003712730947881937,
-0.03294313699007034,
0.03396950662136078,
0.00713792908936739,
0.11926417052745819,
0.0009144468349404633,
0.024399716407060623,
0.16217203438282013,
-0.015540390275418758,
-0.0027798337396234274,
0.009685331955552101,
-0.03614398092031479,
-0.12169713526964188,
-0.10306628793478012,
-0.09964071214199066,
-0.012903989292681217,
-0.04718755558133125,
-0.1264396458864212,
-0.0678851455450058,
0.07120085507631302,
0.03461240977048874,
-0.0015480418223887682,
0.020594432950019836,
-0.04427804797887802,
-0.006098687648773193,
0.09795716404914856,
-0.01842295005917549,
-0.023762822151184082,
0.030651483684778214,
-0.021990980952978134,
-0.02186775952577591,
-0.014067226089537144,
0.03504158556461334,
0.038976188749074936,
0.07250010222196579,
0.026780683547258377,
-0.04645702242851257,
-0.09004416316747665,
-0.06531225144863129,
0.024590598419308662,
-0.010924679227173328,
0.12503118813037872,
0.06584176421165466,
-0.06123805418610573,
-0.010955522768199444,
0.14238213002681732,
-0.06407143175601959,
-0.07609006762504578,
-0.029608387500047684,
0.15073508024215698,
-0.011472498066723347,
0.03819285333156586,
-0.011142638511955738,
-0.03624391928315163,
-0.10096483677625656,
0.2209794521331787,
0.2558881938457489,
-0.05479234829545021,
0.004349493887275457,
0.013013766147196293,
-0.001546503510326147,
-0.04594900831580162,
0.16333168745040894,
0.13126179575920105,
0.23365084826946259,
-0.006175222806632519,
0.014041897840797901,
-0.06013263761997223,
-0.004251919221132994,
-0.131792351603508,
0.008098505437374115,
0.027327626943588257,
-0.025142384693026543,
-0.027624480426311493,
-0.051236141473054886,
-0.14451012015342712,
-0.12004407495260239,
-0.08897923678159714,
-0.06705625355243683,
-0.08207214623689651,
0.008409890346229076,
-0.04933077096939087,
0.06394502520561218,
0.12400265783071518,
-0.04860201105475426,
0.0368044339120388,
0.029821760952472687,
-0.038930073380470276,
-0.13442406058311462,
-0.04340016469359398,
0.09137725830078125,
-0.09284258633852005,
0.058197494596242905,
-0.0072789802215993404,
0.08717896789312363,
0.062131937593221664,
-0.015284964814782143,
-0.16412390768527985,
0.05908769741654396,
-0.05008212849497795,
0.014720343053340912,
0.07312572747468948,
0.09687194973230362,
-0.03753778338432312,
-0.014058969914913177,
0.006804355885833502,
-0.06509822607040405,
-0.08072707056999207,
-0.05947594717144966,
-0.007047218736261129,
-0.0537303201854229,
0.0028081927448511124,
-0.025204544886946678,
0.08578445762395859,
0.07910408824682236,
-0.04170670732855797,
-0.04816853255033493,
-0.020224561914801598,
0.07379033416509628,
-0.017067432403564453,
-0.06978580355644226,
-0.05466345325112343,
-0.1374368667602539,
-0.04378151893615723,
0.053570035845041275,
0.00025872921105474234,
-0.33624178171157837,
0.043271660804748535,
-0.17665116488933563,
-0.044062063097953796,
-0.07840894162654877,
0.053505171090364456,
-0.021801216527819633,
0.04686747491359711,
-0.06335388123989105,
0.13530822098255157,
-0.0029794673901051283,
0.06281593441963196,
-0.116858571767807,
-0.12451725453138351
] |
null | null |
transformers
|
## About
The *french-camembert-postag-model* is a part of speech tagging model for French that was trained on the *free-french-treebank* dataset available on
[github](https://github.com/nicolashernandez/free-french-treebank). The base tokenizer and model used for training is *'camembert-base'*.
## Supported Tags
It uses the following tags:
| Tag | Category | Extra Info |
|----------|:------------------------------:|------------:|
| ADJ | adjectif | |
| ADJWH | adjectif | |
| ADV | adverbe | |
| ADVWH | adverbe | |
| CC | conjonction de coordination | |
| CLO | pronom | obj |
| CLR | pronom | refl |
| CLS | pronom | suj |
| CS | conjonction de subordination | |
| DET | déterminant | |
| DETWH | déterminant | |
| ET | mot étranger | |
| I | interjection | |
| NC | nom commun | |
| NPP | nom propre | |
| P | préposition | |
| P+D | préposition + déterminant | |
| PONCT | signe de ponctuation | |
| PREF | préfixe | |
| PRO | autres pronoms | |
| PROREL | autres pronoms | rel |
| PROWH | autres pronoms | int |
| U | ? | |
| V | verbe | |
| VIMP | verbe imperatif | |
| VINF | verbe infinitif | |
| VPP | participe passé | |
| VPR | participe présent | |
| VS | subjonctif | |
More information on the tags can be found here:
http://alpage.inria.fr/statgram/frdep/Publications/crabbecandi-taln2008-final.pdf
## Usage
The usage of this model follows the common transformers patterns. Here is a short example of its usage:
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
tokenizer = AutoTokenizer.from_pretrained("gilf/french-camembert-postag-model")
model = AutoModelForTokenClassification.from_pretrained("gilf/french-camembert-postag-model")
from transformers import pipeline
nlp_token_class = pipeline('ner', model=model, tokenizer=tokenizer, grouped_entities=True)
nlp_token_class('Face à un choc inédit, les mesures mises en place par le gouvernement ont permis une protection forte et efficace des ménages')
```
The lines above would display something like this on a Jupyter notebook:
```
[{'entity_group': 'NC', 'score': 0.5760144591331482, 'word': '<s>'},
{'entity_group': 'U', 'score': 0.9946700930595398, 'word': 'Face'},
{'entity_group': 'P', 'score': 0.999615490436554, 'word': 'à'},
{'entity_group': 'DET', 'score': 0.9995906352996826, 'word': 'un'},
{'entity_group': 'NC', 'score': 0.9995531439781189, 'word': 'choc'},
{'entity_group': 'ADJ', 'score': 0.999183714389801, 'word': 'inédit'},
{'entity_group': 'P', 'score': 0.3710663616657257, 'word': ','},
{'entity_group': 'DET', 'score': 0.9995903968811035, 'word': 'les'},
{'entity_group': 'NC', 'score': 0.9995649456977844, 'word': 'mesures'},
{'entity_group': 'VPP', 'score': 0.9988670349121094, 'word': 'mises'},
{'entity_group': 'P', 'score': 0.9996246099472046, 'word': 'en'},
{'entity_group': 'NC', 'score': 0.9995329976081848, 'word': 'place'},
{'entity_group': 'P', 'score': 0.9996233582496643, 'word': 'par'},
{'entity_group': 'DET', 'score': 0.9995935559272766, 'word': 'le'},
{'entity_group': 'NC', 'score': 0.9995369911193848, 'word': 'gouvernement'},
{'entity_group': 'V', 'score': 0.9993771314620972, 'word': 'ont'},
{'entity_group': 'VPP', 'score': 0.9991101026535034, 'word': 'permis'},
{'entity_group': 'DET', 'score': 0.9995885491371155, 'word': 'une'},
{'entity_group': 'NC', 'score': 0.9995636343955994, 'word': 'protection'},
{'entity_group': 'ADJ', 'score': 0.9991781711578369, 'word': 'forte'},
{'entity_group': 'CC', 'score': 0.9991298317909241, 'word': 'et'},
{'entity_group': 'ADJ', 'score': 0.9992275238037109, 'word': 'efficace'},
{'entity_group': 'P+D', 'score': 0.9993300437927246, 'word': 'des'},
{'entity_group': 'NC', 'score': 0.8353511393070221, 'word': 'ménages</s>'}]
```
|
{"language": "fr", "widget": [{"text": "Face \u00e0 un choc in\u00e9dit, les mesures mises en place par le gouvernement ont permis une protection forte et efficace des m\u00e9nages"}]}
|
token-classification
|
gilf/french-camembert-postag-model
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"camembert",
"token-classification",
"fr",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"fr"
] |
TAGS
#transformers #pytorch #tf #safetensors #camembert #token-classification #fr #autotrain_compatible #endpoints_compatible #region-us
|
About
-----
The *french-camembert-postag-model* is a part of speech tagging model for French that was trained on the *free-french-treebank* dataset available on
github. The base tokenizer and model used for training is *'camembert-base'*.
Supported Tags
--------------
It uses the following tags:
More information on the tags can be found here:
URL
Usage
-----
The usage of this model follows the common transformers patterns. Here is a short example of its usage:
The lines above would display something like this on a Jupyter notebook:
|
[] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #camembert #token-classification #fr #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
49
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #camembert #token-classification #fr #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.06362611055374146,
0.017571667209267616,
-0.008775456808507442,
0.044385481625795364,
0.13044190406799316,
-0.005548839457333088,
0.058732934296131134,
0.06576350331306458,
0.044709462672472,
-0.01122544426470995,
0.12484294921159744,
0.23053795099258423,
-0.04802464693784714,
0.13453654944896698,
-0.11860436946153641,
-0.24532055854797363,
0.07648459076881409,
0.04886955767869949,
-0.05062747746706009,
0.10868494212627411,
0.1077319011092186,
-0.08753231167793274,
0.07557471096515656,
-0.036467667669057846,
-0.1327032893896103,
0.03908145800232887,
0.07124987989664078,
-0.13520337641239166,
0.12623044848442078,
0.05194937437772751,
0.1782873421907425,
0.0648605152964592,
-0.04502337425947189,
-0.10747303813695908,
0.03917361795902252,
0.025270886719226837,
-0.08239241689443588,
0.04371584951877594,
0.08058780431747437,
-0.08707211166620255,
0.017797866836190224,
0.03523222729563713,
0.029220178723335266,
0.0592363104224205,
-0.13288132846355438,
-0.14433936774730682,
-0.027117643505334854,
0.06048049405217171,
0.04599378630518913,
0.046373382210731506,
0.027667567133903503,
0.2379784882068634,
-0.14043934643268585,
0.13634280860424042,
0.08096113801002502,
-0.28545746207237244,
-0.024439703673124313,
0.08886418491601944,
0.0033411860931664705,
-0.0033185025677084923,
-0.04421962797641754,
0.031128408387303352,
0.049758538603782654,
0.017063727602362633,
0.05368990823626518,
-0.07248639315366745,
-0.13108934462070465,
0.00014538921823259443,
-0.10803396999835968,
-0.015472949482500553,
0.18655210733413696,
-0.020932655781507492,
0.021348008885979652,
0.006353069096803665,
-0.11169777065515518,
-0.01713487319648266,
-0.006882222834974527,
-0.03790057823061943,
-0.036383289843797684,
0.028389932587742805,
0.04484459012746811,
0.034683823585510254,
-0.11349282413721085,
0.021704377606511116,
-0.2234712839126587,
0.2581542432308197,
0.0289055984467268,
0.05174393206834793,
-0.16077615320682526,
0.06350181996822357,
-0.019716113805770874,
-0.09698061645030975,
0.03315819799900055,
-0.10010304301977158,
0.001259566517546773,
-0.06301672756671906,
-0.003998854663223028,
0.010588731616735458,
0.10824159532785416,
0.17600110173225403,
-0.022786486893892288,
0.032942138612270355,
-0.033168449997901917,
0.06473629176616669,
0.01635872945189476,
0.04799843579530716,
-0.0043364050798118114,
-0.005311891902238131,
0.08357718586921692,
-0.10523129999637604,
-0.014215037226676941,
-0.03031819500029087,
-0.088377445936203,
-0.023907750844955444,
0.10454508662223816,
0.10371917486190796,
-0.004985200706869364,
0.08127591013908386,
-0.05000363290309906,
-0.005505007226020098,
0.13659267127513885,
-0.07741303741931915,
-0.0034233112819492817,
0.009004971012473106,
0.03824647516012192,
0.06303376704454422,
-0.013285377062857151,
-0.002466742182150483,
-0.030185632407665253,
0.11738552153110504,
-0.06401678919792175,
-0.03474444895982742,
0.0074622975662350655,
-0.07271981984376907,
0.04645046591758728,
-0.11943280696868896,
0.05987009033560753,
-0.2139594405889511,
-0.07788790762424469,
0.036877959966659546,
0.016375241801142693,
0.013991148211061954,
-0.025336792692542076,
0.026373757049441338,
-0.032286349684000015,
-0.014570512808859348,
-0.03881838545203209,
-0.0669349804520607,
-0.06626876443624496,
0.09210781753063202,
-0.019814157858490944,
0.03876524046063423,
-0.10543444752693176,
0.02778630517423153,
-0.11877791583538055,
0.012833540327847004,
-0.1338779777288437,
-0.02785111777484417,
-0.06862340867519379,
0.16314776241779327,
-0.012353102676570415,
-0.059757299721241,
-0.08521275222301483,
0.025701427832245827,
-0.050976622849702835,
0.15661896765232086,
-0.06949539482593536,
-0.07790570706129074,
0.21057936549186707,
-0.15459378063678741,
-0.15674221515655518,
0.11065514385700226,
0.008300314657390118,
-0.007796959951519966,
0.0734487995505333,
0.11681900173425674,
0.12409395724534988,
-0.07276250422000885,
0.0678459033370018,
0.10501983761787415,
-0.14594224095344543,
-0.08798482269048691,
-0.003828526707366109,
0.012875513173639774,
-0.16650502383708954,
0.04221365228295326,
0.04350397363305092,
0.08865422755479813,
-0.08078011870384216,
-0.02733822725713253,
-0.04257694259285927,
-0.033963121473789215,
0.09337983280420303,
0.04985472932457924,
0.08020349591970444,
-0.09032513201236725,
0.01306381355971098,
-0.02955065295100212,
0.010299992747604847,
0.048672109842300415,
-0.011090838350355625,
-0.09355004131793976,
0.1349453330039978,
-0.007746487390249968,
0.005087637808173895,
-0.1731983721256256,
-0.14354446530342102,
0.010853303596377373,
0.09148823469877243,
-0.03878884017467499,
0.11572195589542389,
0.08874624222517014,
-0.0016634021885693073,
-0.02155441604554653,
-0.050347950309515,
0.1774778962135315,
0.04652408882975578,
-0.03212452679872513,
-0.11252973973751068,
0.06146645545959473,
-0.08716700226068497,
0.0339762344956398,
-0.09626006335020065,
0.014548670500516891,
0.14016549289226532,
0.17459595203399658,
0.03234758973121643,
0.08248124271631241,
-0.03169989585876465,
0.03329230099916458,
-0.06904169172048569,
-0.0036224154755473137,
0.07662565261125565,
0.001920092268846929,
-0.055797647684812546,
0.12336210906505585,
-0.1308339536190033,
0.42115142941474915,
0.1882210373878479,
-0.23610979318618774,
-0.05315368250012398,
-0.029352428391575813,
-0.027437597513198853,
0.02330762706696987,
0.03539859503507614,
0.018274541944265366,
-0.007211614865809679,
-0.016584428027272224,
0.15760670602321625,
-0.04687974974513054,
-0.06599652022123337,
0.05276944860816002,
-0.04947197064757347,
-0.05830793082714081,
0.07824619859457016,
0.04506825655698776,
-0.2357206642627716,
0.17318516969680786,
0.2396758645772934,
0.049372751265764236,
0.11955386400222778,
-0.04306212440133095,
0.0284004844725132,
0.04900685325264931,
0.03483401983976364,
-0.0043789781630039215,
0.02069486305117607,
-0.12350314855575562,
-0.009912177920341492,
0.06489381939172745,
0.01727832667529583,
0.02184484526515007,
-0.13396862149238586,
-0.04072260484099388,
0.01449321024119854,
0.03483045846223831,
0.0171182993799448,
0.11147673428058624,
0.03539341315627098,
0.12828777730464935,
-0.03206009417772293,
-0.1474633663892746,
0.10891765356063843,
0.014123176224529743,
-0.09011702984571457,
0.18626506626605988,
-0.12344850599765778,
-0.2885480523109436,
-0.07381875813007355,
-0.11170990020036697,
-0.023467136546969414,
0.03274833410978317,
0.062413301318883896,
-0.09381658583879471,
-0.05994898080825806,
0.025798019021749496,
-0.02636171691119671,
-0.030858570709824562,
0.08374885469675064,
-0.0736420527100563,
0.05585607513785362,
-0.005386417265981436,
-0.056374091655015945,
-0.07024627923965454,
-0.03996686637401581,
-0.046041056513786316,
0.15044008195400238,
-0.07692181318998337,
0.081666499376297,
0.13767413794994354,
-0.026153048500418663,
0.04207201302051544,
-0.0393269881606102,
0.19469459354877472,
-0.06966350972652435,
0.037614159286022186,
0.17907845973968506,
-0.05586590990424156,
0.07537535578012466,
0.15753693878650665,
0.03363710641860962,
-0.06792543083429337,
0.0305104348808527,
-0.022926155477762222,
-0.09813860803842545,
-0.1655048280954361,
-0.1238204762339592,
-0.09427798539400101,
0.04111691564321518,
0.035287145525217056,
0.08115915954113007,
0.15110450983047485,
0.09744445979595184,
0.023119071498513222,
-0.035312503576278687,
0.001090019359253347,
0.05339618772268295,
0.17319734394550323,
0.010082503780722618,
0.12359004467725754,
-0.05949683487415314,
-0.14725539088249207,
0.09687511622905731,
0.020753294229507446,
0.09140942245721817,
0.07521158456802368,
-0.07694052904844284,
0.040683288127183914,
0.1665838062763214,
0.1470160186290741,
0.15261128544807434,
0.008908716030418873,
-0.061327408999204636,
-0.001456002821214497,
-0.019539689645171165,
-0.0304744653403759,
-0.0023602228611707687,
0.026874879375100136,
-0.08254483342170715,
-0.0734720304608345,
-0.09985854476690292,
0.09541124850511551,
0.08074279129505157,
0.04152373597025871,
-0.21349109709262848,
0.0003014895773958415,
0.0515163317322731,
0.0013974702451378107,
-0.05941559746861458,
0.07869803160429001,
0.00509015703573823,
-0.1145319938659668,
0.09328965097665787,
-0.04906829446554184,
0.07401658594608307,
0.005168515723198652,
0.06438866257667542,
-0.00803909357637167,
-0.06686127930879593,
0.006463971920311451,
0.05577360466122627,
-0.24972257018089294,
0.24902553856372833,
0.004147936124354601,
-0.05075225606560707,
-0.07520050555467606,
-0.011801284737884998,
0.04930296912789345,
0.2438141107559204,
0.13980385661125183,
0.0175368320196867,
-0.08793454617261887,
-0.17087790369987488,
-0.001985176233574748,
0.019173666834831238,
0.09053071588277817,
-0.003165102330967784,
-0.015773627907037735,
-0.039796337485313416,
-0.033235762268304825,
0.0006942369509488344,
-0.05662950128316879,
-0.03386712446808815,
-0.09772781282663345,
0.04536934196949005,
0.032140400260686874,
0.01760030724108219,
-0.03267177939414978,
-0.047861550003290176,
-0.15123365819454193,
0.15827718377113342,
-0.07979631423950195,
-0.05661589279770851,
-0.1370471864938736,
-0.10881778597831726,
0.04865336790680885,
-0.09151102602481842,
0.08980585634708405,
-0.08919209986925125,
0.013164508156478405,
-0.04448120668530464,
-0.17750880122184753,
0.14000709354877472,
-0.1499246507883072,
-0.06243366003036499,
-0.0514833927154541,
0.158713698387146,
-0.07083042711019516,
-0.010560143738985062,
0.024124082177877426,
0.006491024978458881,
-0.031133973971009254,
-0.08328402787446976,
-0.0005458438536152244,
-0.0843631848692894,
0.020234836265444756,
0.03488759696483612,
-0.07110310345888138,
-0.10681699961423874,
0.0013486683601513505,
0.022902794182300568,
0.1910865306854248,
0.2345390021800995,
-0.0657861977815628,
0.08175484836101532,
0.14781905710697174,
0.007399903144687414,
-0.3182850480079651,
-0.0458061508834362,
-0.13546989858150482,
-0.06183964014053345,
-0.01627604104578495,
-0.06445872783660889,
0.13508924841880798,
0.03237666189670563,
-0.045135557651519775,
0.11732257902622223,
-0.1282711625099182,
-0.07741162180900574,
0.22133734822273254,
0.04877074435353279,
0.36577391624450684,
-0.10941223800182343,
-0.06403657793998718,
0.01973677985370159,
-0.12740929424762726,
0.12705132365226746,
-0.06460548937320709,
0.021793313324451447,
-0.0023258260916918516,
-0.030414285138249397,
0.044756751507520676,
-0.05969903990626335,
0.08464384824037552,
-0.0038035495672374964,
0.07639104127883911,
-0.1036725863814354,
-0.07763747125864029,
0.0321875736117363,
-0.00579633004963398,
0.017545081675052643,
0.03483330458402634,
0.04426487907767296,
-0.05791143700480461,
-0.021858863532543182,
-0.0815676897764206,
0.1192932203412056,
0.016041090711951256,
-0.0799533948302269,
0.0088246650993824,
0.0018267794512212276,
-0.010613977909088135,
-0.029338225722312927,
0.20453664660453796,
-0.005377247463911772,
0.17697672545909882,
0.14254577457904816,
0.1231478601694107,
-0.15352007746696472,
0.03818294405937195,
-0.04890616610646248,
-0.07298649847507477,
0.07269102334976196,
-0.02452518790960312,
0.0798354297876358,
0.11919349431991577,
-0.024416081607341766,
0.04665978625416756,
0.10114160925149918,
0.03718268498778343,
-0.06703893095254898,
0.1586289256811142,
-0.24095484614372253,
-0.015497246757149696,
-0.04486919194459915,
-0.04750111699104309,
0.08920010924339294,
0.13354404270648956,
0.11590586602687836,
0.020822012796998024,
-0.01404210738837719,
-0.01543489284813404,
-0.0267338789999485,
-0.03889964520931244,
0.0698942244052887,
0.093405582010746,
0.03330349549651146,
-0.11815018951892853,
0.021587014198303223,
0.012433941476047039,
-0.15932588279247284,
-0.05058617517352104,
0.08509630709886551,
-0.16866841912269592,
-0.12933950126171112,
-0.0036125709302723408,
0.07657663524150848,
-0.13174991309642792,
-0.06295182555913925,
-0.057678982615470886,
-0.16052846610546112,
0.0593118853867054,
0.27465322613716125,
0.10858530551195145,
0.1005033552646637,
0.011488920077681541,
-0.029233889654278755,
-0.03762945532798767,
0.016207963228225708,
0.006170108448714018,
0.0571235753595829,
-0.18923386931419373,
0.051336269825696945,
-0.017991557717323303,
0.14401519298553467,
-0.10265479981899261,
-0.034076761454343796,
-0.15923838317394257,
0.02146093174815178,
-0.06445380300283432,
-0.06015285477042198,
-0.06782566010951996,
-0.011602172628045082,
0.015605621039867401,
-0.09364782273769379,
-0.019780248403549194,
-0.01913166046142578,
-0.1005064845085144,
0.07489077001810074,
0.0519588403403759,
0.028141887858510017,
-0.060724757611751556,
-0.06720086932182312,
0.08750220388174057,
-0.04155813530087471,
0.11711414158344269,
0.09567010402679443,
-0.08224315196275711,
0.09077088534832001,
-0.1538355052471161,
-0.09616224467754364,
0.1397123634815216,
0.01336797047406435,
0.09342852234840393,
0.04918834567070007,
0.05104673281311989,
0.07282060384750366,
-0.0004341922758612782,
0.0621119849383831,
0.04728604853153229,
-0.10533513873815536,
0.04833422228693962,
-0.00475988769903779,
-0.1625155657529831,
-0.022172700613737106,
-0.079313263297081,
0.10790874063968658,
-0.046323470771312714,
0.1372978240251541,
-0.045963846147060394,
0.05477793142199516,
-0.0675814226269722,
-0.004709541331976652,
-0.01772671565413475,
-0.21037805080413818,
-0.08497191965579987,
-0.013681448064744473,
0.019956158474087715,
-0.0016390858218073845,
0.197564959526062,
0.01739293709397316,
0.02115379273891449,
0.05813971534371376,
0.019686467945575714,
0.01622564159333706,
0.03468502312898636,
0.15163284540176392,
0.054752472788095474,
-0.03780737891793251,
-0.10320065915584564,
0.04032575339078903,
0.007085768505930901,
-0.13370642066001892,
0.11364053189754486,
0.07774275541305542,
-0.08780261874198914,
0.0329740047454834,
0.026070045307278633,
0.015049188397824764,
-0.08411488682031631,
-0.20277762413024902,
-0.07684078812599182,
0.03461848944425583,
0.026881540194153786,
0.016895772889256477,
0.15439073741436005,
-0.017750389873981476,
0.019505031406879425,
-0.03341611474752426,
-0.01808570697903633,
-0.17990383505821228,
-0.10265718400478363,
-0.09757491201162338,
-0.06579285115003586,
0.014457541517913342,
-0.05150701478123665,
-0.041278522461652756,
0.04191206395626068,
0.07774403691291809,
-0.02980354055762291,
0.14119620621204376,
-0.0005553988739848137,
-0.007219096180051565,
0.0593140609562397,
0.011575963348150253,
0.001954448176547885,
0.0009636966278776526,
-0.03966442123055458,
-0.1535157412290573,
-0.029552804306149483,
-0.08393102139234543,
-0.011872309260070324,
-0.05140329897403717,
0.01685631461441517,
-0.10909996926784515,
-0.0888524204492569,
-0.029074776917696,
0.03791678696870804,
-0.06445673853158951,
0.0648188516497612,
0.00334253185428679,
0.024707363918423653,
0.04022468999028206,
0.14982646703720093,
-0.04226738214492798,
-0.10951677709817886,
-0.058270473033189774,
0.2594740688800812,
0.043440233916044235,
0.1226142942905426,
0.01966838724911213,
0.00033090810757130384,
-0.031258609145879745,
0.27506235241889954,
0.25854161381721497,
-0.041662298142910004,
0.08791874349117279,
0.009601621888577938,
0.019951362162828445,
0.07718589156866074,
0.11852579563856125,
0.08794962614774704,
0.2456551343202591,
-0.06175344064831734,
-0.03404027968645096,
-0.042526718229055405,
0.012015506625175476,
-0.10463365167379379,
0.06119754910469055,
0.04802814871072769,
-0.014150919392704964,
-0.0629909560084343,
0.08278125524520874,
-0.12229456007480621,
0.11215483397245407,
0.04271338880062103,
-0.160873144865036,
-0.06382962316274643,
-0.04099976271390915,
0.07382738590240479,
0.003865262260660529,
0.05617637559771538,
-0.048500318080186844,
-0.09419935941696167,
0.016513288021087646,
0.023922564461827278,
-0.17542129755020142,
-0.04648509621620178,
0.05407487973570824,
0.008295729756355286,
0.10427757352590561,
-0.016654277220368385,
0.03753582760691643,
0.09054578840732574,
0.01797448843717575,
-0.056239642202854156,
0.060847096145153046,
0.030549487099051476,
-0.0555538646876812,
-0.032358258962631226,
-0.023292163386940956,
0.025042418390512466,
-0.04911443963646889,
0.0288339015096426,
-0.18800301849842072,
0.041167598217725754,
-0.041145045310258865,
-0.055113982409238815,
-0.029605787247419357,
0.08379548788070679,
-0.0314759835600853,
0.05376433953642845,
0.06847701221704483,
0.006316922605037689,
-0.003839668817818165,
-0.07452759146690369,
0.008773808367550373,
0.05546528473496437,
-0.10795840620994568,
-0.08448527753353119,
-0.09221731871366501,
-0.017298612743616104,
0.0733518972992897,
-0.017985308542847633,
-0.10327960550785065,
-0.051263924688100815,
-0.11127585172653198,
0.035334400832653046,
-0.1617182046175003,
0.059149350970983505,
0.06659113615751266,
0.051998239010572433,
0.005091524217277765,
0.005646153353154659,
0.003239832352846861,
0.060818783938884735,
-0.09704240411520004,
-0.06998232752084732
] |
null | null |
transformers
|
# GPT-J 6B
## Model Description
GPT-J 6B is a transformer model trained using Ben Wang's [Mesh Transformer JAX](https://github.com/kingoflolz/mesh-transformer-jax/). "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters.
<figure>
| Hyperparameter | Value |
|----------------------|------------|
| \\(n_{parameters}\\) | 6053381344 |
| \\(n_{layers}\\) | 28* |
| \\(d_{model}\\) | 4096 |
| \\(d_{ff}\\) | 16384 |
| \\(n_{heads}\\) | 16 |
| \\(d_{head}\\) | 256 |
| \\(n_{ctx}\\) | 2048 |
| \\(n_{vocab}\\) | 50257/50400† (same tokenizer as GPT-2/3) |
| Positional Encoding | [Rotary Position Embedding (RoPE)](https://arxiv.org/abs/2104.09864) |
| RoPE Dimensions | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) |
<figcaption><p><strong>*</strong> Each layer consists of one feedforward block and one self attention block.</p>
<p><strong>†</strong> Although the embedding matrix has a size of 50400, only 50257 entries are used by the GPT-2 tokenizer.</p></figcaption></figure>
The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model
dimension is split into 16 heads, each with a dimension of 256. Rotary Position Embedding (RoPE) is applied to 64
dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as
GPT-2/GPT-3.
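As a quick sanity check of these figures, the corresponding fields can be read off the published configuration. This is a minimal sketch; the field names below assume the standard `transformers` GPT-J config and are not taken from this card:

```python
from transformers import AutoConfig

# Field names assume the standard transformers GPT-J configuration.
config = AutoConfig.from_pretrained("EleutherAI/gpt-j-6B")
print(config.n_layer)      # 28
print(config.n_embd)       # 4096
print(config.n_head)       # 16
print(config.rotary_dim)   # 64
print(config.n_positions)  # 2048
print(config.vocab_size)   # 50400 (only 50257 entries are used by the tokenizer)
```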
## Training data
GPT-J 6B was trained on [the Pile](https://pile.eleuther.ai), a large-scale curated dataset created by [EleutherAI](https://www.eleuther.ai).
## Training procedure
This model was trained for 402 billion tokens over 383,500 steps on a TPU v3-256 pod. It was trained as an autoregressive language model, using cross-entropy loss to maximize the likelihood of predicting the next token correctly.
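As a minimal illustration of that objective (a sketch, not the actual Mesh Transformer JAX training code), the loss amounts to shifting the targets by one position and applying cross-entropy over the vocabulary:

```python
import torch
import torch.nn.functional as F

def next_token_loss(logits: torch.Tensor, input_ids: torch.Tensor) -> torch.Tensor:
    # logits: (batch, seq_len, vocab); the prediction at position t is scored
    # against the token that actually appears at position t + 1.
    shift_logits = logits[:, :-1, :]
    shift_labels = input_ids[:, 1:]
    return F.cross_entropy(
        shift_logits.reshape(-1, shift_logits.size(-1)),
        shift_labels.reshape(-1),
    )
```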
## Intended Use and Limitations
GPT-J learns an inner representation of the English language that can be used to extract features useful for downstream tasks. However, the model is best at what it was pretrained for, which is generating text from a prompt.
### How to use
This model can be easily loaded using the `AutoModelForCausalLM` functionality:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")
```
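A short generation example built on the snippet above (the sampling settings here are illustrative choices, not values prescribed by the card):

```python
# Assumes `tokenizer` and `model` from the loading snippet above.
inputs = tokenizer("EleutherAI is a research collective that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, temperature=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```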
### Limitations and Biases
The core functionality of GPT-J is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work. When prompting GPT-J it is important to remember that the statistically most likely next token is often not the token that produces the most "accurate" text. Never depend upon GPT-J to produce factually accurate output.
GPT-J was trained on the Pile, a dataset known to contain profane, lewd, and otherwise abrasive language. Depending upon the use case, GPT-J may produce socially unacceptable text. See [Sections 5 and 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a more detailed analysis of the biases in the Pile.
As with all language models, it is hard to predict in advance how GPT-J will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
## Evaluation results
<figure>
| Model | Public | Training FLOPs | LAMBADA PPL ↓ | LAMBADA Acc ↑ | Winogrande ↑ | Hellaswag ↑ | PIQA ↑ | Dataset Size (GB) |
|--------------------------|-------------|----------------|--- |--- |--- |--- |--- |-------------------|
| Random Chance | ✓ | 0 | ~a lot | ~0% | 50% | 25% | 25% | 0 |
| GPT-3 Ada‡ | ✗ | ----- | 9.95 | 51.6% | 52.9% | 43.4% | 70.5% | ----- |
| GPT-2 1.5B | ✓ | ----- | 10.63 | 51.21% | 59.4% | 50.9% | 70.8% | 40 |
| GPT-Neo 1.3B‡ | ✓ | 3.0e21 | 7.50 | 57.2% | 55.0% | 48.9% | 71.1% | 825 |
| Megatron-2.5B* | ✗ | 2.4e21 | ----- | 61.7% | ----- | ----- | ----- | 174 |
| GPT-Neo 2.7B‡ | ✓ | 6.8e21 | 5.63 | 62.2% | 56.5% | 55.8% | 73.0% | 825 |
| GPT-3 1.3B*‡ | ✗ | 2.4e21 | 5.44 | 63.6% | 58.7% | 54.7% | 75.1% | ~800 |
| GPT-3 Babbage‡ | ✗ | ----- | 5.58 | 62.4% | 59.0% | 54.5% | 75.5% | ----- |
| Megatron-8.3B* | ✗ | 7.8e21 | ----- | 66.5% | ----- | ----- | ----- | 174 |
| GPT-3 2.7B*‡ | ✗ | 4.8e21 | 4.60 | 67.1% | 62.3% | 62.8% | 75.6% | ~800 |
| Megatron-11B† | ✓ | 1.0e22 | ----- | ----- | ----- | ----- | ----- | 161 |
| **GPT-J 6B‡** | **✓** | **1.5e22** | **3.99** | **69.7%** | **65.3%** | **66.1%** | **76.5%** | **825** |
| GPT-3 6.7B*‡ | ✗ | 1.2e22 | 4.00 | 70.3% | 64.5% | 67.4% | 78.0% | ~800 |
| GPT-3 Curie‡ | ✗ | ----- | 4.00 | 69.3% | 65.6% | 68.5% | 77.9% | ----- |
| GPT-3 13B*‡ | ✗ | 2.3e22 | 3.56 | 72.5% | 67.9% | 70.9% | 78.5% | ~800 |
| GPT-3 175B*‡ | ✗ | 3.1e23 | 3.00 | 76.2% | 70.2% | 78.9% | 81.0% | ~800 |
| GPT-3 Davinci‡ | ✗ | ----- | 3.0 | 75% | 72% | 78% | 80% | ----- |
<figcaption><p>Models roughly sorted by performance, or by FLOPs if not available.</p>
<p><strong>*</strong> Evaluation numbers reported by their respective authors. All other numbers are provided by
running <a href="https://github.com/EleutherAI/lm-evaluation-harness/"><code>lm-evaluation-harness</code></a> either with released
weights or with API access. Due to subtle implementation differences as well as different zero shot task framing, these
might not be directly comparable. See <a href="https://blog.eleuther.ai/gpt3-model-sizes/">this blog post</a> for more
details.</p>
<p><strong>†</strong> Megatron-11B provides no comparable metrics, and several implementations using the released weights do not
reproduce the generation quality and evaluations. (see <a href="https://github.com/huggingface/transformers/pull/10301">1</a>
<a href="https://github.com/pytorch/fairseq/issues/2358">2</a> <a href="https://github.com/pytorch/fairseq/issues/2719">3</a>)
Thus, evaluation was not attempted.</p>
<p><strong>‡</strong> These models have been trained with data which contains possible test set contamination. The OpenAI GPT-3 models
failed to deduplicate training data for certain test sets, while the GPT-Neo models as well as this one are
trained on the Pile, which has not been deduplicated against any test sets.</p></figcaption></figure>
## Citation and Related Information
### BibTeX entry
To cite this model:
```bibtex
@misc{gpt-j,
author = {Wang, Ben and Komatsuzaki, Aran},
title = {{GPT-J-6B: A 6 Billion Parameter Autoregressive Language Model}},
howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}},
year = 2021,
month = May
}
```
To cite the codebase that trained this model:
```bibtex
@misc{mesh-transformer-jax,
author = {Wang, Ben},
title = {{Mesh-Transformer-JAX: Model-Parallel Implementation of Transformer Language Model with JAX}},
howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}},
year = 2021,
month = May
}
```
If you use this model, we would love to hear about it! Reach out on [GitHub](https://github.com/kingoflolz/mesh-transformer-jax), Discord, or shoot Ben an email.
## Acknowledgements
This project would not have been possible without compute generously provided by Google through the
[TPU Research Cloud](https://sites.research.google/trc/), as well as the Cloud TPU team for providing early access to the [Cloud TPU VM](https://cloud.google.com/blog/products/compute/introducing-cloud-tpu-vms) Alpha.
Thanks to everyone who has helped out in one way or another (listed alphabetically):
- [James Bradbury](https://twitter.com/jekbradbury) for valuable assistance with debugging JAX issues.
- [Stella Biderman](https://www.stellabiderman.com), [Eric Hallahan](https://twitter.com/erichallahan), [Kurumuz](https://github.com/kurumuz/), and [Finetune](https://github.com/finetuneanon/) for converting the model to be compatible with the `transformers` package.
- [Leo Gao](https://twitter.com/nabla_theta) for running zero shot evaluations for the baseline models for the table.
- [Laurence Golding](https://github.com/researcher2/) for adding some features to the web demo.
- [Aran Komatsuzaki](https://twitter.com/arankomatsuzaki) for advice with experiment design and writing the blog posts.
- [Janko Prester](https://github.com/jprester/) for creating the web demo frontend.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["pytorch", "causal-lm"], "datasets": ["The Pile"]}
|
text-generation
|
gilparmentier/pokemon_gptj_model
|
[
"transformers",
"pytorch",
"gptj",
"text-generation",
"causal-lm",
"en",
"arxiv:2104.09864",
"arxiv:2101.00027",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2104.09864",
"2101.00027"
] |
[
"en"
] |
TAGS
#transformers #pytorch #gptj #text-generation #causal-lm #en #arxiv-2104.09864 #arxiv-2101.00027 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
GPT-J 6B
========
Model Description
-----------------
GPT-J 6B is a transformer model trained using Ben Wang's Mesh Transformer JAX. "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters.
**\*** Each layer consists of one feedforward block and one self attention block.
**†** Although the embedding matrix has a size of 50400, only 50257 entries are used by the GPT-2 tokenizer.
The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model
dimension is split into 16 heads, each with a dimension of 256. Rotary Position Embedding (RoPE) is applied to 64
dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as
GPT-2/GPT-3.
Training data
-------------
GPT-J 6B was trained on the Pile, a large-scale curated dataset created by EleutherAI.
Training procedure
------------------
This model was trained for 402 billion tokens over 383,500 steps on TPU v3-256 pod. It was trained as an autoregressive language model, using cross-entropy loss to maximize the likelihood of predicting the next token correctly.
Intended Use and Limitations
----------------------------
GPT-J learns an inner representation of the English language that can be used to extract features useful for downstream tasks. The model is best at what it was pretrained for however, which is generating text from a prompt.
### How to use
This model can be easily loaded using the 'AutoModelForCausalLM' functionality:
### Limitations and Biases
The core functionality of GPT-J is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work. When prompting GPT-J it is important to remember that the statistically most likely next token is often not the token that produces the most "accurate" text. Never depend upon GPT-J to produce factually accurate output.
GPT-J was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending upon use case GPT-J may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.
As with all language models, it is hard to predict in advance how GPT-J will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
Evaluation results
------------------
Models roughly sorted by performance, or by FLOPs if not available.
**\*** Evaluation numbers reported by their respective authors. All other numbers are provided by
running [for more
details.](URL either with released
weights or with API access. Due to subtle implementation differences as well as different zero shot task framing, these
might not be directly comparable. See <a href=)
**†** Megatron-11B provides no comparable metrics, and several implementations using the released weights do not
reproduce the generation quality and evaluations. (see <a href="URL
<a href="URL <a href="URL
Thus, evaluation was not attempted.</p>
**‡** These models have been trained with data which contains possible test set contamination. The OpenAI GPT-3 models
failed to deduplicate training data for certain test sets, while the GPT-Neo models as well as this one is
trained on the Pile, which has not been deduplicated against any test sets.
and Related Information
### BibTeX entry
To cite this model:
To cite the codebase that trained this model:
If you use this model, we would love to hear about it! Reach out on GitHub, Discord, or shoot Ben an email.
Acknowledgements
----------------
This project would not have been possible without compute generously provided by Google through the
TPU Research Cloud, as well as the Cloud TPU team for providing early access to the Cloud TPU VM Alpha.
Thanks to everyone who have helped out one way or another (listed alphabetically):
* James Bradbury for valuable assistance with debugging JAX issues.
* Stella Biderman, Eric Hallahan, Kurumuz, and Finetune for converting the model to be compatible with the 'transformers' package.
* Leo Gao for running zero shot evaluations for the baseline models for the table.
* Laurence Golding for adding some features to the web demo.
* Aran Komatsuzaki for advice with experiment design and writing the blog posts.
* Janko Prester for creating the web demo frontend.
|
[
"### How to use\n\n\nThis model can be easily loaded using the 'AutoModelForCausalLM' functionality:",
"### Limitations and Biases\n\n\nThe core functionality of GPT-J is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work. When prompting GPT-J it is important to remember that the statistically most likely next token is often not the token that produces the most \"accurate\" text. Never depend upon GPT-J to produce factually accurate output.\n\n\nGPT-J was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending upon use case GPT-J may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.\n\n\nAs with all language models, it is hard to predict in advance how GPT-J will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.\n\n\nEvaluation results\n------------------\n\n\n\n\nModels roughly sorted by performance, or by FLOPs if not available.\n\n\n**\\*** Evaluation numbers reported by their respective authors. All other numbers are provided by\nrunning [for more\ndetails.](URL either with released\nweights or with API access. Due to subtle implementation differences as well as different zero shot task framing, these\nmight not be directly comparable. See <a href=)\n\n\n**†** Megatron-11B provides no comparable metrics, and several implementations using the released weights do not\nreproduce the generation quality and evaluations. (see <a href=\"URL\n<a href=\"URL <a href=\"URL\nThus, evaluation was not attempted.</p>\n**‡** These models have been trained with data which contains possible test set contamination. The OpenAI GPT-3 models\nfailed to deduplicate training data for certain test sets, while the GPT-Neo models as well as this one is\ntrained on the Pile, which has not been deduplicated against any test sets.\n\n\n\n\nand Related Information",
"### BibTeX entry\n\n\nTo cite this model:\n\n\nTo cite the codebase that trained this model:\n\n\nIf you use this model, we would love to hear about it! Reach out on GitHub, Discord, or shoot Ben an email.\n\n\nAcknowledgements\n----------------\n\n\nThis project would not have been possible without compute generously provided by Google through the\nTPU Research Cloud, as well as the Cloud TPU team for providing early access to the Cloud TPU VM Alpha.\n\n\nThanks to everyone who have helped out one way or another (listed alphabetically):\n\n\n* James Bradbury for valuable assistance with debugging JAX issues.\n* Stella Biderman, Eric Hallahan, Kurumuz, and Finetune for converting the model to be compatible with the 'transformers' package.\n* Leo Gao for running zero shot evaluations for the baseline models for the table.\n* Laurence Golding for adding some features to the web demo.\n* Aran Komatsuzaki for advice with experiment design and writing the blog posts.\n* Janko Prester for creating the web demo frontend."
] |
[
"TAGS\n#transformers #pytorch #gptj #text-generation #causal-lm #en #arxiv-2104.09864 #arxiv-2101.00027 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### How to use\n\n\nThis model can be easily loaded using the 'AutoModelForCausalLM' functionality:",
"### Limitations and Biases\n\n\nThe core functionality of GPT-J is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work. When prompting GPT-J it is important to remember that the statistically most likely next token is often not the token that produces the most \"accurate\" text. Never depend upon GPT-J to produce factually accurate output.\n\n\nGPT-J was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending upon use case GPT-J may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.\n\n\nAs with all language models, it is hard to predict in advance how GPT-J will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.\n\n\nEvaluation results\n------------------\n\n\n\n\nModels roughly sorted by performance, or by FLOPs if not available.\n\n\n**\\*** Evaluation numbers reported by their respective authors. All other numbers are provided by\nrunning [for more\ndetails.](URL either with released\nweights or with API access. Due to subtle implementation differences as well as different zero shot task framing, these\nmight not be directly comparable. See <a href=)\n\n\n**†** Megatron-11B provides no comparable metrics, and several implementations using the released weights do not\nreproduce the generation quality and evaluations. (see <a href=\"URL\n<a href=\"URL <a href=\"URL\nThus, evaluation was not attempted.</p>\n**‡** These models have been trained with data which contains possible test set contamination. The OpenAI GPT-3 models\nfailed to deduplicate training data for certain test sets, while the GPT-Neo models as well as this one is\ntrained on the Pile, which has not been deduplicated against any test sets.\n\n\n\n\nand Related Information",
"### BibTeX entry\n\n\nTo cite this model:\n\n\nTo cite the codebase that trained this model:\n\n\nIf you use this model, we would love to hear about it! Reach out on GitHub, Discord, or shoot Ben an email.\n\n\nAcknowledgements\n----------------\n\n\nThis project would not have been possible without compute generously provided by Google through the\nTPU Research Cloud, as well as the Cloud TPU team for providing early access to the Cloud TPU VM Alpha.\n\n\nThanks to everyone who have helped out one way or another (listed alphabetically):\n\n\n* James Bradbury for valuable assistance with debugging JAX issues.\n* Stella Biderman, Eric Hallahan, Kurumuz, and Finetune for converting the model to be compatible with the 'transformers' package.\n* Leo Gao for running zero shot evaluations for the baseline models for the table.\n* Laurence Golding for adding some features to the web demo.\n* Aran Komatsuzaki for advice with experiment design and writing the blog posts.\n* Janko Prester for creating the web demo frontend."
] |
[
70,
26,
493,
228
] |
[
"passage: TAGS\n#transformers #pytorch #gptj #text-generation #causal-lm #en #arxiv-2104.09864 #arxiv-2101.00027 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### How to use\n\n\nThis model can be easily loaded using the 'AutoModelForCausalLM' functionality:"
] |
[
-0.11835873872041702,
0.10682331770658493,
-0.002850749995559454,
0.0836348608136177,
0.13448700308799744,
0.028397077694535255,
0.15984046459197998,
0.10086438059806824,
-0.011290029622614384,
-0.08193451166152954,
0.20230264961719513,
0.28165391087532043,
-0.011612282134592533,
0.09340041875839233,
-0.012473366223275661,
-0.19713658094406128,
0.05119870603084564,
0.04272010922431946,
-0.03725983202457428,
0.09632375836372375,
0.13327711820602417,
-0.011265184730291367,
0.10425441712141037,
0.03767145797610283,
-0.12012605369091034,
0.010106545872986317,
0.021287601441144943,
-0.0874677300453186,
0.10230248421430588,
0.08665767312049866,
-0.02956571616232395,
0.043498191982507706,
0.05928967893123627,
-0.10378353297710419,
0.009349149651825428,
-0.013429262675344944,
-0.037673868238925934,
0.0837622731924057,
0.054932624101638794,
0.012084399349987507,
0.13566942512989044,
0.08808135241270065,
-0.0510878711938858,
0.019306516274809837,
-0.05980720371007919,
-0.10482432693243027,
-0.05062931030988693,
0.058460310101509094,
0.09009743481874466,
0.12306709587574005,
0.04757358133792877,
0.13410405814647675,
-0.0833914652466774,
0.08680199831724167,
0.13640230894088745,
-0.3141954839229584,
-0.003997656051069498,
0.11832042783498764,
-0.003575138049200177,
-0.05605553463101387,
0.05621238052845001,
0.031382862478494644,
0.043645571917295456,
0.007847163826227188,
0.03406132012605667,
-0.08968830108642578,
-0.11450250446796417,
0.02947785146534443,
-0.08225185424089432,
-0.061980415135622025,
0.30745092034339905,
-0.029569676145911217,
-0.00229020812548697,
-0.015236634761095047,
-0.0552775003015995,
0.09682611376047134,
-0.03586442396044731,
0.032630644738674164,
-0.002951181260868907,
0.09009866416454315,
0.08637088537216187,
-0.07008059322834015,
-0.11605420708656311,
-0.04641002044081688,
-0.12012705951929092,
0.08143306523561478,
0.010695001110434532,
0.08760976791381836,
-0.11561694741249084,
0.10924571752548218,
-0.05945725366473198,
-0.0904734879732132,
0.002234636340290308,
-0.06876397877931595,
0.11888940632343292,
0.025633802637457848,
-0.0296354740858078,
-0.019222524017095566,
0.08651165664196014,
0.15797367691993713,
0.07556028664112091,
-0.007997654378414154,
0.06364557147026062,
0.0888042002916336,
-0.01286181341856718,
0.0948450118303299,
-0.07841207832098007,
-0.049545157700777054,
0.1274704486131668,
-0.04393985867500305,
0.06365209072828293,
-0.01164208073168993,
-0.18908560276031494,
-0.08246148377656937,
-0.032088976353406906,
0.1014685332775116,
0.045517295598983765,
0.07589909434318542,
-0.019949225708842278,
-0.05217723548412323,
0.17971749603748322,
-0.06954476982355118,
-0.003114549443125725,
-0.03785672038793564,
-0.021435683593153954,
0.03244129195809364,
0.12556830048561096,
0.017678873613476753,
-0.07500908523797989,
0.01685035042464733,
-0.047459717839956284,
-0.004671772010624409,
-0.06864766031503677,
-0.03374146297574043,
0.025388021022081375,
-0.007510778959840536,
0.05986914783716202,
-0.1559217870235443,
-0.28077271580696106,
0.01802590675652027,
0.09678936749696732,
-0.01044422946870327,
-0.14900153875350952,
-0.007508326321840286,
0.011196098290383816,
-0.0005853918264620006,
-0.07010101526975632,
0.09489157795906067,
-0.060935962945222855,
0.041502561420202255,
-0.04118747636675835,
0.05819639563560486,
-0.16054292023181915,
0.06134941056370735,
-0.10865064710378647,
-0.004395339637994766,
-0.06387557834386826,
0.023234419524669647,
-0.03184925392270088,
0.11828028410673141,
-0.05580819025635719,
-0.021432414650917053,
0.03460480272769928,
0.010349159128963947,
0.049788907170295715,
0.13563328981399536,
-0.0817376971244812,
-0.05694979056715965,
0.10571709275245667,
-0.09541473537683487,
-0.20292095839977264,
0.03995085507631302,
0.014953340403735638,
0.14357629418373108,
0.07002599537372589,
0.05749589577317238,
0.07195813208818436,
-0.03199270740151405,
0.10966325551271439,
0.10059921443462372,
-0.11673727631568909,
-0.1809401661157608,
0.023227455094456673,
-0.013881220482289791,
-0.20181411504745483,
0.0835387110710144,
-0.046522434800863266,
0.1570766717195511,
-0.002333533251658082,
-0.05855325609445572,
-0.07635900378227234,
-0.11534475535154343,
-0.050044622272253036,
0.006329620257019997,
0.06626953184604645,
0.015472369268536568,
0.010291563346982002,
-0.03197691962122917,
0.08744675666093826,
0.000043915806600125507,
-0.0020959689281880856,
-0.047529447823762894,
0.05123915523290634,
-0.09155258536338806,
0.07183276861906052,
-0.13838623464107513,
0.01595098339021206,
-0.01334976777434349,
0.013705804012715816,
0.046685799956321716,
0.014006557874381542,
0.05436720699071884,
0.0077024661004543304,
0.0111601697281003,
-0.0022086615208536386,
0.163035050034523,
0.022295966744422913,
-0.0648278146982193,
-0.1155245378613472,
0.025833118706941605,
-0.022698314860463142,
0.08287644386291504,
-0.04961337894201279,
0.02744140475988388,
-0.11361584067344666,
0.025957413017749786,
-0.07488079369068146,
0.09124189615249634,
0.02439967356622219,
-0.01574082113802433,
-0.047163695096969604,
-0.016758739948272705,
0.09950657188892365,
0.05258376523852348,
-0.04831130430102348,
0.19953054189682007,
-0.09059736877679825,
0.21359895169734955,
0.15641409158706665,
-0.13210533559322357,
0.018574977293610573,
-0.015762021765112877,
-0.0399041585624218,
-0.017312534153461456,
0.007690963335335255,
0.04366595670580864,
0.05681294575333595,
0.011373949237167835,
0.15829071402549744,
-0.05593090131878853,
0.0014234264381229877,
-0.0036131320521235466,
-0.09101514518260956,
0.03607670217752457,
0.06650220602750778,
0.2123294323682785,
-0.13489913940429688,
0.12130627781152725,
0.2484579235315323,
-0.06043547764420509,
0.049009598791599274,
0.0027243089862167835,
-0.024316415190696716,
0.006885624025017023,
-0.05409153550863266,
-0.0200066976249218,
0.02430313266813755,
-0.11545982211828232,
0.016731806099414825,
0.10298565030097961,
-0.03627552092075348,
0.08884487301111221,
-0.11466098576784134,
-0.037384748458862305,
0.020630503073334694,
-0.0178759153932333,
0.005668579135090113,
0.08324778825044632,
-0.03078623116016388,
0.09670968353748322,
-0.059507403522729874,
-0.1030869111418724,
0.09694425761699677,
0.04626161977648735,
-0.11283320188522339,
0.159587562084198,
-0.12092845141887665,
-0.23976749181747437,
-0.14214445650577545,
-0.10718092322349548,
-0.10188575834035873,
0.020424582064151764,
0.09745137393474579,
-0.06579265743494034,
-0.0947728380560875,
-0.019124969840049744,
-0.08501385897397995,
0.056319475173950195,
-0.01754310540854931,
0.010758140124380589,
0.01809859462082386,
0.009857462719082832,
-0.12291926145553589,
-0.05812862887978554,
0.014903758652508259,
-0.03443613275885582,
0.08775810897350311,
-0.09653301537036896,
0.08126289397478104,
0.15094853937625885,
0.03254055231809616,
0.05032864212989807,
0.04178256168961525,
0.2093724012374878,
-0.017856251448392868,
0.02065540850162506,
0.2526920437812805,
-0.004977460950613022,
0.05720337852835655,
0.11793810874223709,
0.020048337057232857,
-0.06976567953824997,
0.006676976103335619,
-0.0768413320183754,
-0.07622537761926651,
-0.20154701173305511,
-0.13741549849510193,
-0.08946818113327026,
0.02741018682718277,
0.08551978319883347,
0.07542990148067474,
0.14345061779022217,
0.14909572899341583,
0.020657625049352646,
0.08234953135251999,
-0.0014710775576531887,
0.0817771703004837,
0.20636120438575745,
-0.018925456330180168,
0.12330307811498642,
-0.09944140166044235,
-0.0676172599196434,
0.12154202163219452,
0.022042563185095787,
0.12749886512756348,
0.04940702021121979,
0.010724437423050404,
0.06198258325457573,
0.12133312225341797,
0.030521290376782417,
0.17402487993240356,
-0.02283463254570961,
-0.031975895166397095,
-0.03552378714084625,
-0.043876633048057556,
-0.062474340200424194,
0.07161210477352142,
-0.09431403130292892,
-0.11632230132818222,
-0.031330354511737823,
0.02380961738526821,
0.02005777135491371,
0.12436690181493759,
0.02210994064807892,
-0.247830331325531,
0.000569407653529197,
0.02855983003973961,
0.028108572587370872,
-0.06435885280370712,
0.030588803812861443,
-0.11063193529844284,
-0.12692253291606903,
0.054222822189331055,
-0.0052392794750630856,
0.14365626871585846,
-0.010262765921652317,
0.03318396955728531,
-0.03145672008395195,
0.0033542511519044638,
0.0323827788233757,
0.17205417156219482,
-0.31065890192985535,
0.19913417100906372,
0.00691926758736372,
-0.009138659574091434,
-0.12348304688930511,
0.03239567577838898,
0.08884251862764359,
0.20299845933914185,
0.10024507343769073,
0.009542117826640606,
-0.04246971011161804,
-0.05535571277141571,
-0.07558462768793106,
0.06408477574586868,
-0.0720568522810936,
0.007336003240197897,
-0.04207362234592438,
-0.04470352828502655,
-0.032048389315605164,
0.0050720893777906895,
0.030695589259266853,
-0.11696676164865494,
-0.15783727169036865,
0.06556129455566406,
0.12504759430885315,
0.013680360279977322,
-0.028739523142576218,
-0.012501337565481663,
-0.07349526137113571,
0.23065412044525146,
0.09453345090150833,
-0.11466635763645172,
-0.09196051955223083,
-0.06916172802448273,
0.07038997858762741,
-0.08995500206947327,
0.03511793911457062,
-0.06459477543830872,
0.012525217607617378,
-0.039455629885196686,
-0.18003584444522858,
0.0758054181933403,
-0.14668630063533783,
-0.007165929768234491,
-0.009118587709963322,
0.046278610825538635,
-0.07488738745450974,
-0.010796661488711834,
0.076527900993824,
-0.008374682627618313,
-0.12489169090986252,
-0.1418396532535553,
-0.06075673922896385,
0.07001575082540512,
0.029091689735651016,
-0.04462655633687973,
-0.0822787657380104,
-0.012349444441497326,
0.020159399136900902,
-0.059341296553611755,
0.19430553913116455,
0.15278850495815277,
-0.06487846374511719,
0.12926052510738373,
0.18112295866012573,
-0.07050243020057678,
-0.2807278037071228,
-0.17191287875175476,
-0.07466720789670944,
-0.04100330173969269,
-0.0368259996175766,
-0.14640718698501587,
0.15224817395210266,
0.033833060413599014,
-0.06752205640077591,
0.0832279622554779,
-0.21519148349761963,
-0.11041643470525742,
0.26542168855667114,
0.020405102521181107,
0.27844318747520447,
-0.09329714626073837,
-0.04938202351331711,
-0.11739158630371094,
-0.13531538844108582,
0.13611994683742523,
-0.07300425320863724,
0.06027743220329285,
-0.04600420594215393,
0.09538818150758743,
0.011015405878424644,
-0.07362135499715805,
0.10541506111621857,
0.014041684567928314,
-0.009698761627078056,
-0.12456508725881577,
0.014416780322790146,
0.0426209419965744,
-0.022423304617404938,
0.188200905919075,
-0.11607349663972855,
0.05236593633890152,
-0.123783178627491,
-0.04232111945748329,
-0.07239893823862076,
0.05035290867090225,
0.024010010063648224,
-0.0777537152171135,
-0.00351125281304121,
-0.05689099803566933,
-0.00426630862057209,
0.0004845744406338781,
0.03133980929851532,
0.02029711753129959,
-0.02236969955265522,
0.16510894894599915,
0.08644789457321167,
-0.2392311990261078,
-0.012032603845000267,
-0.05035020038485527,
-0.053963351994752884,
0.07539908587932587,
-0.1566033512353897,
0.03974944353103638,
0.06743326783180237,
-0.05252176150679588,
0.080450639128685,
0.04580400139093399,
0.02705203928053379,
-0.05129208415746689,
0.09457193315029144,
-0.1957828849554062,
0.10711243003606796,
-0.05738357454538345,
0.09276438504457474,
0.06441765278577805,
0.013017704710364342,
0.11743146926164627,
-0.021473534405231476,
-0.04014861583709717,
0.024639200419187546,
0.029611272737383842,
-0.06623830646276474,
0.09086316078901291,
0.06850374490022659,
-0.011529327370226383,
-0.1417563557624817,
0.06279643625020981,
0.007636889815330505,
-0.044160228222608566,
-0.023751232773065567,
0.08374578505754471,
-0.14406614005565643,
-0.1302536576986313,
-0.0026062042452394962,
0.13903599977493286,
-0.24718278646469116,
-0.11024577915668488,
-0.04342033714056015,
-0.07454583048820496,
0.05965345725417137,
-0.04785352572798729,
0.08787690103054047,
0.010528718121349812,
-0.06072181835770607,
-0.07980933785438538,
-0.048180919140577316,
0.02314828895032406,
0.048316992819309235,
0.030480505898594856,
-0.07226230949163437,
-0.029440665617585182,
0.013904592953622341,
0.1329117864370346,
-0.04828597232699394,
-0.00807257555425167,
-0.08271477371454239,
0.029766034334897995,
-0.16761066019535065,
-0.023535309359431267,
-0.12756094336509705,
-0.0150812529027462,
0.025612760335206985,
-0.03350968286395073,
-0.018890822306275368,
0.024868983775377274,
-0.11208935081958771,
-0.024407921358942986,
-0.01844775304198265,
0.03590868413448334,
-0.08358070254325867,
-0.06496430188417435,
0.04953592270612717,
-0.013043106533586979,
0.08951489627361298,
0.09472812712192535,
-0.05593834072351456,
0.02708231285214424,
-0.12185696512460709,
-0.07212118059396744,
0.06572095304727554,
0.05886046960949898,
0.04261736571788788,
-0.061406105756759644,
0.026333265006542206,
0.11311865597963333,
-0.012504043988883495,
-0.006429455243051052,
0.050850048661231995,
-0.10111143440008163,
0.02226421982049942,
0.012216605246067047,
-0.09868647903203964,
-0.015458312816917896,
-0.03997707739472389,
0.04747634381055832,
0.051812540739774704,
0.1357453465461731,
-0.057187799364328384,
0.04793868958950043,
-0.06617368012666702,
0.023314323276281357,
-0.0361735001206398,
-0.11126599460840225,
-0.13999749720096588,
-0.09292580932378769,
-0.009191961027681828,
-0.004646399524062872,
0.22873593866825104,
0.033503320068120956,
0.008869375102221966,
-0.005420810077339411,
0.1427110880613327,
0.11693001538515091,
-0.04112594574689865,
0.18099132180213928,
0.04600955918431282,
0.061402853578329086,
-0.08316465467214584,
0.10501789301633835,
0.04501139000058174,
-0.06293465942144394,
0.028455136343836784,
-0.06004321575164795,
0.028391685336828232,
0.08122903853654861,
0.00943363830447197,
-0.033641085028648376,
-0.062150925397872925,
-0.11473415046930313,
-0.0131229218095541,
0.06800209730863571,
-0.014901057817041874,
0.09957042336463928,
0.12549613416194916,
-0.0021193521097302437,
0.010260011069476604,
0.01859048195183277,
-0.034479912370443344,
-0.17148441076278687,
-0.17841364443302155,
-0.08173108845949173,
-0.1414320468902588,
-0.017958423122763634,
-0.07251622527837753,
0.0029927457217127085,
-0.011325445957481861,
0.029194923117756844,
-0.04649269953370094,
0.031235070899128914,
-0.009949462488293648,
-0.03468402475118637,
-0.021171187981963158,
-0.045998912304639816,
-0.007825812324881554,
-0.044333502650260925,
0.03130536526441574,
-0.09602458029985428,
0.001322574564255774,
0.012322711758315563,
0.04412032291293144,
0.012963107787072659,
0.042330071330070496,
-0.025860754773020744,
-0.03152494505047798,
-0.09026238322257996,
0.026062877848744392,
0.01571333222091198,
0.13054946064949036,
-0.004396302159875631,
0.036535926163196564,
0.05172736942768097,
0.1272173523902893,
-0.05428940802812576,
-0.16425970196723938,
-0.08541563898324966,
0.22104328870773315,
-0.0070645688101649284,
0.03734443336725235,
-0.002494765678420663,
0.05903562158346176,
-0.028064550831913948,
0.35606080293655396,
0.26519984006881714,
-0.10708588361740112,
0.009933151304721832,
-0.020694999024271965,
0.012573782354593277,
0.026057712733745575,
0.1263994574546814,
0.10741213709115982,
0.19151313602924347,
-0.07764769345521927,
-0.0027661507483571768,
-0.09065965563058853,
-0.00531063973903656,
-0.11743412911891937,
0.06011126562952995,
0.015140819363296032,
-0.09023639559745789,
0.03230658918619156,
0.09655725210905075,
-0.09822908043861389,
0.2171778678894043,
-0.09816792607307434,
-0.020814169198274612,
-0.07511711865663528,
0.007831359282135963,
0.05871791020035744,
0.009503476321697235,
0.04433993995189667,
-0.06351764500141144,
0.012472722679376602,
0.14563710987567902,
-0.03400716558098793,
-0.23424436151981354,
-0.06246502324938774,
0.08082758635282516,
0.016200978308916092,
0.1618577241897583,
0.01157606951892376,
0.07869075238704681,
0.05903826653957367,
0.03491009399294853,
-0.15434134006500244,
0.057226285338401794,
-0.012193533591926098,
-0.07758592069149017,
0.07802433520555496,
-0.1229323223233223,
-0.018228359520435333,
-0.11137751489877701,
-0.009394247084856033,
-0.10622391849756241,
0.03639669716358185,
0.06685368716716766,
-0.02426503598690033,
-0.031965214759111404,
0.04192904010415077,
-0.07224708795547485,
0.07812051475048065,
0.037035197019577026,
-0.03214224800467491,
-0.061660222709178925,
-0.09519797563552856,
0.05621429532766342,
0.009681515395641327,
-0.16568028926849365,
-0.06144437938928604,
-0.02056412398815155,
-0.022068476304411888,
0.012720883823931217,
0.02561783604323864,
-0.09361434727907181,
-0.039356593042612076,
-0.08416986465454102,
-0.046693433076143265,
-0.10256350785493851,
0.05820683017373085,
0.0708848312497139,
0.024358833208680153,
0.012446936219930649,
-0.026178710162639618,
0.006826853379607201,
0.019999846816062927,
-0.10606800764799118,
-0.1372128129005432
] |
null | null |
transformers
|
# Jake Peralta DialoGPT model
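The card itself provides no description or usage notes. As a minimal, hypothetical sketch, the model can presumably be queried with the usual DialoGPT single-turn chat pattern:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical usage following the common DialoGPT pattern; settings are illustrative.
tokenizer = AutoTokenizer.from_pretrained("gizmo-dev/DialoGPT-small-jake")
model = AutoModelForCausalLM.from_pretrained("gizmo-dev/DialoGPT-small-jake")

# Encode a user message, append the end-of-string token, and generate a reply.
input_ids = tokenizer.encode("Hey Jake, how is the squad?" + tokenizer.eos_token, return_tensors="pt")
reply_ids = model.generate(input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True))
```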
|
{"tags": ["conversational"]}
|
text-generation
|
gizmo-dev/DialoGPT-small-jake
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Jake Peralta DialoGPT model
|
[
"# Jake Peralta DialoGPT model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Jake Peralta DialoGPT model"
] |
[
51,
10
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Jake Peralta DialoGPT model"
] |
[
-0.04298555478453636,
0.10136187821626663,
-0.0059355478733778,
0.023458806797862053,
0.15050049126148224,
-0.012076964601874352,
0.13418863713741302,
0.11065901815891266,
-0.02886582538485527,
-0.016496188938617706,
0.11545886844396591,
0.15222607553005219,
-0.01042808499187231,
0.12121753394603729,
-0.03615272417664528,
-0.3176003694534302,
0.04649190977215767,
0.04038818180561066,
0.05800745263695717,
0.13377194106578827,
0.10655084252357483,
-0.04757174849510193,
0.08119609951972961,
0.005306106526404619,
-0.14459668099880219,
0.00021302691311575472,
-0.002311790594831109,
-0.12315580248832703,
0.11502323299646378,
0.03613676875829697,
0.039236389100551605,
0.017177585512399673,
-0.04857926815748215,
-0.09423349797725677,
0.041340071707963943,
0.01417496707290411,
-0.018345097079873085,
0.04379844665527344,
0.04751023277640343,
-0.11250323057174683,
0.12239930778741837,
0.13572491705417633,
0.011679002083837986,
0.03646298870444298,
-0.16470831632614136,
-0.012030010111629963,
0.03701389953494072,
0.08190467953681946,
0.036379944533109665,
0.11535214632749557,
-0.049459535628557205,
0.08788086473941803,
-0.05926137790083885,
0.10433950275182724,
0.07203782349824905,
-0.32745158672332764,
-0.024406375363469124,
0.06856773793697357,
0.04922378435730934,
0.07733584195375443,
-0.04231633245944977,
0.047220926731824875,
0.042312171310186386,
-0.006838374305516481,
-0.03365560993552208,
-0.06040613725781441,
-0.035531025379896164,
-0.01498014759272337,
-0.09120869636535645,
-0.007918739691376686,
0.2704625427722931,
-0.043231721967458725,
0.05625517666339874,
-0.0637894868850708,
-0.10434090346097946,
-0.05414203926920891,
-0.026351310312747955,
-0.035444021224975586,
-0.07682150602340698,
0.06908891350030899,
0.011877971701323986,
-0.09419941157102585,
-0.1168891191482544,
-0.03763599693775177,
-0.20703841745853424,
0.1266908049583435,
0.03871186450123787,
0.014183767139911652,
-0.20632688701152802,
0.10555464029312134,
-0.015528964810073376,
-0.10615944117307663,
0.03611199930310249,
-0.10834860801696777,
0.0024248294066637754,
0.013001171872019768,
-0.03140657767653465,
-0.038464270532131195,
0.05692864954471588,
0.11675795167684555,
0.058157071471214294,
0.012764752842485905,
-0.02586057037115097,
0.07299600541591644,
0.06369738280773163,
0.0796733871102333,
-0.02442863956093788,
-0.023842208087444305,
0.03684912249445915,
-0.07972120493650436,
-0.018039384856820107,
-0.06940211355686188,
-0.1557547152042389,
0.00020576291717588902,
0.10621248185634613,
0.05211644992232323,
0.05099749192595482,
0.10729647427797318,
-0.0018645593663677573,
-0.06019258871674538,
-0.015660684555768967,
-0.04065718501806259,
-0.025270026177167892,
0.0008854937041178346,
-0.005240560509264469,
0.2015400230884552,
0.0025314604863524437,
0.04578527808189392,
-0.11599339544773102,
0.023895036429166794,
-0.059931330382823944,
-0.011453675106167793,
-0.03543579578399658,
-0.057603269815444946,
0.000707716797478497,
-0.02786852791905403,
0.01731291599571705,
-0.16229133307933807,
-0.16213026642799377,
-0.021443229168653488,
-0.003970601130276918,
-0.05356654152274132,
-0.13206684589385986,
-0.10672669857740402,
0.018326902762055397,
0.02358097769320011,
-0.08935379981994629,
-0.05402921885251999,
-0.062318816781044006,
0.08433587849140167,
-0.04471837729215622,
0.08305468410253525,
-0.05780473351478577,
0.08399676531553268,
-0.10273100435733795,
-0.015822727233171463,
-0.07156233489513397,
0.1076432392001152,
0.01901988871395588,
0.08409704267978668,
-0.04133138060569763,
-0.017937755212187767,
-0.11310657858848572,
0.07265522330999374,
-0.06029794365167618,
0.22498933970928192,
-0.10714884847402573,
-0.11964939534664154,
0.29229506850242615,
-0.08742759376764297,
-0.10811799019575119,
0.11763471364974976,
0.000504947267472744,
0.13927243649959564,
0.14044052362442017,
0.22320836782455444,
0.01970018446445465,
0.02038135565817356,
0.09181592613458633,
0.08886236697435379,
-0.0604044608771801,
0.013005975633859634,
0.02323850430548191,
-0.039083026349544525,
-0.07103490829467773,
0.0248895063996315,
0.08090338855981827,
0.059127893298864365,
-0.05959499627351761,
-0.016550494357943535,
0.013802150264382362,
-0.013789743185043335,
0.06377117335796356,
-0.03374120220541954,
0.12979340553283691,
-0.04039977490901947,
-0.0628625676035881,
-0.028993653133511543,
0.012980425730347633,
-0.06764044612646103,
0.04407095909118652,
-0.08874497562646866,
0.06840088218450546,
0.004255861043930054,
0.06218740716576576,
-0.1445692479610443,
-0.04169581085443497,
-0.04430694133043289,
0.15570352971553802,
0.040736354887485504,
0.14102549850940704,
0.057793837040662766,
-0.030721962451934814,
0.008579961955547333,
0.02429434098303318,
0.1480230689048767,
-0.007897470146417618,
-0.05249695107340813,
-0.10888484865427017,
0.10073034465312958,
-0.06753899157047272,
0.11547072976827621,
-0.01808004081249237,
0.006116639357060194,
0.004359913989901543,
0.10444151610136032,
-0.02401440218091011,
0.011231614276766777,
0.0013697913382202387,
-0.005007575731724501,
-0.048000410199165344,
-0.003879216033965349,
0.10700073093175888,
-0.008069772273302078,
-0.04925675690174103,
0.2323746234178543,
-0.19553478062152863,
0.1379278004169464,
0.20838254690170288,
-0.1873226761817932,
0.00014174649550113827,
-0.09831655770540237,
-0.017487086355686188,
0.011394851841032505,
0.09616435319185257,
-0.03253386542201042,
0.2933792173862457,
0.010606176219880581,
0.17740510404109955,
-0.03658280521631241,
-0.012515072710812092,
-0.03076929599046707,
-0.056973494589328766,
0.017125669866800308,
0.11544191837310791,
0.09075169265270233,
-0.1223994717001915,
0.1784558743238449,
0.09783636778593063,
0.04978015273809433,
0.1655757576227188,
0.022342287003993988,
-0.0001255945535376668,
0.08100220561027527,
0.019341597333550453,
-0.031859152019023895,
-0.08460375666618347,
-0.29854291677474976,
-0.04637927561998367,
0.07266711443662643,
0.05201587453484535,
0.11484210193157196,
-0.08767686784267426,
-0.025714801624417305,
0.011682303622364998,
0.0005547069013118744,
0.050322070717811584,
0.11674252152442932,
0.01409465167671442,
0.1129889190196991,
-0.014444384723901749,
-0.08608895540237427,
0.052790261805057526,
0.02764224447309971,
-0.08267755806446075,
0.17723257839679718,
-0.11451929062604904,
-0.34306615591049194,
-0.09597818553447723,
-0.18327631056308746,
-0.0632653459906578,
0.05701104551553726,
0.09422004967927933,
-0.10822878777980804,
-0.018752971664071083,
-0.00459743058308959,
0.11481987684965134,
-0.11764571070671082,
-0.01092474814504385,
0.0030823852866888046,
0.0015623720828443766,
-0.11020367592573166,
-0.08932386338710785,
-0.0414172001183033,
-0.016147851943969727,
-0.10817214101552963,
0.10642547905445099,
-0.14566680788993835,
0.03641396015882492,
0.2273082584142685,
0.06182233989238739,
0.05769471824169159,
-0.028427939862012863,
0.2154439091682434,
-0.09658383578062057,
-0.028462212532758713,
0.2129596620798111,
-0.02782726287841797,
0.05223267897963524,
0.12596875429153442,
-0.028133993968367577,
-0.05689653009176254,
0.018969114869832993,
-0.010980543680489063,
-0.06420417129993439,
-0.1957860141992569,
-0.14652499556541443,
-0.09200355410575867,
0.10240968316793442,
0.07168005406856537,
0.06493470072746277,
0.0982135459780693,
0.0677933469414711,
-0.04303708299994469,
-0.011768086813390255,
0.05243032053112984,
0.09135662019252777,
0.21906068921089172,
-0.09437450021505356,
0.1459643840789795,
0.002515166997909546,
-0.14404381811618805,
0.07712176442146301,
0.06640412658452988,
0.05988049879670143,
0.0782441571354866,
0.05473780632019043,
0.012849845923483372,
0.03798648342490196,
0.13499543070793152,
0.09236475825309753,
-0.00006350681360345334,
-0.030936384573578835,
-0.03131432458758354,
-0.04215286672115326,
-0.0485277958214283,
0.04422425478696823,
0.016436150297522545,
-0.16151629388332367,
-0.040141746401786804,
-0.03202308714389801,
0.06537789851427078,
0.034132156521081924,
0.07160568237304688,
-0.17389163374900818,
-0.008952504955232143,
0.0681418851017952,
-0.038871221244335175,
-0.13627927005290985,
0.0976172387599945,
-0.007473828736692667,
-0.13583654165267944,
0.03855730593204498,
-0.01598576083779335,
0.12999674677848816,
-0.14567551016807556,
0.08463962376117706,
-0.14408986270427704,
-0.06333359330892563,
0.004345767665654421,
0.13252176344394684,
-0.2701348662376404,
0.18757066130638123,
-0.0259280763566494,
-0.06633701920509338,
-0.09552262723445892,
-0.017303185537457466,
-0.011172408238053322,
0.12647779285907745,
0.14174313843250275,
-0.015334956347942352,
0.05461454391479492,
-0.023142173886299133,
-0.09067906439304352,
0.027273274958133698,
0.0961102694272995,
-0.06298995018005371,
-0.01071979384869337,
-0.02472156472504139,
0.015339194796979427,
-0.04069121181964874,
-0.045300569385290146,
0.006166903302073479,
-0.19576182961463928,
0.07888612896203995,
0.02570422552525997,
0.16136501729488373,
0.02773236483335495,
-0.051257167011499405,
-0.13009771704673767,
0.24302102625370026,
-0.05234315246343613,
-0.09879991412162781,
-0.06961113959550858,
-0.061220064759254456,
0.056872230023145676,
-0.05514857918024063,
0.02385316602885723,
-0.06978898495435715,
0.04673522710800171,
-0.09689215570688248,
-0.16765481233596802,
0.10708780586719513,
-0.08701242506504059,
-0.023731788620352745,
-0.017185598611831665,
0.21056239306926727,
-0.017566530033946037,
0.026773013174533844,
0.045224860310554504,
-0.007009509950876236,
-0.15445853769779205,
-0.10031703114509583,
-0.03052590601146221,
0.042931340634822845,
-0.004213985055685043,
-0.009488278068602085,
-0.034200962632894516,
-0.012482467107474804,
-0.04363726079463959,
-0.007416378706693649,
0.3374677300453186,
0.12776389718055725,
-0.01714642159640789,
0.17501339316368103,
0.10850531607866287,
-0.07182787358760834,
-0.270145446062088,
-0.11297690868377686,
-0.07059212774038315,
-0.022598035633563995,
-0.09243924170732498,
-0.18873578310012817,
0.046854548156261444,
0.01673508621752262,
-0.0009790643816813827,
0.08417532593011856,
-0.28937429189682007,
-0.10994640737771988,
0.16469727456569672,
-0.043347373604774475,
0.37803080677986145,
-0.085748091340065,
-0.07005402445793152,
-0.07872563600540161,
-0.11863084882497787,
0.10962467640638351,
-0.01795024983584881,
0.11963141709566116,
0.01194681879132986,
0.15320554375648499,
0.05504359304904938,
0.0016752600204199553,
0.09616317600011826,
0.013392186723649502,
-0.07240988314151764,
-0.09868193417787552,
-0.036838430911302567,
-0.0000555229744350072,
0.033912062644958496,
0.030293675139546394,
-0.06350743770599365,
0.010422603227198124,
-0.15361998975276947,
-0.07770480960607529,
-0.08079030364751816,
0.021308360621333122,
0.0325959175825119,
-0.06575367599725723,
0.01277843676507473,
-0.03586854413151741,
0.0038361772894859314,
0.010189864784479141,
0.11925610154867172,
-0.12590886652469635,
0.16082888841629028,
0.07796818763017654,
0.11964641511440277,
-0.16079014539718628,
-0.09413345158100128,
-0.05771692842245102,
-0.04406038299202919,
0.06816589087247849,
-0.09047164767980576,
0.018626367673277855,
0.11708964407444,
-0.049103476107120514,
0.07125426083803177,
0.08000694960355759,
-0.03629165142774582,
0.03242313489317894,
0.0964825451374054,
-0.2659086287021637,
-0.03945146128535271,
-0.07030265033245087,
0.04127972200512886,
0.09565998613834381,
0.0934172198176384,
0.21081681549549103,
-0.017709437757730484,
-0.028094608336687088,
0.006644100416451693,
0.012357911095023155,
-0.02002890780568123,
0.07658807188272476,
-0.005862075369805098,
0.01815185695886612,
-0.1500067412853241,
0.055046096444129944,
0.03159039840102196,
-0.09102480113506317,
0.028038030490279198,
0.12953828275203705,
-0.09342025220394135,
-0.11645249277353287,
-0.05724100023508072,
0.13420434296131134,
-0.0808262750506401,
-0.014078031294047832,
-0.04964044690132141,
-0.12842270731925964,
0.05175043269991875,
0.08789782226085663,
0.044769302010536194,
0.05655958503484726,
-0.09887681156396866,
-0.05021028593182564,
-0.027794139459729195,
-0.018336471170186996,
0.06741218268871307,
-0.026502318680286407,
-0.06965794414281845,
0.08734434098005295,
-0.02197287417948246,
0.10307284444570541,
-0.0902688279747963,
-0.11802049726247787,
-0.1404213160276413,
0.053427554666996,
-0.05773510783910751,
-0.09120766818523407,
-0.11197657883167267,
-0.04515399411320686,
-0.007064747624099255,
-0.007365409750491381,
-0.03382188826799393,
-0.0480884313583374,
-0.11302013695240021,
0.027023863047361374,
-0.058709997683763504,
0.008021862246096134,
-0.0653262510895729,
0.04620363190770149,
0.053436726331710815,
-0.024691905826330185,
0.1534554362297058,
0.09938084334135056,
-0.1099901869893074,
0.0790710523724556,
-0.1476789265871048,
-0.06811176985502243,
0.10477735102176666,
0.021971046924591064,
0.042481619864702225,
0.05674881115555763,
0.0028871376998722553,
0.05247633531689644,
0.02656870149075985,
0.05235745385289192,
0.05147046595811844,
-0.09193599224090576,
0.025591321289539337,
0.03143569827079773,
-0.12016918510198593,
-0.026749873533844948,
-0.027056841179728508,
0.03155351057648659,
0.03978392109274864,
0.0699949860572815,
-0.05855286121368408,
0.11416959762573242,
-0.046032071113586426,
0.028902515769004822,
0.02444262057542801,
-0.1588214486837387,
0.04831129312515259,
-0.10079718381166458,
0.05167007818818092,
0.026098651811480522,
0.2125881463289261,
0.03747474402189255,
0.04187583178281784,
0.023669356480240822,
0.04882977902889252,
0.05638265237212181,
0.013804948888719082,
0.20969635248184204,
0.12828609347343445,
-0.019505493342876434,
-0.08289473503828049,
0.10052309930324554,
0.03700946643948555,
0.053163252770900726,
0.09138767421245575,
-0.03774378448724747,
-0.030227595940232277,
0.0735379084944725,
0.0018057164270430803,
0.028857162222266197,
-0.10959534347057343,
-0.14462608098983765,
0.0031894289422780275,
0.0501292459666729,
-0.06252359598875046,
0.09640925377607346,
0.14252717792987823,
-0.012359819374978542,
0.015925036743283272,
-0.008760950528085232,
-0.07717199623584747,
-0.17519237101078033,
-0.2237674593925476,
-0.08096376806497574,
-0.15752992033958435,
0.011378414928913116,
-0.1300436407327652,
0.028347710147500038,
0.02768571861088276,
0.09589783102273941,
-0.07580002397298813,
0.08168912678956985,
0.05094415321946144,
-0.14814186096191406,
0.08356133848428726,
-0.025978663936257362,
0.11067341268062592,
-0.0762229636311531,
-0.013888069428503513,
-0.07835094630718231,
0.04728304594755173,
0.011031595058739185,
0.05859309807419777,
-0.024886479601264,
0.005192297510802746,
-0.11082372814416885,
-0.08537255227565765,
-0.05255952849984169,
0.05614812299609184,
-0.0011008076835423708,
0.16385237872600555,
-0.0007164362468756735,
-0.043932586908340454,
0.005456744693219662,
0.2288503795862198,
-0.05030667781829834,
-0.07703901827335358,
-0.05407256633043289,
0.20164062082767487,
0.017152128741145134,
0.11403018981218338,
-0.03116634115576744,
-0.003331640735268593,
-0.10160746425390244,
0.3422560393810272,
0.27626749873161316,
-0.11239590495824814,
0.011485264636576176,
-0.005517974030226469,
0.04401830583810806,
0.11053254455327988,
0.08832605928182602,
0.10112814605236053,
0.26328057050704956,
-0.06973886489868164,
-0.004847546573728323,
-0.011276003904640675,
-0.03444656729698181,
-0.09446536749601364,
0.05891617387533188,
0.0624508336186409,
-0.047208331525325775,
-0.0406012125313282,
0.12276442348957062,
-0.2386368066072464,
0.08585779368877411,
-0.130695641040802,
-0.1796683520078659,
-0.10788998007774353,
-0.007359078619629145,
0.15563462674617767,
0.02953183650970459,
0.09284640848636627,
0.002662714570760727,
-0.06708259880542755,
0.04082586243748665,
0.035365570336580276,
-0.17093530297279358,
-0.05727734789252281,
0.06767185777425766,
-0.033330146223306656,
-0.05492117255926132,
-0.02317197620868683,
0.05504558980464935,
0.05634692683815956,
0.04076957702636719,
-0.016943255439400673,
0.04953419417142868,
-0.008836316876113415,
-0.08413103967905045,
0.030350711196660995,
0.03887832909822464,
0.0090929064899683,
-0.05926625058054924,
0.08317606151103973,
-0.11831685900688171,
0.02901342697441578,
-0.013459569774568081,
-0.015143530443310738,
-0.019046759232878685,
0.04105420038104057,
-0.07692514359951019,
0.07424003630876541,
0.080938920378685,
-0.01700298860669136,
-0.03445584326982498,
-0.033149249851703644,
0.0013260202249512076,
-0.03870074823498726,
-0.08583427965641022,
-0.08492524921894073,
-0.1866115778684616,
-0.14090462028980255,
0.06289097666740417,
0.020488150417804718,
-0.17580290138721466,
0.01997317560017109,
-0.11532745510339737,
0.06300507485866547,
-0.12899820506572723,
0.10224342346191406,
0.04036674275994301,
0.014883968979120255,
-0.005052756518125534,
0.0411711186170578,
0.05212133377790451,
0.07168097048997879,
-0.15087392926216125,
-0.08272240310907364
] |
null | null |
transformers
|
# cse_resnet50
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
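Putting the snippets above together, here is a minimal end-to-end sketch. The import path is an assumption (recent versions of glasses expose the models under `glasses.models`); everything else only uses calls shown in this card.
``` python
import torch

# Assumed import path; adjust it to the installed version of glasses.
from glasses.models import ResNet

# build a plain resnet18 and tell the encoder to record intermediate features
model = ResNet.resnet18()
model.encoder.features  # accessing .features registers the forward hooks

# run a dummy batch through the network
x = torch.randn((1, 3, 224, 224))
logits = model(x)

# per-stage feature maps collected by the hooks
features = model.encoder.features
print(logits.shape)
print([f.shape for f in features])
```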
|
{}
| null |
glasses/cse_resnet50
|
[
"transformers",
"pytorch",
"arxiv:1512.03385",
"arxiv:1812.01187",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1512.03385 #arxiv-1812.01187 #endpoints_compatible #region-us
|
# cse_resnet50
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# cse_resnet50\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1512.03385 #arxiv-1812.01187 #endpoints_compatible #region-us \n",
"# cse_resnet50\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
39,
28
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1512.03385 #arxiv-1812.01187 #endpoints_compatible #region-us \n# cse_resnet50\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.11671743541955948,
-0.07224924117326736,
-0.002923419466242194,
0.04775555804371834,
0.1061449721455574,
0.03161287307739258,
0.006508366204798222,
0.07830515503883362,
-0.034622110426425934,
0.0600566603243351,
0.10713458806276321,
0.08553630858659744,
-0.005569808650761843,
-0.022290239110589027,
-0.054616235196590424,
-0.2171955108642578,
0.07250174880027771,
0.08523590862751007,
-0.02888445369899273,
0.05571615323424339,
0.06939952075481415,
-0.09663551300764084,
0.11436215043067932,
-0.031165042892098427,
-0.24744680523872375,
0.007854493334889412,
-0.01830313913524151,
-0.06547056138515472,
0.11804676055908203,
0.023281773552298546,
0.13777489960193634,
0.025338292121887207,
0.0967516303062439,
-0.08614437282085419,
0.03310109302401543,
0.03294374793767929,
-0.07622126489877701,
0.04446740821003914,
0.06627369672060013,
-0.03673022612929344,
0.018158474937081337,
0.06480909138917923,
-0.04351063817739487,
-0.06007540971040726,
-0.046091847121715546,
-0.0032399648334831,
0.02077482081949711,
0.1562979519367218,
0.08435889333486557,
0.10030460357666016,
0.024024659767746925,
0.08272861689329147,
-0.08724045753479004,
0.11517077684402466,
0.2291535586118698,
-0.2656135857105255,
-0.03798180818557739,
0.12098422646522522,
-0.01750573143362999,
-0.03616400063037872,
-0.05467449501156807,
0.04203284904360771,
0.04129340127110481,
0.028294844552874565,
-0.16152046620845795,
-0.008779204450547695,
-0.026909010484814644,
-0.011892491020262241,
-0.11125387251377106,
-0.017602141946554184,
0.07493098080158234,
0.06724189221858978,
0.07098416984081268,
0.006836908869445324,
-0.1904238760471344,
-0.03650074452161789,
-0.08783867210149765,
-0.02302725240588188,
0.0027183254715055227,
0.026370670646429062,
-0.13755062222480774,
-0.037633128464221954,
-0.06111176684498787,
-0.05797041580080986,
-0.13687562942504883,
-0.007849173620343208,
0.05356280133128166,
0.11462176591157913,
-0.12243182212114334,
0.018084103241562843,
0.005933395586907864,
-0.018282068893313408,
-0.016385413706302643,
-0.08644970506429672,
-0.03985542804002762,
0.020498743280768394,
-0.07691279798746109,
-0.07673075795173645,
0.04497700184583664,
0.2041085809469223,
-0.0379660502076149,
0.028682060539722443,
0.014205838553607464,
0.12810148298740387,
-0.08394258469343185,
0.06316433101892471,
-0.22126504778862,
0.12101737409830093,
0.010330193676054478,
-0.09284014254808426,
0.010784076526761055,
-0.04335587099194527,
-0.09835037589073181,
-0.03305057808756828,
0.13657796382904053,
-0.03976091742515564,
-0.0741451308131218,
0.057339709252119064,
-0.003497085766866803,
-0.03602275624871254,
0.10455133020877838,
-0.020209014415740967,
-0.04381773993372917,
0.0055071027018129826,
-0.0866495743393898,
0.11414171755313873,
0.07365677505731583,
-0.025224464014172554,
-0.08004996925592422,
0.030139341950416565,
-0.022458326071500778,
-0.019366323947906494,
0.005865828134119511,
-0.07248146086931229,
0.0011731742415577173,
-0.03209388628602028,
0.06990223377943039,
-0.24529868364334106,
-0.11170309036970139,
-0.006822667084634304,
0.04644008353352547,
-0.04029015451669693,
0.05530998855829239,
-0.057811785489320755,
-0.13847418129444122,
0.0006604110822081566,
0.027842139825224876,
-0.03653907775878906,
-0.03973860293626785,
0.014091700315475464,
-0.06097264215350151,
0.15916351974010468,
-0.09302321076393127,
0.007686692290008068,
-0.05278400331735611,
-0.04050155729055405,
-0.10899543762207031,
0.10303018242120743,
-0.047532618045806885,
0.07173486799001694,
-0.09628938138484955,
-0.057907070964574814,
-0.11995891481637955,
0.008870759047567844,
0.018435418605804443,
0.0862850770354271,
-0.11179176717996597,
-0.043939121067523956,
0.1788327395915985,
-0.11836601048707962,
-0.1127571165561676,
0.09188976138830185,
-0.019526230171322823,
-0.11507312208414078,
0.0070680975914001465,
0.17960599064826965,
-0.07327162474393845,
-0.02080865018069744,
-0.042596254497766495,
0.00956873781979084,
-0.14710654318332672,
-0.0885012224316597,
-0.054498929530382156,
0.14681033790111542,
0.05450328812003136,
-0.04389643669128418,
-0.03966941684484482,
0.06252540647983551,
-0.1264733523130417,
-0.08491695672273636,
-0.025232501327991486,
-0.09857103228569031,
-0.01411901693791151,
0.02353017032146454,
0.12894923985004425,
0.041209205985069275,
-0.04855605959892273,
-0.19617825746536255,
0.052390795201063156,
-0.07456106692552567,
0.11148732900619507,
-0.10877552628517151,
0.1816457360982895,
-0.028637295588850975,
0.01894483156502247,
-0.2478276640176773,
-0.04508059099316597,
0.02127993479371071,
0.05218604579567909,
-0.026098480448126793,
0.01136371586471796,
0.04878561943769455,
-0.028158439323306084,
-0.01932677812874317,
-0.0852743461728096,
0.04030952975153923,
-0.022779494524002075,
-0.022092388942837715,
-0.07482267916202545,
0.009920968674123287,
-0.06899696588516235,
-0.009668929502367973,
-0.12648190557956696,
-0.046818941831588745,
0.044492460787296295,
0.13847431540489197,
-0.008820795454084873,
0.07727651298046112,
-0.053232718259096146,
-0.0319257527589798,
-0.02628244459629059,
-0.06844153255224228,
0.062294356524944305,
-0.013676563277840614,
-0.013225726783275604,
0.05002041161060333,
0.01794941909611225,
0.1585245281457901,
0.12957927584648132,
-0.30922091007232666,
-0.1186433732509613,
0.1157943531870842,
-0.004195664077997208,
0.016632962971925735,
0.013599625788629055,
0.09028278291225433,
0.13640393316745758,
-0.0330137275159359,
0.06589744240045547,
-0.0498710572719574,
0.09384869039058685,
0.02194295823574066,
-0.032644737511873245,
-0.00424147117882967,
0.0503157377243042,
0.06291114538908005,
-0.2631867229938507,
0.05822287127375603,
0.024707119911909103,
-0.1399221122264862,
-0.0573592334985733,
0.005513948388397694,
-0.06960081309080124,
0.08435452729463577,
0.08132755011320114,
-0.007657270412892103,
0.08431340754032135,
0.024392513558268547,
-0.0698455199599266,
0.02235451154410839,
-0.10720311105251312,
0.04205543175339699,
-0.1169767677783966,
0.03648541122674942,
-0.03738843649625778,
0.010502953082323074,
0.004847381263971329,
0.037082232534885406,
-0.0019375098636373878,
0.08445966988801956,
-0.0012906873598694801,
-0.2044844627380371,
0.024270951747894287,
-0.06036413088440895,
-0.0188509002327919,
0.21771454811096191,
-0.009399136528372765,
-0.22208110988140106,
-0.008439804427325726,
-0.018647871911525726,
-0.02529187686741352,
0.04778967797756195,
0.03287692740559578,
-0.14311933517456055,
-0.03995339944958687,
0.002509992802515626,
-0.010817908681929111,
0.07358676940202713,
0.04322363808751106,
0.023657212033867836,
-0.01156447734683752,
-0.0283186137676239,
-0.029910722747445107,
0.0023906517308205366,
-0.19722507894039154,
-0.09986717253923416,
0.12528403103351593,
-0.12321344763040543,
0.10191801935434341,
0.12154669314622879,
-0.028471244499087334,
0.021362056955695152,
0.003672252641990781,
0.17113547027111053,
-0.08313175290822983,
-0.006799616850912571,
0.0700654610991478,
-0.008755047805607319,
0.05233920365571976,
0.07125909626483917,
-0.0009561172919347882,
-0.10879486799240112,
-0.004319529514759779,
0.011976917274296284,
-0.05065061151981354,
-0.08901501446962357,
-0.10317841172218323,
-0.03134981170296669,
-0.03401833772659302,
0.12282589077949524,
0.08362627029418945,
0.07821290194988251,
0.10649636387825012,
0.030675126239657402,
-0.035146694630384445,
0.042906928807497025,
0.07917089760303497,
0.09346083551645279,
0.03528374806046486,
0.10651273280382156,
-0.05657673627138138,
-0.08979460597038269,
0.025582486763596535,
0.07760591059923172,
0.23786716163158417,
-0.02521541900932789,
-0.078270323574543,
0.02350645139813423,
0.18635183572769165,
0.13146010041236877,
0.10584758967161179,
-0.05040685087442398,
-0.04223918169736862,
0.001965864794328809,
0.034750357270240784,
-0.0009433948434889317,
0.05441097915172577,
0.07460836321115494,
-0.06862262636423111,
-0.002511583501473069,
0.05351011082530022,
0.017813611775636673,
0.2331715226173401,
0.11663294583559036,
-0.36262625455856323,
-0.01670795865356922,
-0.055107466876506805,
0.030751889571547508,
-0.027576522901654243,
0.12427186220884323,
0.035244379192590714,
-0.044639043509960175,
0.03127152472734451,
-0.08519517630338669,
0.09030163288116455,
-0.09785974770784378,
0.02551247924566269,
0.011232515797019005,
-0.09472115337848663,
0.06461017578840256,
-0.0010088516864925623,
-0.11900458484888077,
0.19260647892951965,
-0.045904889702796936,
-0.05537596717476845,
-0.03586945682764053,
-0.060601428151130676,
0.0706196054816246,
0.11964552104473114,
0.227604940533638,
-0.0024324082769453526,
-0.06002349406480789,
0.006386005785316229,
0.02242608740925789,
0.024171311408281326,
0.1508193016052246,
0.02887647971510887,
-0.035526398569345474,
0.04346024617552757,
-0.05382206663489342,
0.03511113300919533,
0.10397019237279892,
0.013506315648555756,
-0.09060618281364441,
-0.045608196407556534,
0.0386112816631794,
-0.04135853052139282,
-0.00444114301353693,
-0.04477629438042641,
-0.04417053982615471,
0.09399674832820892,
-0.0018146629445254803,
0.007274610921740532,
-0.08865799009799957,
0.05378597602248192,
0.07576747983694077,
-0.08220742642879486,
0.10658002644777298,
-0.03138027340173721,
0.05213407799601555,
-0.009726746007800102,
-0.20399142801761627,
0.19601820409297943,
-0.11184133589267731,
-0.01542846579104662,
0.02786162868142128,
-0.02101806178689003,
-0.006413742899894714,
-0.04696982353925705,
0.044564224779605865,
0.037160322070121765,
-0.1342335194349289,
-0.0036468899343162775,
0.07449190318584442,
-0.05371587350964546,
-0.034111399203538895,
0.1269778460264206,
-0.03644615039229393,
0.00024495157413184643,
-0.07810617238283157,
0.17775766551494598,
0.1438712328672409,
0.016508623957633972,
-0.07561929523944855,
-0.02069135010242462,
0.12921951711177826,
0.053645867854356766,
-0.354499489068985,
-0.06372980773448944,
-0.03266192227602005,
-0.011246387846767902,
-0.11323754489421844,
-0.1058097705245018,
0.16399754583835602,
-0.039167631417512894,
-0.017676640301942825,
0.009122644551098347,
-0.14347641170024872,
-0.05785539001226425,
0.2359093725681305,
0.06325361132621765,
0.30119094252586365,
-0.07140636444091797,
0.007207872811704874,
-0.05145907774567604,
0.03842170163989067,
0.02632036618888378,
0.04768792539834976,
0.10201138257980347,
-0.005270012654364109,
0.0931551456451416,
-0.021584073081612587,
-0.010068226605653763,
-0.032740406692028046,
0.11682416498661041,
0.08362198621034622,
-0.07827430218458176,
0.020171957090497017,
0.05282754451036453,
-0.013833136297762394,
0.002681259997189045,
0.20378176867961884,
0.1275443136692047,
0.029142815619707108,
0.013168970122933388,
-0.08485131710767746,
0.028695804998278618,
0.11697258800268173,
0.007086579222232103,
-0.1186995878815651,
0.020301004871726036,
0.0870499387383461,
0.07545028626918793,
0.2682226002216339,
0.08911025524139404,
0.08083191514015198,
0.031516872346401215,
-0.01054648868739605,
-0.08589226007461548,
-0.10366667807102203,
-0.067633256316185,
-0.03788101673126221,
0.14039048552513123,
-0.06987996399402618,
0.054135244339704514,
0.11038970947265625,
0.01086876168847084,
0.013825560919940472,
0.13954591751098633,
-0.002653364557772875,
0.014361724257469177,
0.0865606889128685,
-0.08425068855285645,
-0.027649780735373497,
0.011949603445827961,
0.004119546618312597,
0.13810692727565765,
0.17111703753471375,
0.14017978310585022,
-0.07220929861068726,
0.00957756768912077,
-0.009318350814282894,
0.04694434255361557,
-0.04099071025848389,
0.07498549669981003,
0.13975226879119873,
0.001494784140959382,
-0.13480740785598755,
0.17680296301841736,
-0.01797109842300415,
-0.1590055674314499,
-0.01882239617407322,
-0.05251779034733772,
-0.13579344749450684,
-0.09756974130868912,
-0.05157800018787384,
0.04239857941865921,
-0.2757738530635834,
-0.05544183775782585,
-0.03214123845100403,
-0.06440611928701401,
0.09703031927347183,
0.13031445443630219,
0.1278102844953537,
-0.007908554747700691,
-0.049747537821531296,
-0.008406093344092369,
-0.07933265715837479,
-0.019500648602843285,
0.030111869797110558,
0.09678846597671509,
-0.1566939651966095,
-0.1738075166940689,
0.006142682861536741,
0.1698809415102005,
-0.10098189115524292,
-0.007414892315864563,
-0.11763413995504379,
0.08206504583358765,
-0.06546837836503983,
0.05738480016589165,
-0.014813314191997051,
-0.011943328194320202,
-0.03623935580253601,
-0.035122085362672806,
-0.08856068551540375,
0.034219495952129364,
-0.09203845262527466,
-0.0029526713769882917,
0.016830390319228172,
-0.057783160358667374,
-0.10373998433351517,
-0.015201984904706478,
-0.0226436760276556,
-0.009135318920016289,
0.13365967571735382,
0.12030459195375443,
-0.09088876843452454,
0.10786110907793045,
-0.14492745697498322,
-0.25520849227905273,
0.15851791203022003,
0.031241459771990776,
-0.008856200613081455,
0.09181299060583115,
0.04287320747971535,
0.028564322739839554,
-0.023638218641281128,
-0.024330351501703262,
0.02306695096194744,
0.00195683422498405,
0.023538440465927124,
-0.24989952147006989,
-0.009236740879714489,
-0.04016896337270737,
0.01980225369334221,
0.07189016044139862,
0.14241549372673035,
0.05438948795199394,
-0.07742530107498169,
0.0013173818588256836,
0.004239555448293686,
0.008386380039155483,
0.01867441087961197,
-0.08579982072114944,
0.0326632522046566,
-0.0623113214969635,
0.05295486003160477,
-0.09629970788955688,
0.1755431741476059,
-0.10495399683713913,
-0.031233131885528564,
0.02192283794283867,
0.20452219247817993,
-0.14065247774124146,
0.08856566995382309,
0.13264349102973938,
0.10607113689184189,
-0.023962056264281273,
0.03829170763492584,
0.09062269330024719,
0.09928122907876968,
0.17844997346401215,
0.03964074328541756,
0.19191139936447144,
-0.00042410718742758036,
0.15107393264770508,
0.06644359230995178,
-0.012293105944991112,
0.06680402159690857,
0.026290075853466988,
-0.10322221368551254,
0.04279026389122009,
0.02821396477520466,
0.008411443792283535,
0.14889942109584808,
0.07190428674221039,
-0.0017907274886965752,
0.016319526359438896,
-0.02516101859509945,
-0.08891741186380386,
-0.11217759549617767,
-0.11525620520114899,
-0.14147210121154785,
0.04913537576794624,
-0.035215701907873154,
-0.07797708362340927,
0.22844046354293823,
0.0906866192817688,
-0.006468570791184902,
0.10693205147981644,
-0.0654025450348854,
-0.049122460186481476,
0.14905814826488495,
-0.00614152429625392,
-0.09115828573703766,
0.13961189985275269,
-0.06398580968379974,
0.05281522497534752,
0.046684734523296356,
-0.012487527914345264,
-0.020465219393372536,
0.03828316181898117,
0.1184300035238266,
-0.03651869297027588,
-0.05574316531419754,
-0.022718898952007294,
0.04923868179321289,
-0.08237836509943008,
-0.08030852675437927,
-0.01371938269585371,
-0.015811951830983162,
0.025778474286198616,
0.2408771514892578,
-0.10518161207437515,
-0.02583884820342064,
-0.09536394476890564,
-0.06004277616739273,
0.02517552115023136,
0.12189114093780518,
-0.07512117922306061,
-0.04470853880047798,
-0.06387996673583984,
0.17906291782855988,
0.20673015713691711,
-0.21035294234752655,
0.0035184978041797876,
0.10376141965389252,
0.02511218562722206,
-0.031808044761419296,
-0.004650143440812826,
0.13882911205291748,
0.13327014446258545,
-0.04662788286805153,
-0.129582479596138,
-0.02490535005927086,
-0.03355642035603523,
-0.07772064954042435,
0.009281202219426632,
0.07626063376665115,
-0.04284089431166649,
-0.1555171012878418,
0.10882078856229782,
-0.17905882000923157,
-0.032602593302726746,
0.11222188174724579,
-0.14307087659835815,
-0.11268967390060425,
-0.02294328063726425,
0.03523719310760498,
0.08151373267173767,
0.08115548640489578,
-0.07233346998691559,
-0.021127883344888687,
0.018312258645892143,
0.05120223015546799,
-0.1732940673828125,
-0.02744406834244728,
-0.07952781766653061,
-0.035132598131895065,
0.08684533834457397,
-0.003905318910256028,
0.005475407000631094,
0.042610615491867065,
0.02997417189180851,
-0.0007925948593765497,
0.013250380754470825,
-0.03596324473619461,
-0.15253087878227234,
0.028622496873140335,
0.14100927114486694,
-0.04068303480744362,
-0.0006043020985089242,
0.06420563906431198,
-0.25943824648857117,
-0.017550209537148476,
0.029937200248241425,
0.025251343846321106,
-0.09614688903093338,
-0.023595787584781647,
-0.06745176017284393,
0.07009837031364441,
0.02364671416580677,
0.01735982485115528,
0.01679307222366333,
-0.013046149164438248,
-0.03344890847802162,
0.05138327553868294,
-0.04654255136847496,
-0.002374047879129648,
-0.0929240807890892,
-0.04895054176449776,
-0.2071797102689743,
0.03307916224002838,
-0.003203711938112974,
-0.03869887813925743,
-0.04159720987081528,
-0.007809944450855255,
-0.05939047038555145,
0.09693777561187744,
0.08320595324039459,
-0.01892685331404209,
0.005040781572461128,
0.019797472283244133,
0.06583653390407562,
0.053126685321331024,
-0.05776207149028778,
-0.1451440006494522
] |
null | null |
transformers
|
# deit_base_patch16_224
Implementation of DeiT proposed in [Training data-efficient image
transformers & distillation through
attention](https://arxiv.org/pdf/2010.11929.pdf)
An attention-based distillation is proposed where a new token is added
to the model, the `dist` token.

``` python
DeiT.deit_tiny_patch16_224()
DeiT.deit_small_patch16_224()
DeiT.deit_base_patch16_224()
DeiT.deit_base_patch16_384()
```
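To make the role of the extra token concrete, the following is a small, self-contained PyTorch sketch of the idea only (it is not the glasses implementation, and every name and dimension in it is illustrative): a learnable distillation token is placed next to the class token before the sequence enters the transformer encoder.
``` python
import torch
import torch.nn as nn

class ToyDistilledEmbedding(nn.Module):
    """Illustrative only: prepend a class token and a distillation token to patch embeddings."""

    def __init__(self, num_patches: int = 196, dim: int = 768):
        super().__init__()
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.dist_token = nn.Parameter(torch.zeros(1, 1, dim))  # the extra dist token
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 2, dim))

    def forward(self, patch_embeddings: torch.Tensor) -> torch.Tensor:
        b = patch_embeddings.size(0)
        cls = self.cls_token.expand(b, -1, -1)
        dist = self.dist_token.expand(b, -1, -1)
        # token sequence: [cls, dist, patch_1, ..., patch_N]
        x = torch.cat([cls, dist, patch_embeddings], dim=1)
        return x + self.pos_embed

# 14x14 = 196 patches of dimension 768, batch of 2
emb = ToyDistilledEmbedding()
tokens = emb(torch.randn(2, 196, 768))
print(tokens.shape)  # torch.Size([2, 198, 768])
```
During training, the output head attached to the `dist` token is supervised by a teacher network, while the class-token head keeps the usual label loss.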
|
{}
| null |
glasses/deit_base_patch16_224
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# deit_base_patch16_224
Implementation of DeiT proposed in Training data-efficient image
transformers & distillation through
attention
An attention based distillation is proposed where a new token is added
to the model, the [dist]{.title-ref} token.
!image
|
[
"# deit_base_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# deit_base_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
29,
70
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# deit_base_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
-0.12342353165149689,
-0.08591429889202118,
-0.0034440243616700172,
0.13312678039073944,
0.10027851164340973,
0.011092718690633774,
0.08120020478963852,
0.03620004653930664,
-0.14579744637012482,
-0.008183698169887066,
0.08923672139644623,
0.21285752952098846,
-0.027985408902168274,
0.13373324275016785,
-0.07885447144508362,
-0.3215305805206299,
0.008859465830028057,
0.16150683164596558,
-0.02202194184064865,
0.04431256651878357,
0.13697214424610138,
-0.1479857712984085,
0.10307334363460541,
-0.026726366952061653,
-0.1936018317937851,
0.016450194641947746,
-0.07253602147102356,
-0.05990986153483391,
0.15063250064849854,
0.025264790281653404,
0.14658194780349731,
0.1072106584906578,
0.07748456299304962,
0.08158237487077713,
0.04147285223007202,
-0.007013957481831312,
-0.012301282957196236,
0.0506487712264061,
-0.03986598551273346,
0.04881737381219864,
0.08735924959182739,
-0.05209902301430702,
0.05386033654212952,
-0.0611894391477108,
-0.15902426838874817,
0.006226659752428532,
-0.04855508729815483,
0.059125881642103195,
0.07672000676393509,
0.08086574077606201,
0.0366898849606514,
0.050204914063215256,
-0.03721309453248978,
0.11457241326570511,
0.07224663347005844,
-0.07750970125198364,
-0.07067618519067764,
0.006843685172498226,
0.007893607951700687,
-0.03951289504766464,
0.0313299223780632,
0.016861770302057266,
0.0793120339512825,
0.08896815776824951,
-0.018932554870843887,
-0.0042191483080387115,
0.015478740446269512,
0.022795969620347023,
-0.10489532351493835,
-0.047328606247901917,
0.26727086305618286,
0.01742561347782612,
-0.025116119533777237,
0.018267877399921417,
-0.07398860156536102,
-0.11992660164833069,
-0.10970290005207062,
-0.1090971902012825,
0.002202762058004737,
-0.007101341150701046,
-0.04065826162695885,
-0.04205150529742241,
-0.1342567503452301,
0.005422834772616625,
-0.19715715944766998,
0.12711825966835022,
0.05335254594683647,
0.006023502442985773,
-0.13657993078231812,
0.11120389401912689,
-0.031291309744119644,
-0.08699971437454224,
-0.06671103835105896,
-0.11431242525577545,
0.03372655063867569,
-0.11411873996257782,
-0.034587886184453964,
-0.19019654393196106,
-0.029868027195334435,
0.14071235060691833,
-0.04655721038579941,
-0.01610425114631653,
-0.026329128071665764,
0.034710951149463654,
0.051851898431777954,
0.1393771767616272,
-0.04064922779798508,
0.03260469809174538,
0.05627932772040367,
0.007407806348055601,
0.04333646968007088,
0.027260683476924896,
-0.11518746614456177,
-0.03594993054866791,
0.05804210528731346,
-0.02905092015862465,
-0.051356639713048935,
0.1410021185874939,
-0.05855896696448326,
-0.05528271943330765,
0.2075372338294983,
-0.040323805063962936,
-0.06307811290025711,
-0.06637408584356308,
-0.01421173382550478,
0.21542908251285553,
0.07152441889047623,
-0.022325098514556885,
0.020448239520192146,
0.060458600521087646,
-0.05211356654763222,
-0.01712440326809883,
-0.07204744964838028,
-0.10788293182849884,
-0.0010706378379836679,
-0.11364974826574326,
0.08348396420478821,
-0.20988225936889648,
-0.23541763424873352,
0.0747143104672432,
0.031576018780469894,
-0.01240762323141098,
0.03691261634230614,
-0.07389754056930542,
-0.0942593514919281,
-0.021182553842663765,
0.029266249388456345,
-0.08703874051570892,
-0.010516656562685966,
0.06357750296592712,
0.0591285265982151,
0.16074348986148834,
-0.2180948406457901,
0.0519883893430233,
-0.045643195509910583,
0.014102568849921227,
-0.251139372587204,
0.1319318264722824,
0.051540348678827286,
0.15367549657821655,
-0.0436411127448082,
-0.14556720852851868,
-0.15458013117313385,
0.016238128766417503,
0.05876331031322479,
0.191110298037529,
-0.2813400626182556,
-0.049187883734703064,
-0.09720511734485626,
-0.15812264382839203,
-0.06914129108190536,
0.0953606367111206,
-0.07833291590213776,
0.13981878757476807,
0.11964572966098785,
0.05150926485657692,
-0.018425585702061653,
-0.14489974081516266,
-0.009469214826822281,
-0.007761143147945404,
-0.2592669427394867,
0.016547411680221558,
-0.07815009355545044,
0.20119552314281464,
0.04435413330793381,
0.015972083434462547,
0.07021868973970413,
0.008330553770065308,
-0.1364692598581314,
-0.0676768347620964,
-0.006385451182723045,
-0.017650630325078964,
0.0486934632062912,
0.059100259095430374,
0.0694207102060318,
-0.03115750662982464,
0.031448084861040115,
0.1861775666475296,
0.07248274236917496,
-0.03214825317263603,
0.000089325949375052,
-0.15746985375881195,
0.06700778752565384,
-0.06797397136688232,
0.02016771212220192,
-0.1368660032749176,
-0.01122943963855505,
-0.05939878895878792,
0.17288339138031006,
0.03573174774646759,
0.07010171562433243,
0.08299539983272552,
0.025504503399133682,
-0.025537602603435516,
-0.05217752978205681,
-0.0511644147336483,
0.05706062167882919,
-0.03681720048189163,
-0.1051049530506134,
0.006422974169254303,
-0.06421597301959991,
-0.035106997936964035,
-0.03631225973367691,
0.011373351328074932,
0.14238765835762024,
0.08668682724237442,
0.046488724648952484,
0.06864041835069656,
-0.0279970932751894,
0.05445850268006325,
-0.1005994901061058,
-0.08157288283109665,
0.02633765898644924,
-0.028927113860845566,
0.02817450650036335,
-0.04732678085565567,
0.05278580263257027,
0.15909890830516815,
0.1633012890815735,
-0.29312047362327576,
-0.11254949122667313,
0.056498296558856964,
-0.029315078631043434,
0.02340998873114586,
0.08599524199962616,
-0.06938831508159637,
0.05464198440313339,
-0.06007944047451019,
0.08551010489463806,
-0.007666027173399925,
0.027421627193689346,
0.045137036591768265,
0.04169740155339241,
-0.08944724500179291,
0.036805231124162674,
0.06475690752267838,
-0.22078178822994232,
0.07560990005731583,
0.09155138581991196,
0.009870821610093117,
0.007856105454266071,
0.01326390728354454,
-0.04325799271464348,
0.006835542619228363,
0.002003607340157032,
0.02119947224855423,
0.13521219789981842,
-0.13822831213474274,
-0.039245590567588806,
0.01018157321959734,
-0.03838924691081047,
0.03652583435177803,
-0.11191560328006744,
-0.0014197794953361154,
0.03683468699455261,
-0.01551495585590601,
-0.0984988659620285,
0.053899504244327545,
-0.014941288158297539,
0.06751372665166855,
0.046545445919036865,
-0.06204887479543686,
0.09053675085306168,
0.06531534343957901,
0.011822296306490898,
0.09953541308641434,
-0.058914389461278915,
-0.22912055253982544,
-0.0706525519490242,
-0.07195918262004852,
0.03441063314676285,
0.11404632031917572,
0.07649990171194077,
-0.09103912115097046,
-0.049284130334854126,
-0.01176645327359438,
0.03624693676829338,
-0.05989343672990799,
0.052547913044691086,
0.024971481412649155,
0.020001642405986786,
-0.052415795624256134,
-0.07900132238864899,
-0.027576245367527008,
-0.09130924195051193,
0.012336627580225468,
0.17827634513378143,
-0.14009423553943634,
0.12911149859428406,
0.0445101223886013,
-0.06843177229166031,
0.039743803441524506,
-0.06821513175964355,
0.2217537760734558,
-0.09076466411352158,
0.09096602350473404,
0.21416595578193665,
0.016230011358857155,
0.013034280389547348,
0.08356918394565582,
-0.021645033732056618,
-0.13561125099658966,
0.01917935535311699,
-0.031298983842134476,
-0.1200120821595192,
-0.06332790851593018,
-0.058972589671611786,
-0.1076386347413063,
-0.03690463304519653,
0.07229725271463394,
0.04739811271429062,
0.1672179251909256,
0.06702359020709991,
0.05273452401161194,
0.13768377900123596,
0.00816874299198389,
0.10315709561109543,
0.29745620489120483,
-0.04475241154432297,
0.09366744756698608,
-0.0879337266087532,
-0.1223098635673523,
0.07992637902498245,
0.04685129597783089,
0.25126928091049194,
0.0013117932248860598,
-0.08217225223779678,
0.05997703969478607,
-0.0005106512107886374,
0.0556691475212574,
0.16427366435527802,
-0.02458055131137371,
-0.037525199353694916,
-0.0633782148361206,
-0.01613505184650421,
-0.006993110757321119,
0.048900604248046875,
-0.036438822746276855,
-0.05401095747947693,
-0.01977749913930893,
0.05957586690783501,
0.06504851579666138,
0.1586771458387375,
0.07511229813098907,
-0.4433477818965912,
-0.10949768126010895,
-0.023841332644224167,
0.05967161804437637,
-0.029846811667084694,
-0.02924543060362339,
-0.035974595695734024,
0.06158025190234184,
0.05109642073512077,
0.00019541078654583544,
0.05748947709798813,
-0.10236970335245132,
0.05118783563375473,
0.04094152897596359,
0.026117414236068726,
0.012144390493631363,
-0.05577825382351875,
-0.2914184331893921,
0.16704531013965607,
-0.007774484343826771,
0.09253084659576416,
0.005178641527891159,
-0.020099036395549774,
0.0394265241920948,
0.26569581031799316,
0.08539249747991562,
-0.026995407417416573,
0.06164413318037987,
-0.19853371381759644,
0.01539107970893383,
-0.01900823973119259,
0.05265732482075691,
0.087620809674263,
0.017418310046195984,
0.04957742244005203,
0.014396118931472301,
0.04771070182323456,
0.10307063162326813,
-0.10293269157409668,
-0.1107247844338417,
0.014077971689403057,
-0.057964522391557693,
0.06829553842544556,
0.018817530944943428,
-0.08122234046459198,
-0.15137693285942078,
0.03295593708753586,
0.03112560510635376,
-0.07011111825704575,
-0.13337190449237823,
0.08407406508922577,
-0.0014345800736919045,
-0.012049130164086819,
0.09456721693277359,
0.0005526996683329344,
0.008419003337621689,
-0.018748125061392784,
-0.14218126237392426,
0.08805976063013077,
-0.10045207291841507,
0.02070612646639347,
-0.019693972542881966,
0.0063121323473751545,
0.022795172408223152,
-0.03767223656177521,
0.041348230093717575,
-0.005239290650933981,
-0.009278600104153156,
-0.009730176068842411,
0.02081475220620632,
-0.017481224611401558,
0.11230786889791489,
-0.01753312721848488,
-0.09491461515426636,
-0.010190887376666069,
-0.044224224984645844,
0.0684228464961052,
0.01757139153778553,
0.0640067458152771,
-0.013716695830225945,
0.01248864084482193,
-0.047788046300411224,
0.01638863794505596,
-0.22309526801109314,
0.08722443878650665,
-0.016950244084000587,
0.04062001034617424,
-0.0625469982624054,
-0.15254522860050201,
0.015114394947886467,
0.07512001693248749,
0.05282406136393547,
0.10277396440505981,
-0.16115280985832214,
-0.05488256737589836,
0.07855655997991562,
0.05023927986621857,
0.26532331109046936,
-0.05446435883641243,
0.04917009547352791,
-0.07014373689889908,
-0.1730954945087433,
0.0013394643319770694,
0.06010182574391365,
0.07519067823886871,
-0.01640944927930832,
0.057263970375061035,
0.012833395041525364,
-0.016319915652275085,
0.032563790678977966,
0.07496679574251175,
0.12877847254276276,
-0.060784049332141876,
-0.15974344313144684,
0.14903505146503448,
-0.027273360639810562,
0.04935205727815628,
0.22269515693187714,
0.09440196305513382,
0.0135395098477602,
-0.042488906532526016,
-0.023744020611047745,
0.04364914447069168,
0.048535190522670746,
-0.07779867947101593,
-0.11225390434265137,
0.012520593591034412,
0.003319096751511097,
-0.004972248338162899,
0.07093495875597,
0.0007273433147929609,
-0.1514410674571991,
-0.07632237672805786,
0.08975747227668762,
-0.14457565546035767,
0.00351794995367527,
0.001225203275680542,
-0.053485408425331116,
0.08165021240711212,
-0.14859133958816528,
0.06292594224214554,
0.1307668685913086,
-0.01178604643791914,
0.09015393257141113,
0.1346309781074524,
0.05496928468346596,
0.0952630564570427,
0.11845984309911728,
-0.06575224548578262,
-0.03361502289772034,
-0.007355326786637306,
-0.12153743952512741,
0.1536807119846344,
0.20114950835704803,
0.10757021605968475,
-0.03815584257245064,
0.03793705627322197,
-0.015194051899015903,
-0.026812994852662086,
-0.05615483969449997,
0.06755673140287399,
-0.04374537989497185,
-0.039093974977731705,
-0.0749802514910698,
-0.005148585420101881,
-0.035733629018068314,
-0.1532185673713684,
-0.019824830815196037,
-0.06220695376396179,
-0.1337994486093521,
-0.06328825652599335,
-0.13598141074180603,
0.0478551872074604,
-0.12719856202602386,
-0.024995919317007065,
-0.08235861361026764,
-0.09933043271303177,
0.10606903582811356,
0.10759620368480682,
0.08017732203006744,
0.07627004384994507,
-0.09164737164974213,
0.11845791339874268,
-0.14897538721561432,
-0.02856496535241604,
-0.020135516300797462,
0.13786254823207855,
-0.1583883911371231,
0.07153748720884323,
-0.07345273345708847,
0.09233231097459793,
-0.10854184627532959,
-0.03161969035863876,
-0.004735720809549093,
0.046321045607328415,
-0.03388240188360214,
-0.026739148423075676,
-0.0659029632806778,
-0.02096276357769966,
0.035490985959768295,
-0.02988746389746666,
-0.041348718106746674,
0.018420811742544174,
-0.036358628422021866,
0.06392934173345566,
0.07235155999660492,
0.007481675129383802,
0.00470487866550684,
-0.05935288965702057,
-0.043224483728408813,
-0.06784362345933914,
0.11223605275154114,
0.08305241167545319,
-0.06463363021612167,
-0.004188328515738249,
0.002918022917583585,
-0.10013160854578018,
0.17984376847743988,
0.058892033994197845,
0.059509314596652985,
0.07503898441791534,
0.0036213563289493322,
0.06737212091684341,
-0.036139219999313354,
-0.01366454642266035,
0.016372688114643097,
-0.016139300540089607,
0.10710098594427109,
-0.0562061108648777,
-0.07991441339254379,
-0.05312976613640785,
0.019654283300042152,
0.11367686837911606,
0.08990586549043655,
0.1447848230600357,
-0.056248486042022705,
-0.04912887141108513,
-0.05924677103757858,
-0.030024925246834755,
0.026950204744935036,
-0.11173655837774277,
-0.04797804355621338,
-0.09314491599798203,
0.04495731368660927,
-0.0138503797352314,
0.11304974555969238,
-0.012605269439518452,
-0.023091057315468788,
0.014037376269698143,
0.01811360940337181,
-0.12881262600421906,
0.019443223252892494,
0.24454262852668762,
0.053888481110334396,
-0.03145113214850426,
-0.11275109648704529,
0.10330848395824432,
0.09371470659971237,
0.08638632297515869,
0.04617825895547867,
0.1630534827709198,
-0.08903458714485168,
0.0932757556438446,
0.05356655269861221,
0.1325431913137436,
-0.21243609488010406,
-0.01744948886334896,
-0.04137628152966499,
0.012253962457180023,
0.061424821615219116,
-0.03686363250017166,
0.15622903406620026,
-0.01855085976421833,
0.03515499830245972,
0.029114654287695885,
-0.037396982312202454,
0.022750457748770714,
-0.059345945715904236,
-0.08464325964450836,
-0.1281631588935852,
0.011683412827551365,
-0.08581570535898209,
-0.03231261670589447,
0.001135483500547707,
0.09928616136312485,
-0.050317466259002686,
0.12442130595445633,
0.021528132259845734,
0.020001886412501335,
0.06188354641199112,
-0.03585994243621826,
-0.10467901825904846,
-0.007700147107243538,
-0.07253842800855637,
-0.024642765522003174,
0.037051886320114136,
-0.01913515478372574,
-0.03606318309903145,
0.06040418893098831,
-0.04794944450259209,
-0.024527480825781822,
-0.01309669204056263,
-0.019102714955806732,
0.08482466638088226,
-0.08439711481332779,
-0.057473938912153244,
0.0003438266576267779,
-0.08291798084974289,
0.05488068237900734,
-0.0595569983124733,
0.007313325069844723,
-0.20756301283836365,
-0.12967362999916077,
0.253250390291214,
0.059846267104148865,
0.1011817678809166,
-0.012083693407475948,
-0.04878555238246918,
-0.013621934689581394,
0.1309088170528412,
0.18278121948242188,
-0.11115918308496475,
-0.04638355225324631,
-0.018635710701346397,
0.019006570801138878,
-0.06590352207422256,
-0.008272015489637852,
0.08969224244356155,
0.10012229532003403,
0.02568092942237854,
-0.10590853542089462,
-0.0972958654165268,
0.06971258670091629,
0.0068965177051723,
-0.028009023517370224,
0.06301788985729218,
-0.031918350607156754,
-0.10082290321588516,
0.07356667518615723,
-0.032907288521528244,
-0.05851791426539421,
0.21185804903507233,
-0.06365012377500534,
-0.04721059277653694,
0.033546920865774155,
-0.03307529166340828,
-0.0128640066832304,
0.0747053474187851,
-0.09955506026744843,
-0.038429874926805496,
0.04625985398888588,
0.0015153138665482402,
-0.0946233719587326,
0.07827582210302353,
0.11081670224666595,
0.03960550203919411,
-0.003951604478061199,
-0.06605078279972076,
0.05864938348531723,
0.05966371297836304,
0.08953003585338593,
-0.046895042061805725,
0.09167485684156418,
-0.08056221157312393,
-0.020433146506547928,
0.013344647362828255,
0.028342319652438164,
-0.020089657977223396,
-0.06943738460540771,
-0.0014440665254369378,
-0.2258237600326538,
-0.006016506813466549,
0.07378267496824265,
-0.08793183416128159,
-0.07779357582330704,
0.02593042142689228,
-0.03558870032429695,
0.045281488448381424,
0.012967719696462154,
0.050295256078243256,
0.06179950386285782,
-0.06904362887144089,
0.03790484741330147,
0.10444819182157516,
-0.04191597178578377,
-0.08308325707912445,
-0.07956112921237946,
-0.03699001669883728,
-0.11986613273620605,
-0.04012438282370567,
0.005253848619759083,
-0.10867656022310257,
-0.056690726429224014,
-0.01689564809203148,
-0.042155686765909195,
0.07761947810649872,
0.13991740345954895,
0.03793599456548691,
-0.015706341713666916,
0.11850710958242416,
-0.016732191666960716,
0.11764074862003326,
-0.05687762796878815,
-0.048059917986392975
] |
null | null |
transformers
|
# deit_base_patch16_384
Implementation of DeiT proposed in [Training data-efficient image
transformers & distillation through
attention](https://arxiv.org/pdf/2010.11929.pdf)
An attention-based distillation is proposed where a new token is added
to the model, the `dist` token.

``` python
DeiT.deit_tiny_patch16_224()
DeiT.deit_small_patch16_224()
DeiT.deit_base_patch16_224()
DeiT.deit_base_patch16_384()
```
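A minimal usage sketch, assuming `DeiT` can be imported from the glasses package (the exact import path may differ between versions) and that the `*_384` variant expects 384x384 inputs:
``` python
import torch

# Assumed import path; adjust to the installed version of glasses.
from glasses.models import DeiT

model = DeiT.deit_base_patch16_384().eval()

# the 384 variant is configured for 384x384 inputs (the 224 variants expect 224x224)
x = torch.randn(1, 3, 384, 384)
with torch.no_grad():
    logits = model(x)
print(logits.shape)
```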
|
{}
| null |
glasses/deit_base_patch16_384
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# deit_base_patch16_384
Implementation of DeiT proposed in Training data-efficient image
transformers & distillation through
attention
An attention based distillation is proposed where a new token is added
to the model, the [dist]{.title-ref} token.
!image
|
[
"# deit_base_patch16_384\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# deit_base_patch16_384\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
29,
69
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# deit_base_patch16_384\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
-0.1216820478439331,
-0.08668020367622375,
-0.0034311162307858467,
0.13173946738243103,
0.09654231369495392,
0.01010139100253582,
0.08760848641395569,
0.03689934313297272,
-0.13984088599681854,
-0.005575902294367552,
0.08697396516799927,
0.21406501531600952,
-0.025734730064868927,
0.13777782022953033,
-0.08141122758388519,
-0.3243806064128876,
0.005486351437866688,
0.16266964375972748,
-0.01636776328086853,
0.04547693207859993,
0.13865824043750763,
-0.14949257671833038,
0.10135046392679214,
-0.02717607095837593,
-0.19679321348667145,
0.01526726596057415,
-0.06902948766946793,
-0.06106666848063469,
0.14701001346111298,
0.022661203518509865,
0.14972813427448273,
0.11275959759950638,
0.07709944248199463,
0.0874713808298111,
0.04139070585370064,
-0.006349703762680292,
-0.007922820746898651,
0.05226872116327286,
-0.03809911012649536,
0.05453263968229294,
0.08928805589675903,
-0.05953408405184746,
0.05206172913312912,
-0.05939468741416931,
-0.15810056030750275,
0.012416052632033825,
-0.05374579876661301,
0.062241487205028534,
0.08290460705757141,
0.07505029439926147,
0.03439199551939964,
0.0521848201751709,
-0.03800917789340019,
0.11700119823217392,
0.06590006500482559,
-0.08345098048448563,
-0.06807874888181686,
0.0175975039601326,
0.003156443126499653,
-0.03938686475157738,
0.0327407531440258,
0.019048361107707024,
0.08024822175502777,
0.0832405835390091,
-0.02149185724556446,
-0.006049121264368296,
0.025348937138915062,
0.018246455118060112,
-0.10764337331056595,
-0.050412267446517944,
0.2718082666397095,
0.020616887137293816,
-0.027171600610017776,
0.011015850119292736,
-0.07614173740148544,
-0.11734765022993088,
-0.11045970767736435,
-0.10609748214483261,
0.0021613615099340677,
-0.0068366979248821735,
-0.04050355404615402,
-0.042017169296741486,
-0.13733014464378357,
0.008333759382367134,
-0.1952890306711197,
0.12002930045127869,
0.05363365262746811,
0.0064865052700042725,
-0.14142963290214539,
0.11234150826931,
-0.03272245079278946,
-0.08911225199699402,
-0.06069693714380264,
-0.11580407619476318,
0.03762750327587128,
-0.11585191637277603,
-0.03708774596452713,
-0.2014506459236145,
-0.0306865181773901,
0.14289900660514832,
-0.04833275079727173,
-0.016829445958137512,
-0.03058636374771595,
0.03743154555559158,
0.05298281088471413,
0.14150650799274445,
-0.039263803511857986,
0.03432663902640343,
0.05896656587719917,
0.005296903662383556,
0.048197824507951736,
0.026798952370882034,
-0.11418941617012024,
-0.038174960762262344,
0.05648983269929886,
-0.029231034219264984,
-0.04922754690051079,
0.14475500583648682,
-0.05851750820875168,
-0.05601485073566437,
0.2123500108718872,
-0.042848359793424606,
-0.0624379999935627,
-0.06549040973186493,
-0.014637707732617855,
0.2233763337135315,
0.07406076788902283,
-0.019407158717513084,
0.019091404974460602,
0.0587230883538723,
-0.04840405657887459,
-0.01776058040559292,
-0.07260539382696152,
-0.10403928905725479,
0.00019872802658937871,
-0.11488218605518341,
0.08395228534936905,
-0.20746967196464539,
-0.23376253247261047,
0.07533124089241028,
0.03032865561544895,
-0.008590273559093475,
0.03477037325501442,
-0.07114046066999435,
-0.09213525801897049,
-0.020379377529025078,
0.029633665457367897,
-0.08332798629999161,
-0.007097494788467884,
0.06678768992424011,
0.05840980261564255,
0.16095104813575745,
-0.21748262643814087,
0.052245087921619415,
-0.0438443161547184,
0.012694990262389183,
-0.2595226764678955,
0.1374056339263916,
0.05040259286761284,
0.15243680775165558,
-0.045257534831762314,
-0.1489778459072113,
-0.1481141895055771,
0.016537627205252647,
0.059507135301828384,
0.19366446137428284,
-0.27594858407974243,
-0.0496964231133461,
-0.10164690017700195,
-0.15687134861946106,
-0.0662381500005722,
0.09664881229400635,
-0.07904182374477386,
0.1383499652147293,
0.11970287561416626,
0.06254013627767563,
-0.01337435096502304,
-0.1416528970003128,
-0.010402070358395576,
-0.004503499250859022,
-0.25605064630508423,
0.010615664534270763,
-0.07940517365932465,
0.20062409341335297,
0.0497758686542511,
0.01656428910791874,
0.07364168763160706,
0.007149938493967056,
-0.13553492724895477,
-0.06940347701311111,
-0.009700230322778225,
-0.016474144533276558,
0.04571349173784256,
0.060930561274290085,
0.0736248642206192,
-0.03241461515426636,
0.03251853585243225,
0.18230997025966644,
0.07365044206380844,
-0.03095880150794983,
0.0007719223503954709,
-0.16064107418060303,
0.06744658201932907,
-0.06602830439805984,
0.018964577466249466,
-0.13937628269195557,
-0.007526135537773371,
-0.05703746899962425,
0.1716517210006714,
0.03615817055106163,
0.06783974170684814,
0.08292601257562637,
0.023346994072198868,
-0.023025041446089745,
-0.05369191989302635,
-0.04666104167699814,
0.05578991770744324,
-0.04030374065041542,
-0.09989576786756516,
0.0015785744180902839,
-0.0639134868979454,
-0.03480805456638336,
-0.024755118414759636,
0.013337395153939724,
0.14845579862594604,
0.090212382376194,
0.04672927409410477,
0.06703481823205948,
-0.021755358204245567,
0.054186854511499405,
-0.09703034907579422,
-0.08080070465803146,
0.02938821353018284,
-0.029667062684893608,
0.02443883754312992,
-0.050820186734199524,
0.05622514709830284,
0.145776629447937,
0.16338810324668884,
-0.29869458079338074,
-0.11103741079568863,
0.05837295204401016,
-0.030472641810774803,
0.025139806792140007,
0.08776428550481796,
-0.06747955083847046,
0.054691292345523834,
-0.06224576756358147,
0.08594454824924469,
-0.0074699013493955135,
0.02743127942085266,
0.04555979743599892,
0.04025675356388092,
-0.0903191938996315,
0.036304838955402374,
0.0625736340880394,
-0.22131386399269104,
0.07340673357248306,
0.09416653960943222,
0.0136810801923275,
0.006120902020484209,
0.0033342635724693537,
-0.041891127824783325,
0.007706103380769491,
0.0016796037089079618,
0.021000951528549194,
0.12654592096805573,
-0.14001573622226715,
-0.03961413353681564,
0.012099241837859154,
-0.037044327706098557,
0.03541570156812668,
-0.1144525483250618,
0.001987183466553688,
0.03376813605427742,
-0.01628081314265728,
-0.09778664261102676,
0.057097841054201126,
-0.013855990022420883,
0.06977476924657822,
0.04352105036377907,
-0.06281305104494095,
0.09053579717874527,
0.062088653445243835,
0.00944282952696085,
0.098436139523983,
-0.05689422786235809,
-0.2212027907371521,
-0.07481728494167328,
-0.0711602121591568,
0.03151201829314232,
0.11132034659385681,
0.07654719054698944,
-0.0895218551158905,
-0.04764441028237343,
-0.012991821393370628,
0.028703194111585617,
-0.056604061275720596,
0.05480042099952698,
0.029047416523098946,
0.014938442036509514,
-0.05271606892347336,
-0.08078604936599731,
-0.0312491487711668,
-0.09343599528074265,
0.009828724898397923,
0.1821947693824768,
-0.1407618224620819,
0.12583379447460175,
0.04937155172228813,
-0.06971040368080139,
0.039904192090034485,
-0.06913914531469345,
0.22432541847229004,
-0.0911492332816124,
0.10180139541625977,
0.21873122453689575,
0.018897591158747673,
0.01583736017346382,
0.08203580230474472,
-0.01942852884531021,
-0.13819994032382965,
0.01964297518134117,
-0.034870781004428864,
-0.12009799480438232,
-0.06557981669902802,
-0.05961128696799278,
-0.10857071727514267,
-0.03551872447133064,
0.07079137116670609,
0.04351181536912918,
0.16616812348365784,
0.06925052404403687,
0.053094614297151566,
0.14326028525829315,
0.011572537012398243,
0.10288403928279877,
0.2981162965297699,
-0.0457213819026947,
0.09430423378944397,
-0.09210281074047089,
-0.11884574592113495,
0.07869133353233337,
0.04481949657201767,
0.24309562146663666,
-0.0023307218216359615,
-0.07693378627300262,
0.06045283377170563,
0.003967632073909044,
0.05843765661120415,
0.1634977161884308,
-0.02727382630109787,
-0.034888315945863724,
-0.06325605511665344,
-0.018179941922426224,
-0.010846991091966629,
0.04688932001590729,
-0.03167129307985306,
-0.055556152015924454,
-0.024940717965364456,
0.05761506408452988,
0.06465496867895126,
0.17145971953868866,
0.07574062049388885,
-0.4385678470134735,
-0.10969173163175583,
-0.01804168149828911,
0.06440842151641846,
-0.02586038038134575,
-0.02578135021030903,
-0.03861786052584648,
0.0616716630756855,
0.055656030774116516,
-0.002968337619677186,
0.05720307305455208,
-0.10403329133987427,
0.05056864768266678,
0.03657938539981842,
0.025183076038956642,
0.012990226037800312,
-0.05509524419903755,
-0.2990990877151489,
0.1645917147397995,
-0.0050272708758711815,
0.09421338886022568,
0.006636031903326511,
-0.014816390350461006,
0.03776320442557335,
0.2698597013950348,
0.08606081455945969,
-0.02754274569451809,
0.059676919132471085,
-0.20130643248558044,
0.011296670883893967,
-0.02349310740828514,
0.05089781433343887,
0.08969391137361526,
0.01605340652167797,
0.05001645162701607,
0.012347588315606117,
0.044755805283784866,
0.09832695126533508,
-0.10243609547615051,
-0.11046236753463745,
0.015077958814799786,
-0.05510875582695007,
0.06048054248094559,
0.017383532598614693,
-0.0779014453291893,
-0.15640868246555328,
0.037573110312223434,
0.03028314746916294,
-0.07634104043245316,
-0.1334877461194992,
0.08365932106971741,
-0.0042656585574150085,
-0.013379870913922787,
0.09065122902393341,
-0.0008057297673076391,
0.00696431752294302,
-0.02187538892030716,
-0.1443559229373932,
0.08624692261219025,
-0.09867440164089203,
0.020071417093276978,
-0.01769024133682251,
0.012223524041473866,
0.0162127073854208,
-0.03826802968978882,
0.042878657579422,
-0.004637328442186117,
-0.006629432551562786,
-0.008978867903351784,
0.0252592284232378,
-0.013028392568230629,
0.11947258561849594,
-0.015248900279402733,
-0.09418857842683792,
-0.015229997225105762,
-0.04365220665931702,
0.0736108273267746,
0.018652699887752533,
0.06977320462465286,
-0.013240735046565533,
0.014412366785109043,
-0.04310562089085579,
0.015761665999889374,
-0.2209467589855194,
0.09090358763933182,
-0.020319046452641487,
0.03849276900291443,
-0.06284531205892563,
-0.15137648582458496,
0.011098022572696209,
0.06796012818813324,
0.051881302148103714,
0.09176811575889587,
-0.16541524231433868,
-0.05559588596224785,
0.07494745403528214,
0.05327026918530464,
0.2581307590007782,
-0.057110048830509186,
0.04846663400530815,
-0.06325630098581314,
-0.17962788045406342,
-0.004954305477440357,
0.06266055256128311,
0.07407145202159882,
-0.01391194574534893,
0.06080053001642227,
0.015100553631782532,
-0.02187221497297287,
0.029897689819335938,
0.0753425881266594,
0.1286320835351944,
-0.06084033474326134,
-0.16226641833782196,
0.1458069086074829,
-0.02956424281001091,
0.05127638578414917,
0.22882196307182312,
0.09536948800086975,
0.002714781556278467,
-0.0386759415268898,
-0.021087557077407837,
0.04392301291227341,
0.045745231211185455,
-0.0789259672164917,
-0.11502109467983246,
0.011111914180219173,
0.004031797405332327,
-0.005619541276246309,
0.07071997970342636,
-0.00014559603005181998,
-0.14724722504615784,
-0.0672626793384552,
0.09240861237049103,
-0.14510208368301392,
0.002630597911775112,
-0.003819923847913742,
-0.053812045603990555,
0.08026764541864395,
-0.15470877289772034,
0.06048600748181343,
0.13115926086902618,
-0.011553012765944004,
0.09214407205581665,
0.1310434490442276,
0.05379410460591316,
0.0982566550374031,
0.1164773479104042,
-0.06788498163223267,
-0.029436998069286346,
-0.009133681654930115,
-0.12070303410291672,
0.1544875204563141,
0.20135238766670227,
0.10746229439973831,
-0.03841094300150871,
0.03804668411612511,
-0.01410333625972271,
-0.024660222232341766,
-0.05512756481766701,
0.06332693994045258,
-0.04178250581026077,
-0.03601707145571709,
-0.0766269862651825,
-0.004325288813561201,
-0.033804651349782944,
-0.15413163602352142,
-0.018802804872393608,
-0.06147937849164009,
-0.13600000739097595,
-0.062837153673172,
-0.13503234088420868,
0.04676419124007225,
-0.12565787136554718,
-0.019619565457105637,
-0.08258313685655594,
-0.10047628730535507,
0.10360080003738403,
0.09920471161603928,
0.08133701235055923,
0.07463576644659042,
-0.09396225214004517,
0.11740289628505707,
-0.15139812231063843,
-0.030788946896791458,
-0.023067429661750793,
0.1359705626964569,
-0.1559755802154541,
0.07896801084280014,
-0.07270418107509613,
0.0941973328590393,
-0.10787511616945267,
-0.033199068158864975,
-0.005975019186735153,
0.04827112704515457,
-0.026663200929760933,
-0.026125799864530563,
-0.06616082787513733,
-0.020571285858750343,
0.033713411539793015,
-0.030721202492713928,
-0.03993199020624161,
0.0196650680154562,
-0.03714826703071594,
0.06528580188751221,
0.0704033300280571,
0.009434503503143787,
0.0021789041347801685,
-0.05811513960361481,
-0.045224323868751526,
-0.06755917519330978,
0.11079344898462296,
0.07935130596160889,
-0.06689285486936569,
-0.006354701239615679,
0.0036575286649167538,
-0.10055093467235565,
0.17911455035209656,
0.05744646489620209,
0.058653056621551514,
0.07870282232761383,
0.005092554725706577,
0.06837684661149979,
-0.03718520328402519,
-0.015126311220228672,
0.014513978734612465,
-0.015983231365680695,
0.11136282980442047,
-0.05606405436992645,
-0.07535546272993088,
-0.06028926745057106,
0.022035589441657066,
0.11863309890031815,
0.09095551818609238,
0.14569218456745148,
-0.05954679474234581,
-0.048313599079847336,
-0.061062686145305634,
-0.03171415254473686,
0.02786017209291458,
-0.1147635281085968,
-0.047843802720308304,
-0.09695394337177277,
0.044372864067554474,
-0.014502439647912979,
0.10805383324623108,
-0.010345828719437122,
-0.026071954518556595,
0.016230149194598198,
0.012647723779082298,
-0.12567336857318878,
0.020970283076167107,
0.24802187085151672,
0.050926513969898224,
-0.028735361993312836,
-0.10914794355630875,
0.1037290170788765,
0.09200812131166458,
0.08984040468931198,
0.047397881746292114,
0.15909379720687866,
-0.09301617741584778,
0.09405980259180069,
0.05049574747681618,
0.13948214054107666,
-0.2150818258523941,
-0.01967703551054001,
-0.0422971211373806,
0.017776785418391228,
0.05959977209568024,
-0.042191315442323685,
0.15022966265678406,
-0.024295596405863762,
0.03529033064842224,
0.030098704621195793,
-0.0378984734416008,
0.020721785724163055,
-0.05483999848365784,
-0.08378946781158447,
-0.12307892739772797,
0.012800662778317928,
-0.08557405322790146,
-0.028459038585424423,
-0.004314325284212828,
0.09866803884506226,
-0.05442605912685394,
0.11973287910223007,
0.014312018640339375,
0.02741711027920246,
0.06542432308197021,
-0.035533420741558075,
-0.10812204331159592,
-0.007849159650504589,
-0.0772530734539032,
-0.02540181204676628,
0.03694755956530571,
-0.016949256882071495,
-0.034015219658613205,
0.06259548664093018,
-0.04730667173862457,
-0.023548971861600876,
-0.015465278178453445,
-0.018397971987724304,
0.08192998170852661,
-0.07661380618810654,
-0.058808472007513046,
0.00039303835364989936,
-0.0842084065079689,
0.05156547948718071,
-0.06288004666566849,
0.008501065894961357,
-0.20324580371379852,
-0.1305951178073883,
0.2554410696029663,
0.055301010608673096,
0.09742273390293121,
-0.010190829634666443,
-0.05107896775007248,
-0.01334945484995842,
0.13280199468135834,
0.18304800987243652,
-0.10532945394515991,
-0.04396335780620575,
-0.01985890418291092,
0.018326273187994957,
-0.06476356834173203,
0.0014825118705630302,
0.086652010679245,
0.09223967045545578,
0.027626443654298782,
-0.09599748998880386,
-0.09794165939092636,
0.0633435994386673,
0.0045919218100607395,
-0.021647868677973747,
0.06334938108921051,
-0.03499718755483627,
-0.09829308092594147,
0.07421427965164185,
-0.03225494548678398,
-0.06299201399087906,
0.21547062695026398,
-0.06798209995031357,
-0.04383837804198265,
0.03455084562301636,
-0.03324022889137268,
-0.010708102956414223,
0.0745142474770546,
-0.10018520057201385,
-0.034002020955085754,
0.04461606219410896,
0.0016106239054352045,
-0.0922653079032898,
0.08397090435028076,
0.10755598545074463,
0.036654066294431686,
-0.007845702581107616,
-0.06318783015012741,
0.06357603520154953,
0.05741296708583832,
0.08709001541137695,
-0.0470576174557209,
0.0913042351603508,
-0.08079572021961212,
-0.0196192879229784,
0.014171140268445015,
0.029078885912895203,
-0.021610116586089134,
-0.07135048508644104,
-0.0031737973913550377,
-0.2315816432237625,
-0.005697313696146011,
0.06526821106672287,
-0.08595051616430283,
-0.0788470059633255,
0.024912964552640915,
-0.0366542749106884,
0.04669281840324402,
0.01142018660902977,
0.05196122080087662,
0.05997330695390701,
-0.06765811890363693,
0.03933468833565712,
0.10055587440729141,
-0.04345432296395302,
-0.0848245620727539,
-0.08038033545017242,
-0.03900504484772682,
-0.12445364892482758,
-0.03372836485505104,
0.006875959224998951,
-0.10614568740129471,
-0.058060403913259506,
-0.016094336286187172,
-0.050652723759412766,
0.08014657348394394,
0.14203014969825745,
0.03551705181598663,
-0.015613583847880363,
0.11844470351934433,
-0.01592005044221878,
0.11757391691207886,
-0.05814701318740845,
-0.046954795718193054
] |
null | null |
transformers
|
# deit_small_patch16_224
Implementation of DeiT proposed in [Training data-efficient image
transformers & distillation through
attention](https://arxiv.org/pdf/2010.11929.pdf)
An attention-based distillation is proposed, where a new token, the
`dist` token, is added to the model.

``` {.sourceCode .}
DeiT.deit_tiny_patch16_224()
DeiT.deit_small_patch16_224()
DeiT.deit_base_patch16_224()
DeiT.deit_base_patch16_384()
```
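
A minimal usage sketch, assuming the import path `from glasses.models import DeiT` (adjust to your installed version of glasses):

``` {.sourceCode .}
import torch
from glasses.models import DeiT  # assumed import path

# build the small variant and run a single forward pass on a dummy 224x224 RGB image
model = DeiT.deit_small_patch16_224()
model.eval()
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)
```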
|
{}
| null |
glasses/deit_small_patch16_224
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# deit_small_patch16_224
Implementation of DeiT proposed in Training data-efficient image
transformers & distillation through
attention
An attention-based distillation is proposed, where a new token, the
`dist` token, is added to the model.
!image
|
[
"# deit_small_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# deit_small_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
29,
71
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# deit_small_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
-0.1222398579120636,
-0.10376303642988205,
-0.003458722261711955,
0.13944292068481445,
0.09439133107662201,
0.0017437327187508345,
0.08597845584154129,
0.03708392009139061,
-0.13609038293361664,
0.0012937354622408748,
0.09011849761009216,
0.20906271040439606,
-0.028399072587490082,
0.12452343851327896,
-0.08295151591300964,
-0.31520402431488037,
-0.0005036798538640141,
0.1621406227350235,
-0.021265503019094467,
0.041383273899555206,
0.14029695093631744,
-0.1596992015838623,
0.10745816677808762,
-0.01612440124154091,
-0.1844092160463333,
0.0061318702064454556,
-0.06863225996494293,
-0.0653790757060051,
0.14341123402118683,
0.02303651161491871,
0.1302797496318817,
0.11136127263307571,
0.0705312192440033,
0.09978613257408142,
0.03815587982535362,
-0.02277609519660473,
0.0019757437985390425,
0.0431215763092041,
-0.03353438153862953,
0.06541020423173904,
0.08673644065856934,
-0.07111743092536926,
0.045195989310741425,
-0.05611567199230194,
-0.1438620239496231,
0.004536976106464863,
-0.055518738925457,
0.04670727252960205,
0.05921902135014534,
0.08078428357839584,
0.04105941951274872,
0.07193008065223694,
-0.06888750195503235,
0.10879742354154587,
0.09160386770963669,
-0.06716662645339966,
-0.06567354500293732,
-0.004364247899502516,
0.018790876492857933,
-0.04105491191148758,
0.03519786521792412,
0.018963821232318878,
0.07625404745340347,
0.09190324693918228,
-0.024980364367365837,
-0.0028552510775625706,
0.012294190004467964,
0.019441276788711548,
-0.0984012708067894,
-0.05153188481926918,
0.277652770280838,
0.0025516008026897907,
-0.031089911237359047,
-0.0005361076910048723,
-0.061608247458934784,
-0.1275489181280136,
-0.11715490370988846,
-0.12006071209907532,
0.008844053372740746,
-0.012830942869186401,
-0.038220249116420746,
-0.0325516015291214,
-0.14067432284355164,
0.011060877703130245,
-0.20019973814487457,
0.14438727498054504,
0.048158980906009674,
0.0016600628150627017,
-0.16788765788078308,
0.08309190720319748,
-0.027600808069109917,
-0.07581961154937744,
-0.07890050858259201,
-0.11221173405647278,
0.06544514745473862,
-0.11589927971363068,
-0.038976117968559265,
-0.18067361414432526,
-0.017633670940995216,
0.13845524191856384,
-0.05810760706663132,
-0.029581615701317787,
-0.005444598849862814,
0.027101149782538414,
0.04527729004621506,
0.13672830164432526,
-0.022752251476049423,
0.043621063232421875,
0.05600990727543831,
0.008733494207262993,
0.04491638392210007,
0.029398223385214806,
-0.12359624356031418,
-0.0452292300760746,
0.04579120874404907,
-0.0173749141395092,
-0.052657924592494965,
0.1309898942708969,
-0.0581800751388073,
-0.04971058666706085,
0.21654126048088074,
-0.039512667804956436,
-0.05011463537812233,
-0.05074941739439964,
-0.013681085780262947,
0.23938648402690887,
0.047001734375953674,
-0.03563004732131958,
0.01637778803706169,
0.07077345252037048,
-0.048584792762994766,
-0.017893541604280472,
-0.0659218579530716,
-0.10784221440553665,
0.005462201777845621,
-0.10910752415657043,
0.08244280517101288,
-0.21314574778079987,
-0.21191945672035217,
0.08632855862379074,
0.006930522620677948,
-0.012888246215879917,
0.04010945186018944,
-0.0664231926202774,
-0.09996672719717026,
-0.01817195862531662,
0.03765486925840378,
-0.09726974368095398,
-0.008775812573730946,
0.06229488179087639,
0.06125815957784653,
0.1644936352968216,
-0.221111461520195,
0.05126861482858658,
-0.04957123100757599,
0.007188684772700071,
-0.23212429881095886,
0.12212371081113815,
0.06906931847333908,
0.14974543452262878,
-0.04057995229959488,
-0.15633493661880493,
-0.16143035888671875,
0.014052538201212883,
0.05715031921863556,
0.19764210283756256,
-0.29996350407600403,
-0.05652565136551857,
-0.09508193284273148,
-0.1586357206106186,
-0.0712561160326004,
0.10658025741577148,
-0.06441553682088852,
0.13915576040744781,
0.12192948162555695,
0.04900301247835159,
-0.009005432017147541,
-0.13678334653377533,
-0.03148695454001427,
-0.0011172982631251216,
-0.2586049437522888,
0.00007246427412610501,
-0.06984651833772659,
0.19843874871730804,
0.0697413757443428,
0.01422867365181446,
0.0890389159321785,
0.002093781717121601,
-0.13644737005233765,
-0.06320413947105408,
-0.008627550676465034,
-0.013783389702439308,
0.04775018244981766,
0.05593494325876236,
0.07786023616790771,
-0.04217628762125969,
0.04431195557117462,
0.18299716711044312,
0.06772833317518234,
-0.02195790782570839,
-0.004902162589132786,
-0.17406758666038513,
0.08588584512472153,
-0.06203235685825348,
0.017229070886969566,
-0.12868335843086243,
0.012891147285699844,
-0.048116229474544525,
0.16502824425697327,
0.021389884874224663,
0.07717776298522949,
0.08998605608940125,
0.016308380290865898,
-0.010367557406425476,
-0.030337423086166382,
-0.04719763621687889,
0.05101499333977699,
-0.044328510761260986,
-0.09205758571624756,
0.01072495337575674,
-0.06157975643873215,
-0.04149645194411278,
-0.03879403695464134,
0.005476113874465227,
0.16857792437076569,
0.06970687210559845,
0.04366243630647659,
0.07379481196403503,
-0.03506482020020485,
0.05376521125435829,
-0.10271774977445602,
-0.07646457850933075,
0.028787575662136078,
-0.029443057253956795,
0.029527124017477036,
-0.05117086321115494,
0.04451567679643631,
0.14988189935684204,
0.16079291701316833,
-0.27254390716552734,
-0.10116870701313019,
0.04557943716645241,
-0.03053564764559269,
0.03219814971089363,
0.08641599863767624,
-0.06748911738395691,
0.04302923381328583,
-0.06466227024793625,
0.08642448484897614,
-0.014061586931347847,
0.024914097040891647,
0.05527004972100258,
0.040803439915180206,
-0.08539076894521713,
0.05029846727848053,
0.06463518738746643,
-0.21526266634464264,
0.07345134764909744,
0.0994667187333107,
0.02440141886472702,
0.006049774121493101,
0.011250494047999382,
-0.047190241515636444,
0.008466814644634724,
0.0013091558357700706,
0.03480467200279236,
0.12159229815006256,
-0.12694227695465088,
-0.03251305967569351,
0.021398097276687622,
-0.039958398789167404,
0.026662172749638557,
-0.10848517715930939,
-0.0005779392668046057,
0.04426465556025505,
-0.015529279597103596,
-0.08948151022195816,
0.0640292838215828,
-0.005032526794821024,
0.07520726323127747,
0.04981808364391327,
-0.05617690086364746,
0.09359025210142136,
0.07238055765628815,
0.014565277844667435,
0.10176938027143478,
-0.04888547956943512,
-0.24415446817874908,
-0.07157926261425018,
-0.05666767805814743,
0.050431087613105774,
0.11227162927389145,
0.08356499671936035,
-0.1005246713757515,
-0.049317143857479095,
-0.016281500458717346,
0.040204208344221115,
-0.06824787706136703,
0.056055109947919846,
0.025026632472872734,
0.024323884397745132,
-0.06150776520371437,
-0.08224881440401077,
-0.031307823956012726,
-0.08987187594175339,
0.014414936304092407,
0.1827620416879654,
-0.1385498195886612,
0.11614859104156494,
0.036694418638944626,
-0.07773143798112869,
0.0395362563431263,
-0.06782321631908417,
0.21590474247932434,
-0.08431515097618103,
0.08883205056190491,
0.2304966300725937,
0.028397444635629654,
0.009808208793401718,
0.06783763319253922,
-0.019186245277523994,
-0.13773058354854584,
0.012388702481985092,
-0.04204096645116806,
-0.12332026660442352,
-0.06435234099626541,
-0.06666439026594162,
-0.10893303900957108,
-0.019397152587771416,
0.07002153992652893,
0.052389904856681824,
0.154684916138649,
0.06533209979534149,
0.051946673542261124,
0.14211130142211914,
-0.009976946748793125,
0.09952814131975174,
0.30733269453048706,
-0.03386798873543739,
0.09092772752046585,
-0.09738540649414062,
-0.11950647085905075,
0.09036571532487869,
0.01445011142641306,
0.23768125474452972,
0.002873534569516778,
-0.05808073282241821,
0.05842342600226402,
-0.016693513840436935,
0.06688933819532394,
0.17217110097408295,
-0.02459411509335041,
-0.0425746813416481,
-0.06860367953777313,
-0.02760259062051773,
-0.009579090401530266,
0.0409964993596077,
-0.03327096626162529,
-0.04992715269327164,
-0.03575066104531288,
0.08640507608652115,
0.061298660933971405,
0.1687793880701065,
0.07153194397687912,
-0.44467273354530334,
-0.12679551541805267,
-0.029667804017663002,
0.06417284160852432,
-0.017461910843849182,
-0.02574039436876774,
-0.014029262587428093,
0.06757642328739166,
0.03611011058092117,
0.006117742974311113,
0.05097814276814461,
-0.08885432034730911,
0.06340506672859192,
0.03460610285401344,
0.029364973306655884,
0.024966828525066376,
-0.05305657163262367,
-0.30793964862823486,
0.15242670476436615,
-0.0037590281572192907,
0.09742455929517746,
-0.012433753348886967,
-0.017858773469924927,
0.03360825404524803,
0.24834531545639038,
0.09212668985128403,
-0.022652165964245796,
0.06976482272148132,
-0.19894792139530182,
0.0015406855382025242,
-0.019199073314666748,
0.05690249428153038,
0.09538348019123077,
0.013339285738766193,
0.03677092865109444,
0.01873752288520336,
0.04115808382630348,
0.09450100362300873,
-0.09047988057136536,
-0.1145685613155365,
0.017889168113470078,
-0.04131115972995758,
0.030668066814541817,
0.0207594633102417,
-0.08399371057748795,
-0.1586371511220932,
0.028191814199090004,
0.03305554762482643,
-0.06554694473743439,
-0.12619416415691376,
0.08037851750850677,
-0.0021886026952415705,
-0.01596585288643837,
0.09319107234477997,
0.0028327899053692818,
0.012227571569383144,
-0.02569517306983471,
-0.1343594342470169,
0.08308596163988113,
-0.10685303807258606,
0.0007958924979902804,
-0.010757852345705032,
0.007209784351289272,
0.019337553530931473,
-0.028671184554696083,
0.04642269387841225,
-0.005619833245873451,
-0.00944742001593113,
-0.009911512024700642,
0.007137186359614134,
0.0024958262220025063,
0.1094050481915474,
-0.022733798250555992,
-0.09070448577404022,
-0.009029656648635864,
-0.03159268572926521,
0.06252376735210419,
0.029501406475901604,
0.08648523688316345,
-0.02418331429362297,
0.021105904132127762,
-0.05447978153824806,
0.022132745012640953,
-0.22850339114665985,
0.08562227338552475,
-0.03798038512468338,
0.03899623453617096,
-0.052863363176584244,
-0.13154944777488708,
0.0213253665715456,
0.08236459642648697,
0.05550144612789154,
0.09718865901231766,
-0.18938319385051727,
-0.0494140163064003,
0.06126590073108673,
0.04104715213179588,
0.2724871337413788,
-0.057379111647605896,
0.04767454043030739,
-0.06493561714887619,
-0.16004160046577454,
0.007677375338971615,
0.07294122874736786,
0.07391594350337982,
-0.02949993871152401,
0.05198519676923752,
0.017509080469608307,
-0.019788283854722977,
0.03289121389389038,
0.061925433576107025,
0.11395271867513657,
-0.063756562769413,
-0.17404738068580627,
0.1464676707983017,
-0.02443818561732769,
0.04099015146493912,
0.2047680765390396,
0.08635400235652924,
0.017724381759762764,
-0.04484944045543671,
-0.022990265861153603,
0.05384064465761185,
0.04074528440833092,
-0.0728338286280632,
-0.12900519371032715,
0.007296636234968901,
-0.00021799324895255268,
-0.005946462973952293,
0.04813871532678604,
-0.0019607690628618,
-0.15811635553836823,
-0.09645720571279526,
0.11810705065727234,
-0.15284503996372223,
0.004353522788733244,
0.007868909277021885,
-0.04639902338385582,
0.07452493160963058,
-0.1364794820547104,
0.06405637413263321,
0.13277336955070496,
-0.005204484798014164,
0.07884252071380615,
0.1351950764656067,
0.06759956479072571,
0.09342917054891586,
0.11596013605594635,
-0.05402975529432297,
-0.05071123689413071,
-0.0019065935630351305,
-0.09771238267421722,
0.14749515056610107,
0.18972881138324738,
0.09985686093568802,
-0.03781973943114281,
0.03085308149456978,
-0.021909892559051514,
-0.022695817053318024,
-0.04667244851589203,
0.06358390301465988,
-0.059646666049957275,
-0.031884800642728806,
-0.07751620560884476,
-0.01760418713092804,
-0.033240772783756256,
-0.17978069186210632,
-0.01819775253534317,
-0.04954485595226288,
-0.1302284151315689,
-0.0652536004781723,
-0.1117904856801033,
0.03464193269610405,
-0.11844974011182785,
-0.02665332704782486,
-0.08630374819040298,
-0.11050310730934143,
0.11223049461841583,
0.09166885167360306,
0.07781160622835159,
0.0746496245265007,
-0.089942567050457,
0.1162579134106636,
-0.1604137122631073,
-0.0280957892537117,
-0.029216885566711426,
0.12878994643688202,
-0.1543925404548645,
0.08356860280036926,
-0.08784171938896179,
0.09471023082733154,
-0.09996449947357178,
-0.02288081869482994,
-0.0029552211053669453,
0.041271086782217026,
-0.032866187393665314,
-0.001351493177935481,
-0.07252472639083862,
-0.023964570835232735,
0.028844472020864487,
-0.01357865147292614,
-0.040417417883872986,
0.015600938349962234,
-0.03850068524479866,
0.06505594402551651,
0.07261722534894943,
0.02496895007789135,
0.01936371996998787,
-0.051534123718738556,
-0.046993475407361984,
-0.06865797936916351,
0.11720170080661774,
0.07676719129085541,
-0.07530303299427032,
-0.01829463429749012,
0.01716059260070324,
-0.10502990335226059,
0.18172940611839294,
0.06357800960540771,
0.0674876719713211,
0.08281280100345612,
0.005794237367808819,
0.0831882432103157,
-0.03227248042821884,
-0.008353670127689838,
0.013573935255408287,
-0.013001829385757446,
0.12429940700531006,
-0.05429615080356598,
-0.0808933898806572,
-0.05539550259709358,
0.006818411406129599,
0.11193379014730453,
0.0777943804860115,
0.14353184401988983,
-0.057684045284986496,
-0.047291968017816544,
-0.05562862381339073,
-0.03232854604721069,
0.019219838082790375,
-0.10761647671461105,
-0.057056132704019547,
-0.08282934874296188,
0.04885715991258621,
-0.005743100773543119,
0.1093539372086525,
-0.005055271554738283,
-0.03207411989569664,
0.02017253264784813,
0.011717204935848713,
-0.13282383978366852,
0.014124459587037563,
0.24852308630943298,
0.05606758967041969,
-0.0357857272028923,
-0.1085687130689621,
0.0910239964723587,
0.09871017187833786,
0.11580785363912582,
0.08113320171833038,
0.1828540712594986,
-0.09199674427509308,
0.10191135108470917,
0.04551687464118004,
0.12864527106285095,
-0.22745968401432037,
-0.026436420157551765,
-0.03791221231222153,
0.016745230183005333,
0.05555523931980133,
-0.044545359909534454,
0.15622536838054657,
-0.012474813498556614,
0.03445342928171158,
0.008826195262372494,
-0.029690193012356758,
0.015704207122325897,
-0.053012534976005554,
-0.0926591008901596,
-0.10539357364177704,
0.003993397578597069,
-0.08300106227397919,
-0.02890476956963539,
-0.009094768203794956,
0.0993543490767479,
-0.0575551837682724,
0.11810359358787537,
0.0025456564035266638,
0.03088691458106041,
0.05550975352525711,
-0.04432305321097374,
-0.10159973800182343,
-0.012783011421561241,
-0.07865143567323685,
-0.03471161425113678,
0.03533262386918068,
-0.009757758118212223,
-0.02803735062479973,
0.0670977383852005,
-0.06045793741941452,
-0.031081702560186386,
-0.012950081378221512,
-0.018467135727405548,
0.07798554003238678,
-0.07903769612312317,
-0.062357500195503235,
0.0052008056081831455,
-0.08962304890155792,
0.05221003666520119,
-0.07745791226625443,
0.009869282133877277,
-0.22132395207881927,
-0.10759655386209488,
0.25650888681411743,
0.05245985463261604,
0.09794756770133972,
-0.007804423570632935,
-0.04292749986052513,
-0.025627560913562775,
0.14516378939151764,
0.18727585673332214,
-0.09907244145870209,
-0.03848765417933464,
-0.025253288447856903,
0.017364637926220894,
-0.07853592932224274,
-0.002823988674208522,
0.08820580691099167,
0.11661910265684128,
0.031716424971818924,
-0.09882725030183792,
-0.1020195335149765,
0.08151187002658844,
-0.010459975339472294,
-0.01655205339193344,
0.06166675314307213,
-0.02399819903075695,
-0.0911092609167099,
0.0623958557844162,
-0.025791741907596588,
-0.04512853920459747,
0.21654771268367767,
-0.06969161331653595,
-0.04162305220961571,
0.03785215690732002,
-0.032494667917490005,
-0.010496041737496853,
0.08376969397068024,
-0.10252024233341217,
-0.05013372004032135,
0.011833579279482365,
-0.0042646173387765884,
-0.1108691543340683,
0.10966483503580093,
0.12200407683849335,
0.02765372022986412,
0.0028871612157672644,
-0.05797024816274643,
0.06327588111162186,
0.0676569864153862,
0.0944027230143547,
-0.06336752325296402,
0.08090761303901672,
-0.07987722009420395,
-0.014630637131631374,
0.022702716290950775,
0.028189711272716522,
-0.02278577908873558,
-0.07646400481462479,
0.0014714475255459547,
-0.24219489097595215,
-0.0025000388268381357,
0.09628894925117493,
-0.07797165215015411,
-0.07958691567182541,
0.023869475349783897,
-0.033088941127061844,
0.0393093004822731,
0.012681607156991959,
0.05276584252715111,
0.05049549788236618,
-0.06349742412567139,
0.04698079824447632,
0.10047277063131332,
-0.044721364974975586,
-0.09102246165275574,
-0.08505817502737045,
-0.03744500130414963,
-0.12813213467597961,
-0.02688119001686573,
-0.0030052789952605963,
-0.10159934312105179,
-0.06060868874192238,
-0.018191006034612656,
-0.05078570172190666,
0.06716248393058777,
0.144122913479805,
0.036490608006715775,
-0.01490633562207222,
0.12266317754983902,
-0.02068234793841839,
0.11901240795850754,
-0.05248161405324936,
-0.04388391599059105
] |
null | null |
transformers
|
# deit_tiny_patch16_224
Implementation of DeiT proposed in [Training data-efficient image
transformers & distillation through
attention](https://arxiv.org/pdf/2010.11929.pdf)
An attention-based distillation is proposed, where a new token, the
`dist` token, is added to the model.

``` {.sourceCode .}
DeiT.deit_tiny_patch16_224()
DeiT.deit_small_patch16_224()
DeiT.deit_base_patch16_224()
DeiT.deit_base_patch16_384()
```
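
A rough size comparison of the variants listed above, assuming the import path `from glasses.models import DeiT`:

``` {.sourceCode .}
from glasses.models import DeiT  # assumed import path

# count parameters for each DeiT classifier exposed by the factories above
variants = {
    "deit_tiny_patch16_224": DeiT.deit_tiny_patch16_224,
    "deit_small_patch16_224": DeiT.deit_small_patch16_224,
    "deit_base_patch16_224": DeiT.deit_base_patch16_224,
}
for name, factory in variants.items():
    n_params = sum(p.numel() for p in factory().parameters())
    print(f"{name}: {n_params / 1e6:.1f}M parameters")
```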
|
{}
| null |
glasses/deit_tiny_patch16_224
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# deit_tiny_patch16_224
Implementation of DeiT proposed in Training data-efficient image
transformers & distillation through
attention
An attention-based distillation is proposed, where a new token, the
`dist` token, is added to the model.
!image
|
[
"# deit_tiny_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# deit_tiny_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
29,
70
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# deit_tiny_patch16_224\n Implementation of DeiT proposed in Training data-efficient image\n transformers & distillation through\n attention\n\n An attention based distillation is proposed where a new token is added\n to the model, the [dist]{.title-ref} token.\n\n !image"
] |
[
-0.11589144915342331,
-0.10813921689987183,
-0.003708689473569393,
0.13982471823692322,
0.0997592881321907,
0.00037805692409165204,
0.08144062012434006,
0.042086776345968246,
-0.14986005425453186,
-0.003908088896423578,
0.08992928266525269,
0.2146308273077011,
-0.030866870656609535,
0.12440994381904602,
-0.07834052294492722,
-0.3212360739707947,
0.001848822459578514,
0.1630052775144577,
-0.019970715045928955,
0.04201038181781769,
0.1378871351480484,
-0.1572769433259964,
0.10400235652923584,
-0.017212219536304474,
-0.1906479150056839,
0.006262168753892183,
-0.06729866564273834,
-0.06511929631233215,
0.14626294374465942,
0.02382776141166687,
0.1398942619562149,
0.10217934101819992,
0.07257837802171707,
0.09027216583490372,
0.039393264800310135,
-0.008205835707485676,
0.00200918922200799,
0.04758399352431297,
-0.026458732783794403,
0.06307173520326614,
0.0834902822971344,
-0.06553570181131363,
0.046318087726831436,
-0.060491569340229034,
-0.1469941884279251,
0.002134612062945962,
-0.046608179807662964,
0.0503377839922905,
0.058954909443855286,
0.08345356583595276,
0.03922923654317856,
0.076247937977314,
-0.0527026429772377,
0.11234580725431442,
0.08924531191587448,
-0.06437020748853683,
-0.07205874472856522,
0.0026557277888059616,
0.013245168142020702,
-0.045608941465616226,
0.030048303306102753,
0.020799914374947548,
0.07631058990955353,
0.09278056025505066,
-0.01844683662056923,
-0.005600973032414913,
0.0058507672511041164,
0.01883171685039997,
-0.10217886418104172,
-0.05270897224545479,
0.2597223222255707,
0.008835043758153915,
-0.025771597400307655,
0.0003008292696904391,
-0.06694458425045013,
-0.12725523114204407,
-0.1169099509716034,
-0.10995867848396301,
0.00716003030538559,
-0.013869981281459332,
-0.048945724964141846,
-0.033355895429849625,
-0.1427525132894516,
0.008159085176885128,
-0.1982831358909607,
0.13318079710006714,
0.0492933914065361,
-0.0006486058700829744,
-0.1638941764831543,
0.09483566135168076,
-0.028118353337049484,
-0.0738835334777832,
-0.06982047855854034,
-0.11353377997875214,
0.058712296187877655,
-0.11112333834171295,
-0.038266539573669434,
-0.1936798393726349,
-0.021891053766012192,
0.12985631823539734,
-0.04797946661710739,
-0.024570796638727188,
-0.011873449198901653,
0.03277192637324333,
0.0496659092605114,
0.13348419964313507,
-0.03317869454622269,
0.050909943878650665,
0.05461268126964569,
0.009374476969242096,
0.048494648188352585,
0.02624896727502346,
-0.11246646195650101,
-0.03455464169383049,
0.0546550378203392,
-0.020942039787769318,
-0.04119221493601799,
0.1392579823732376,
-0.05213773995637894,
-0.05235940217971802,
0.1954568475484848,
-0.038732487708330154,
-0.061368897557258606,
-0.06371349841356277,
-0.00999089889228344,
0.23131322860717773,
0.05391887202858925,
-0.03245064988732338,
0.01270047016441822,
0.07811222225427628,
-0.05364150553941727,
-0.01913273148238659,
-0.0647675171494484,
-0.10779907554388046,
-0.00021934523829258978,
-0.10410594195127487,
0.08631283789873123,
-0.22036850452423096,
-0.20572488009929657,
0.0824815034866333,
0.016557004302740097,
-0.0075941807590425014,
0.03324899077415466,
-0.06405242532491684,
-0.10687638819217682,
-0.014161494560539722,
0.033893782645463943,
-0.08779594302177429,
-0.007912516593933105,
0.06465219706296921,
0.058469634503126144,
0.16255372762680054,
-0.21078111231327057,
0.04818597435951233,
-0.0470597967505455,
0.00915374606847763,
-0.23398156464099884,
0.12543515861034393,
0.061877597123384476,
0.145265132188797,
-0.042465247213840485,
-0.14819404482841492,
-0.15291030704975128,
0.01881852000951767,
0.053433023393154144,
0.18883013725280762,
-0.28481149673461914,
-0.05460966378450394,
-0.08622494339942932,
-0.1527443528175354,
-0.06532422453165054,
0.09757663309574127,
-0.06808827817440033,
0.13381871581077576,
0.12144394218921661,
0.06663905084133148,
-0.010185626335442066,
-0.13845594227313995,
-0.025892674922943115,
-0.00157643121201545,
-0.2604275643825531,
0.008123244158923626,
-0.07337185740470886,
0.20415155589580536,
0.05756635591387749,
0.013559217564761639,
0.06889565289020538,
0.006891047116369009,
-0.13649344444274902,
-0.06354726105928421,
-0.007544438820332289,
-0.014334483072161674,
0.043295323848724365,
0.06210080534219742,
0.07399248331785202,
-0.03956757113337517,
0.03851037472486496,
0.17849993705749512,
0.07462508231401443,
-0.02716529555618763,
-0.0010585981654003263,
-0.16865988075733185,
0.06654507666826248,
-0.066282257437706,
0.021276473999023438,
-0.13075552880764008,
-0.0031325698364526033,
-0.05144893750548363,
0.1497718095779419,
0.028808584436774254,
0.0832885280251503,
0.08776985853910446,
0.01665288768708706,
-0.017655624076724052,
-0.039851266890764236,
-0.0553242564201355,
0.05596906691789627,
-0.041565701365470886,
-0.09624991565942764,
0.01086685061454773,
-0.06369662284851074,
-0.04432479664683342,
-0.0357208251953125,
0.00218275748193264,
0.15602463483810425,
0.0708024725317955,
0.049421679228544235,
0.06926319003105164,
-0.030348319560289383,
0.050647299736738205,
-0.10409535467624664,
-0.081447072327137,
0.03415185585618019,
-0.02555428259074688,
0.041419826447963715,
-0.048112545162439346,
0.05080866068601608,
0.14894530177116394,
0.16550157964229584,
-0.28791898488998413,
-0.10853119194507599,
0.0587557852268219,
-0.031685955822467804,
0.030309021472930908,
0.08016625046730042,
-0.07264453172683716,
0.04542352631688118,
-0.06000128015875816,
0.08448605984449387,
-0.012078973464667797,
0.024181870743632317,
0.05637625977396965,
0.04039327800273895,
-0.08772322535514832,
0.04756908118724823,
0.06710110604763031,
-0.20578226447105408,
0.07332053035497665,
0.08728920668363571,
0.018219854682683945,
-0.0008347647963091731,
0.014827649109065533,
-0.041985418647527695,
0.013111534528434277,
-0.0008406483102589846,
0.035179827362298965,
0.11857906728982925,
-0.12598572671413422,
-0.035260844975709915,
0.018612157553434372,
-0.03556456044316292,
0.032807525247335434,
-0.1108204647898674,
-0.004197924397885799,
0.04432548210024834,
-0.01181891467422247,
-0.09024655818939209,
0.05796445533633232,
-0.014109853655099869,
0.07254951447248459,
0.0503990538418293,
-0.05737529322504997,
0.08838405460119247,
0.0701792761683464,
0.01976851560175419,
0.09565041959285736,
-0.04759180545806885,
-0.2520410418510437,
-0.06976143270730972,
-0.06136589124798775,
0.0444391667842865,
0.11697202175855637,
0.08596928417682648,
-0.10103999823331833,
-0.04937218129634857,
-0.013030962087213993,
0.039026256650686264,
-0.06237455829977989,
0.05598059296607971,
0.02350894920527935,
0.02872408926486969,
-0.0581662617623806,
-0.08132018148899078,
-0.02804056741297245,
-0.08935102075338364,
0.01639801822602749,
0.18284811079502106,
-0.13309121131896973,
0.12578728795051575,
0.04415557533502579,
-0.06679251044988632,
0.03639629855751991,
-0.06770825386047363,
0.21579985320568085,
-0.09247452765703201,
0.08856560289859772,
0.22718878090381622,
0.0311923585832119,
0.013348287902772427,
0.06990136206150055,
-0.021676689386367798,
-0.13647183775901794,
0.018089473247528076,
-0.030030062422156334,
-0.12372241914272308,
-0.06317449361085892,
-0.06582475453615189,
-0.10501143336296082,
-0.015478825196623802,
0.07490688562393188,
0.0513920821249485,
0.15912850201129913,
0.07099169492721558,
0.053466323763132095,
0.14323414862155914,
-0.00656535429880023,
0.10173901170492172,
0.30132609605789185,
-0.03706337511539459,
0.09452296793460846,
-0.09177426248788834,
-0.129999041557312,
0.08760799467563629,
0.022629793733358383,
0.23577848076820374,
-0.001463812543079257,
-0.06700661778450012,
0.05627915635704994,
-0.01871873065829277,
0.06202700361609459,
0.1705843061208725,
-0.021794021129608154,
-0.04206959530711174,
-0.0695931687951088,
-0.021427946165204048,
-0.008036991581320763,
0.03937236964702606,
-0.0302573274821043,
-0.054827068001031876,
-0.036091528832912445,
0.0810302272439003,
0.0646691843867302,
0.16616524755954742,
0.07617905735969543,
-0.4369320869445801,
-0.1174546480178833,
-0.028631512075662613,
0.059545230120420456,
-0.022815801203250885,
-0.02236974984407425,
-0.014369288459420204,
0.057565174996852875,
0.036271121352910995,
0.0007453642319887877,
0.053676530718803406,
-0.10699132084846497,
0.06442566961050034,
0.03474430367350578,
0.02648330107331276,
0.02186504192650318,
-0.05126103013753891,
-0.2928902804851532,
0.14788515865802765,
-0.006869623437523842,
0.09640677273273468,
-0.005888563115149736,
-0.021937968209385872,
0.03495686873793602,
0.2434244304895401,
0.09388287365436554,
-0.024750595912337303,
0.07546529173851013,
-0.2043873369693756,
0.0158231258392334,
-0.01701386645436287,
0.06218850985169411,
0.09629110991954803,
0.018192585557699203,
0.031403157860040665,
0.016791731119155884,
0.04718073457479477,
0.092553049325943,
-0.08880739659070969,
-0.10986336320638657,
0.020702648907899857,
-0.03554895892739296,
0.034008074551820755,
0.01879046857357025,
-0.08022909611463547,
-0.15252162516117096,
0.02552737295627594,
0.03133256360888481,
-0.06370218843221664,
-0.12765023112297058,
0.08086879551410675,
-0.004030528943985701,
-0.01665867492556572,
0.0945390984416008,
-0.0028282369021326303,
0.01703920029103756,
-0.02009120024740696,
-0.1378328949213028,
0.08065127581357956,
-0.10842310637235641,
0.00699215941131115,
-0.012396707199513912,
0.004463499411940575,
0.013512481935322285,
-0.03605538606643677,
0.0413009449839592,
-0.006821606773883104,
-0.01033035572618246,
-0.012009819969534874,
0.020917370915412903,
-0.00841610785573721,
0.09873706847429276,
-0.021084681153297424,
-0.08724162727594376,
-0.004515373148024082,
-0.0413794070482254,
0.06043799966573715,
0.02991962805390358,
0.07283288240432739,
-0.018564719706773758,
0.01276132557541132,
-0.04387405142188072,
0.025924069806933403,
-0.23904761672019958,
0.08369183540344238,
-0.026002295315265656,
0.028574161231517792,
-0.05577540025115013,
-0.13969843089580536,
0.026818284764885902,
0.07605873048305511,
0.05715123936533928,
0.1088324710726738,
-0.19147370755672455,
-0.0497673824429512,
0.06644366681575775,
0.045079685747623444,
0.2569742798805237,
-0.05675216391682625,
0.046699777245521545,
-0.06991824507713318,
-0.16694097220897675,
0.014287286438047886,
0.05843620002269745,
0.08193749934434891,
-0.023269589990377426,
0.05308135226368904,
0.01894248276948929,
-0.019424401223659515,
0.030273277312517166,
0.06656094640493393,
0.11946634948253632,
-0.06323035806417465,
-0.18158820271492004,
0.1474035084247589,
-0.023024193942546844,
0.038573261350393295,
0.21691550314426422,
0.0913098156452179,
0.024030284956097603,
-0.04246191680431366,
-0.028001701459288597,
0.0511869378387928,
0.0503125824034214,
-0.08359984308481216,
-0.13145995140075684,
0.007941685616970062,
0.00547201419249177,
-0.004037837963551283,
0.0605296827852726,
0.004275979474186897,
-0.1648891270160675,
-0.08696091920137405,
0.09301386773586273,
-0.15271039307117462,
-0.012471942231059074,
-0.0006925171473994851,
-0.04867833852767944,
0.07507656514644623,
-0.13799960911273956,
0.061860401183366776,
0.1300382912158966,
-0.007279777899384499,
0.07584408670663834,
0.13487593829631805,
0.05963991582393646,
0.09058552235364914,
0.11157623678445816,
-0.05888980254530907,
-0.04311208426952362,
-0.005290601402521133,
-0.09439777582883835,
0.15839652717113495,
0.19454346597194672,
0.10496753454208374,
-0.04020139202475548,
0.03582081198692322,
-0.019222157076001167,
-0.024121105670928955,
-0.04704318940639496,
0.07283367216587067,
-0.049660809338092804,
-0.035019125789403915,
-0.07720267027616501,
-0.007234130520373583,
-0.03222915902733803,
-0.17691534757614136,
-0.02019105851650238,
-0.051212895661592484,
-0.13129089772701263,
-0.0659501776099205,
-0.10757523030042648,
0.03517794609069824,
-0.127529114484787,
-0.0237442459911108,
-0.07496456056833267,
-0.10618903487920761,
0.10899196565151215,
0.08525664359331131,
0.08014152944087982,
0.07317400723695755,
-0.09246348589658737,
0.1178886666893959,
-0.16280128061771393,
-0.02873801812529564,
-0.028806418180465698,
0.1366414874792099,
-0.164671391248703,
0.08765389770269394,
-0.0777725949883461,
0.09593181312084198,
-0.1035136729478836,
-0.0264230165630579,
-0.006255935877561569,
0.04455608129501343,
-0.01600070297718048,
-0.0037985884118825197,
-0.06667499989271164,
-0.02396748773753643,
0.033451396971940994,
-0.01524821575731039,
-0.04438231140375137,
0.013725768774747849,
-0.036690276116132736,
0.058249231427907944,
0.06946767866611481,
0.017448455095291138,
0.0141024524345994,
-0.0539969876408577,
-0.04492828994989395,
-0.06637353450059891,
0.11116479337215424,
0.07343723624944687,
-0.07005182653665543,
-0.009768606163561344,
0.014054832980036736,
-0.10835234820842743,
0.17948904633522034,
0.06098904833197594,
0.0634898990392685,
0.07871068269014359,
0.00506808003410697,
0.07001767307519913,
-0.031196027994155884,
-0.010734092444181442,
0.026028957217931747,
-0.019387712702155113,
0.11264457553625107,
-0.0553877092897892,
-0.08193386346101761,
-0.05724191293120384,
0.011742300353944302,
0.11032082885503769,
0.08471556752920151,
0.1458740532398224,
-0.05514585226774216,
-0.045860227197408676,
-0.05569181218743324,
-0.028593722730875015,
0.01912214793264866,
-0.11769398301839828,
-0.03944147750735283,
-0.07898364961147308,
0.04898057505488396,
-0.015941904857754707,
0.10051623731851578,
-0.0012874086387455463,
-0.03151993826031685,
0.018325086683034897,
0.003563829930499196,
-0.14256024360656738,
0.016641411930322647,
0.2447020262479782,
0.060632701963186264,
-0.04016725718975067,
-0.11401131749153137,
0.09214047342538834,
0.10280203819274902,
0.10865934938192368,
0.07196123898029327,
0.17706778645515442,
-0.08880417793989182,
0.10063610970973969,
0.04328373074531555,
0.12712609767913818,
-0.23223747313022614,
-0.02428412437438965,
-0.03982416167855263,
0.017615078017115593,
0.05403991416096687,
-0.03532600775361061,
0.15325061976909637,
-0.009785466827452183,
0.03314082697033882,
0.010743741877377033,
-0.02891010418534279,
0.022276239469647408,
-0.05457357317209244,
-0.08918096125125885,
-0.10848240554332733,
0.000006071717052691383,
-0.08084289729595184,
-0.03375612944364548,
-0.0009857510449364781,
0.097493015229702,
-0.055284593254327774,
0.1315024346113205,
-0.0032978744711726904,
0.02417120710015297,
0.0615246444940567,
-0.03944611921906471,
-0.1097927838563919,
-0.008101152256131172,
-0.07401371747255325,
-0.036375343799591064,
0.04495399072766304,
-0.013950162567198277,
-0.029971323907375336,
0.05894427001476288,
-0.05227481201291084,
-0.02829415164887905,
-0.016301752999424934,
-0.016422633081674576,
0.08121201395988464,
-0.08141907304525375,
-0.074678435921669,
0.0015651594148948789,
-0.08780959993600845,
0.0512993223965168,
-0.0631963312625885,
0.008748620748519897,
-0.21077531576156616,
-0.11207274347543716,
0.256430983543396,
0.04980924353003502,
0.10243487358093262,
-0.011109285987913609,
-0.048937875777482986,
-0.031082313507795334,
0.15727598965168,
0.18823517858982086,
-0.10694368928670883,
-0.04375646635890007,
-0.018548252061009407,
0.01857234723865986,
-0.07595670968294144,
0.00836965162307024,
0.08292939513921738,
0.1124713271856308,
0.030333472415804863,
-0.09419180452823639,
-0.09995151311159134,
0.07663135975599289,
0.000026197398256044835,
-0.02058526687324047,
0.06686022877693176,
-0.022879086434841156,
-0.09833738952875137,
0.06716623157262802,
-0.030102450400590897,
-0.045938365161418915,
0.21436651051044464,
-0.05987019091844559,
-0.040264714509248734,
0.03993682190775871,
-0.03153518587350845,
-0.013724857941269875,
0.08770759403705597,
-0.10245780646800995,
-0.04398130252957344,
0.011878944002091885,
0.0016702382126823068,
-0.10372178256511688,
0.09874871373176575,
0.11591923981904984,
0.034941524267196655,
0.010075648315250874,
-0.06381775438785553,
0.055529505014419556,
0.0637948140501976,
0.09104236960411072,
-0.06433258950710297,
0.08645481616258621,
-0.07511161267757416,
-0.01653285324573517,
0.021715030074119568,
0.03512311726808548,
-0.028375770896673203,
-0.08079314976930618,
-0.002502215327695012,
-0.23932801187038422,
-0.00628427742049098,
0.0973895713686943,
-0.0672004446387291,
-0.07786140590906143,
0.020360685884952545,
-0.035562340170145035,
0.03752104192972183,
0.009679844602942467,
0.05353953689336777,
0.05025935918092728,
-0.06784408539533615,
0.0418231263756752,
0.10104800760746002,
-0.045746348798274994,
-0.08903439342975616,
-0.07977809756994247,
-0.03862356022000313,
-0.1284945160150528,
-0.029460981488227844,
-0.0014784669037908316,
-0.10045922547578812,
-0.055755965411663055,
-0.02030390501022339,
-0.047110479325056076,
0.07092005014419556,
0.13793541491031647,
0.03386253863573074,
-0.016622774302959442,
0.1363619714975357,
-0.022038601338863373,
0.1221260130405426,
-0.05531223118305206,
-0.05086033418774605
] |
null | null |
transformers
|
# densenet161
Implementation of DenseNet proposed in [Densely Connected Convolutional
Networks](https://arxiv.org/abs/1608.06993)
Create a default model
``` {.sourceCode .}
DenseNet.densenet121()
DenseNet.densenet161()
DenseNet.densenet169()
DenseNet.densenet201()
```
Examples:
``` {.sourceCode .}
# change activation
DenseNet.densenet121(activation = nn.SELU)
# change number of classes (default is 1000)
DenseNet.densenet121(n_classes=100)
# pass a different block
DenseNet.densenet121(block=...)
# change the initial convolution
model = DenseNet.densenet121()
model.encoder.gate.conv1 = nn.Conv2d(3, 64, kernel_size=3)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = DenseNet.densenet121()
# first call .features; this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14]), torch.Size([1, 512, 7, 7]), torch.Size([1, 1024, 7, 7])]
```
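
A short fine-tuning sketch building on the `n_classes` argument shown above, assuming `densenet161` accepts it as well and that the import path is `from glasses.models import DenseNet`:

``` {.sourceCode .}
import torch
import torch.nn as nn
from glasses.models import DenseNet  # assumed import path

# replace the 1000-way head with a 10-way one and take a single optimization step on dummy data
model = DenseNet.densenet161(n_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

images = torch.randn(4, 3, 224, 224)   # dummy batch of 224x224 RGB images
labels = torch.randint(0, 10, (4,))    # dummy targets
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```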
|
{}
| null |
glasses/densenet161
|
[
"transformers",
"pytorch",
"arxiv:1608.06993",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1608.06993"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us
|
# densenet161
Implementation of DenseNet proposed in Densely Connected Convolutional
Networks
Create a default model
Examples:
|
[
"# densenet161\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us \n",
"# densenet161\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
30,
31
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us \n# densenet161\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
-0.05057618021965027,
-0.11846312880516052,
-0.004925112705677748,
0.002082043094560504,
-0.01579143851995468,
-0.0007243842119351029,
0.04471753537654877,
0.015644708648324013,
-0.05735547095537186,
0.0002974487142637372,
0.2252035290002823,
0.07805570214986801,
-0.06303419917821884,
0.15211114287376404,
-0.052574507892131805,
-0.18264073133468628,
0.04891694709658623,
0.1209125816822052,
-0.07366309314966202,
0.03274248540401459,
0.10870599001646042,
-0.04991098865866661,
0.0828256756067276,
0.0196157805621624,
-0.15472614765167236,
-0.007544635329395533,
-0.04010042920708656,
-0.0014409578870981932,
0.08647903800010681,
0.06328343600034714,
0.20979733765125275,
0.006009800359606743,
-0.008094890043139458,
-0.2145550400018692,
0.021462544798851013,
0.021637635305523872,
-0.05546581372618675,
0.1322299838066101,
0.0819864496588707,
0.061935264617204666,
-0.0597061887383461,
-0.010368661023676395,
-0.05054517462849617,
-0.012813822366297245,
-0.01690366119146347,
-0.07133892178535461,
0.00019564077956601977,
0.10741458833217621,
-0.033790428191423416,
0.023831987753510475,
0.03287529572844505,
0.1561007797718048,
-0.07605324685573578,
0.04579860717058182,
-0.0331634059548378,
-0.2813641130924225,
-0.02460404857993126,
0.30239495635032654,
-0.06966517120599747,
-0.03151361271739006,
0.026880299672484398,
-0.013980935327708721,
0.08104220777750015,
0.014925417490303516,
0.11834494769573212,
-0.021585650742053986,
-0.05922843888401985,
0.019801253452897072,
-0.15385285019874573,
0.07424264401197433,
0.011139542795717716,
0.02488298900425434,
0.11880407482385635,
-0.02133253403007984,
-0.21231858432292938,
0.0016807102365419269,
0.006706350017338991,
-0.09848320484161377,
-0.03219689428806305,
0.06270821392536163,
0.04560312256217003,
-0.017531743273139,
-0.0689253956079483,
0.06292930990457535,
-0.15881747007369995,
0.07738598436117172,
0.07446824014186859,
0.0424477756023407,
-0.11402473598718643,
0.089473195374012,
-0.15009191632270813,
-0.10522298514842987,
0.13148894906044006,
-0.1371433585882187,
-0.04158533737063408,
-0.004916079342365265,
-0.004571869969367981,
0.017182350158691406,
0.10239121317863464,
0.0535840168595314,
0.10949695855379105,
-0.047713398933410645,
0.10040898621082306,
0.10890229791402817,
-0.03662629798054695,
-0.08891015499830246,
-0.19579331576824188,
-0.03467117249965668,
0.018311815336346626,
-0.04534470662474632,
0.0975525826215744,
-0.0013855698052793741,
-0.07939103245735168,
-0.037979479879140854,
-0.007445794064551592,
0.019881799817085266,
-0.018275322392582893,
0.09424248337745667,
-0.060672082006931305,
-0.02559969387948513,
0.0392778106033802,
0.016310883685946465,
0.013449993915855885,
-0.057077571749687195,
-0.03199350833892822,
0.12482355535030365,
0.04629006236791611,
0.06674284487962723,
-0.015472901985049248,
0.12818370759487152,
-0.10409170389175415,
-0.04223557934165001,
-0.044619638472795486,
-0.07574489712715149,
0.05471707507967949,
-0.01604650355875492,
0.01062074489891529,
-0.18114601075649261,
0.004806450568139553,
0.039746806025505066,
0.09617946296930313,
-0.03201596438884735,
-0.006368755828589201,
-0.06619994342327118,
-0.13357847929000854,
-0.030256764963269234,
-0.04237766191363335,
0.08593513816595078,
-0.05198585242033005,
-0.03968009352684021,
-0.010812337510287762,
-0.004506805445998907,
-0.143694669008255,
-0.0022623336408287287,
-0.003447603201493621,
0.0020882997196167707,
-0.02243952639400959,
0.028012018650770187,
-0.13109691441059113,
0.02889140322804451,
-0.11065668612718582,
-0.05384218692779541,
-0.05075300112366676,
0.005319651216268539,
0.10421330481767654,
0.07529529184103012,
-0.13457247614860535,
-0.019238999113440514,
0.04714873805642128,
-0.08595212548971176,
-0.16540071368217468,
0.14770381152629852,
-0.03933043032884598,
0.021873638033866882,
0.021653972566127777,
0.09521181881427765,
0.16168160736560822,
-0.1464354395866394,
0.03408103808760643,
0.0757966935634613,
-0.09272097051143646,
-0.13221867382526398,
-0.02438553236424923,
0.10607333481311798,
-0.135728120803833,
0.002350150840356946,
0.00992114283144474,
0.155152827501297,
-0.08788810670375824,
-0.047146882861852646,
-0.025536559522151947,
-0.08187371492385864,
0.09332454949617386,
0.012021200731396675,
0.062389638274908066,
0.03905903175473213,
-0.04809016361832619,
-0.0806138888001442,
0.043035391718149185,
-0.019932635128498077,
0.021063148975372314,
-0.025206120684742928,
0.05664334446191788,
-0.14302068948745728,
-0.015199831686913967,
-0.08342909812927246,
-0.19220979511737823,
0.03444404900074005,
0.13947804272174835,
-0.03615842014551163,
0.23627842962741852,
0.05695531144738197,
-0.03909868001937866,
-0.08878292143344879,
-0.015533885918557644,
0.19404096901416779,
0.041112013161182404,
-0.14787672460079193,
-0.10809866338968277,
-0.06238673999905586,
-0.08112609386444092,
-0.034986063838005066,
-0.13975511491298676,
-0.010390623472630978,
-0.04127257317304611,
0.19033518433570862,
-0.031176745891571045,
0.07817406952381134,
0.021171510219573975,
0.02571256086230278,
-0.03459874540567398,
-0.012246239930391312,
0.05733611807227135,
-0.04586309567093849,
-0.161333367228508,
0.08944342285394669,
-0.025363026186823845,
0.26431795954704285,
0.05947713181376457,
-0.11515643447637558,
0.01028931699693203,
-0.09312659502029419,
-0.05933689326047897,
-0.021727673709392548,
0.038264818489551544,
0.015805304050445557,
0.18282221257686615,
-0.007745148614048958,
0.07613483816385269,
-0.03925199434161186,
-0.031024396419525146,
-0.0043122549541294575,
-0.016850179061293602,
0.00700426148250699,
0.05364337936043739,
0.06039350852370262,
-0.16285577416419983,
0.11251884698867798,
0.007056633476167917,
-0.055190641433000565,
-0.05553503707051277,
0.013178146444261074,
0.032650724053382874,
-0.012615125626325607,
0.1128816232085228,
-0.006738101597875357,
0.03812478482723236,
-0.03253042325377464,
0.025367703288793564,
0.06368440389633179,
-0.043256860226392746,
0.08487191051244736,
-0.08362088352441788,
0.011219851672649384,
-0.0054185353219509125,
0.00940198265016079,
0.08517871052026749,
0.09457836300134659,
-0.028437085449695587,
0.044176697731018066,
-0.045277878642082214,
-0.22598695755004883,
0.058721769601106644,
-0.026077639311552048,
-0.033505260944366455,
0.14389322698116302,
-0.1251579076051712,
-0.1055726706981659,
-0.04669537767767906,
-0.10792511701583862,
-0.03383457660675049,
-0.027862049639225006,
-0.02214958146214485,
-0.0736827701330185,
-0.06661742180585861,
0.014722839929163456,
0.05722982436418533,
0.06419093906879425,
0.07777073979377747,
-0.025830822065472603,
-0.01074570044875145,
-0.014447164721786976,
-0.10711421817541122,
0.004956697579473257,
-0.12839019298553467,
0.050415296107530594,
0.01693596877157688,
-0.11440754681825638,
0.02712143585085869,
0.15201154351234436,
0.04319028928875923,
-0.020292654633522034,
0.0004124585830140859,
0.09907977283000946,
0.03441077098250389,
0.10084128379821777,
0.05970558896660805,
-0.04392878711223602,
0.08384080976247787,
0.21636708080768585,
0.08047603070735931,
-0.013496831059455872,
0.01147717610001564,
-0.021868914365768433,
-0.027960583567619324,
-0.19690941274166107,
-0.2033262699842453,
-0.0864848792552948,
-0.03419077768921852,
0.053355343639850616,
0.02493000030517578,
0.03240499272942543,
0.14488738775253296,
0.0026961378753185272,
-0.0133353965356946,
0.024559279903769493,
0.05596039816737175,
0.18052995204925537,
-0.017063893377780914,
0.07211804389953613,
-0.04212596267461777,
-0.13311199843883514,
0.04535578191280365,
0.07758667320013046,
0.14388114213943481,
-0.060577817261219025,
-0.02605801820755005,
0.07590591907501221,
0.25813692808151245,
0.09009714424610138,
0.11258824914693832,
-0.0539020337164402,
-0.026070337742567062,
-0.02078610472381115,
-0.05184267461299896,
0.0806560292840004,
0.09746872633695602,
0.03961766138672829,
-0.0887848287820816,
0.05756058171391487,
0.0661102831363678,
0.04354170709848404,
-0.10603800415992737,
0.12346135824918747,
-0.30480289459228516,
0.056820232421159744,
-0.030865071341395378,
-0.025283319875597954,
-0.030656058341264725,
0.051674094051122665,
0.2393878698348999,
0.022017264738678932,
0.1219884604215622,
-0.06542614847421646,
0.059016596525907516,
-0.016223257407546043,
0.022936195135116577,
0.024233011528849602,
0.12922173738479614,
0.04208134859800339,
0.04850667342543602,
-0.17163819074630737,
0.2887764871120453,
-0.006057982798665762,
-0.05196801945567131,
-0.061203908175230026,
0.017528606578707695,
0.02002551220357418,
0.01722261868417263,
0.0684068351984024,
0.006199648603796959,
0.21501384675502777,
0.1265418380498886,
-0.10542137175798416,
0.017824508249759674,
0.01833091862499714,
0.081072598695755,
-0.07172930985689163,
0.07511751353740692,
-0.08039020001888275,
0.08860541135072708,
0.020674671977758408,
-0.07124240696430206,
-0.09946998208761215,
0.03513047471642494,
0.26304924488067627,
-0.11116889864206314,
-0.045372672379016876,
-0.08301423490047455,
-0.02625642530620098,
0.23272322118282318,
0.029069185256958008,
-0.0802290290594101,
-0.05205041542649269,
0.033202558755874634,
0.23263660073280334,
-0.055534545332193375,
0.06307640671730042,
-0.058700527995824814,
0.055763598531484604,
-0.09372636675834656,
-0.1511671096086502,
0.1319442242383957,
-0.10044750571250916,
0.013883310370147228,
-0.0882648229598999,
0.1695970743894577,
0.0071052671410143375,
-0.020709000527858734,
-0.002975588198751211,
-0.0278631579130888,
-0.0901644229888916,
-0.10230876505374908,
-0.11923836916685104,
0.0781332328915596,
-0.04917395859956741,
0.030938655138015747,
-0.024405280128121376,
-0.028019487857818604,
0.022183338180184364,
0.09651339054107666,
0.19860608875751495,
0.05340343713760376,
-0.0955481231212616,
-0.002124606166034937,
0.1309170126914978,
-0.015067091211676598,
-0.20073087513446808,
-0.08879394829273224,
0.07289756089448929,
-0.0417269803583622,
-0.12480973452329636,
-0.05483943223953247,
0.18673066794872284,
0.0517304427921772,
0.007503360044211149,
-0.016861705109477043,
-0.13767679035663605,
-0.09387270361185074,
0.11140661686658859,
-0.010350140742957592,
0.3022499084472656,
0.04375335946679115,
-0.04291042685508728,
0.003110026940703392,
-0.052954502403736115,
0.06844959408044815,
0.042983781546354294,
0.03211868181824684,
0.013336826115846634,
0.03769320994615555,
0.010875320993363857,
-0.052522238343954086,
0.07109498977661133,
0.16794876754283905,
0.017294391989707947,
-0.025688428431749344,
-0.12034278362989426,
-0.04682664945721626,
-0.028993012383580208,
0.05928696691989899,
0.06279631704092026,
0.012999087572097778,
-0.0835757702589035,
-0.008728651329874992,
-0.07629323750734329,
0.04469624534249306,
0.027751479297876358,
-0.05372823774814606,
-0.12943194806575775,
0.004266391042619944,
0.14491921663284302,
0.012107537128031254,
0.23568499088287354,
0.10441917181015015,
0.14621372520923615,
0.11445140093564987,
0.07851474732160568,
-0.17453794181346893,
-0.02945873513817787,
0.00911610946059227,
-0.04052137956023216,
0.062356919050216675,
-0.10391955077648163,
-0.027943653985857964,
0.20474183559417725,
-0.05637393519282341,
-0.09491123259067535,
0.09298533201217651,
-0.04218930006027222,
-0.10203433781862259,
0.07669475674629211,
-0.15249373018741608,
-0.03983067721128464,
-0.011270096525549889,
0.05267797410488129,
0.032660309225320816,
0.16291221976280212,
0.13792110979557037,
-0.09051307290792465,
0.00038678129203617573,
0.06791975349187851,
0.023167118430137634,
-0.134429931640625,
0.09055477380752563,
0.09739648550748825,
0.07372485846281052,
-0.09558668732643127,
0.13359738886356354,
-0.009610255248844624,
-0.03676033020019531,
-0.09757733345031738,
0.021945895627141,
-0.13424398005008698,
-0.05962112545967102,
0.06572446972131729,
0.08126702159643173,
-0.07220949977636337,
-0.05799398571252823,
0.04336226359009743,
-0.0902998074889183,
0.023818498477339745,
0.2547541856765747,
0.10411261767148972,
0.09438983350992203,
-0.05371304973959923,
-0.06742407381534576,
-0.06186243146657944,
0.0017165298340842128,
0.09029904007911682,
0.08294327557086945,
-0.12144652754068375,
-0.04303198307752609,
-0.021985575556755066,
0.061294980347156525,
-0.08941181749105453,
0.024427028372883797,
-0.0805203914642334,
0.04802830517292023,
-0.07521290332078934,
-0.02284657023847103,
-0.11443375051021576,
-0.033334389328956604,
-0.004719281103461981,
-0.09865948557853699,
-0.01203291304409504,
0.03491951525211334,
-0.06384336948394775,
0.0853034257888794,
0.051510877907276154,
-0.011575824581086636,
-0.048028118908405304,
-0.07472679764032364,
-0.054556429386138916,
-0.012935423292219639,
0.125412255525589,
-0.016736162826418877,
-0.1114797294139862,
-0.004863066133111715,
-0.22356918454170227,
-0.14899267256259918,
0.13891765475273132,
0.08045520633459091,
0.09964010119438171,
-0.08943331986665726,
0.048596810549497604,
0.027934832498431206,
0.026938045397400856,
-0.017478760331869125,
0.1527434140443802,
-0.01538399700075388,
0.010837895795702934,
-0.1016564816236496,
-0.01342164445668459,
-0.05689119175076485,
-0.05569011718034744,
0.004499434493482113,
0.10482002794742584,
0.06512602418661118,
-0.03070855140686035,
0.04341750591993332,
0.020966552197933197,
0.05141911655664444,
-0.028271934017539024,
-0.14741277694702148,
0.10109322518110275,
-0.03360062837600708,
0.009814650751650333,
-0.040059905499219894,
0.20882666110992432,
-0.12740527093410492,
-0.12099853903055191,
-0.02919022925198078,
-0.07032357901334763,
-0.058662593364715576,
0.0329151377081871,
0.10947012901306152,
0.027621669694781303,
-0.005711515434086323,
-0.05234634876251221,
0.056527379900217056,
0.08616899698972702,
0.026995977386832237,
0.05147312954068184,
-0.015494810417294502,
0.09374760091304779,
0.10373345017433167,
0.0613195039331913,
-0.08480573445558548,
0.02031085453927517,
-0.11553042382001877,
-0.08283640444278717,
0.02221124991774559,
-0.024033568799495697,
0.0590229406952858,
-0.02345210872590542,
0.02382708340883255,
-0.04021090269088745,
0.053614307194948196,
-0.06510871648788452,
-0.06274697184562683,
-0.15003982186317444,
-0.005432097241282463,
-0.13911201059818268,
-0.02301924303174019,
-0.08269312232732773,
-0.18447478115558624,
0.10041748732328415,
0.09794821590185165,
0.019888348877429962,
0.12048261612653732,
-0.02431098185479641,
-0.020913757383823395,
0.06664684414863586,
0.054853107780218124,
0.04088890925049782,
-0.002732908120378852,
-0.06918551027774811,
-0.15916544198989868,
0.08155439049005508,
-0.10041765868663788,
0.04364747554063797,
0.0824035257101059,
0.08303459733724594,
0.04022052884101868,
-0.03729299083352089,
-0.06978017091751099,
0.00630961125716567,
0.021942343562841415,
0.12445393949747086,
-0.003142214845865965,
-0.057510312646627426,
0.006866013165563345,
0.04420076683163643,
-0.06507112085819244,
-0.1286843717098236,
-0.08970846235752106,
0.17714014649391174,
-0.06555572897195816,
0.08973390609025955,
-0.036957692354917526,
0.056108489632606506,
-0.1265283077955246,
0.24812904000282288,
0.3426513373851776,
-0.08138038218021393,
0.009257208555936813,
0.07792814075946808,
0.013500389643013477,
0.01268591359257698,
0.0598713681101799,
0.07969939708709717,
0.2839573323726654,
-0.08968456089496613,
-0.12098294496536255,
-0.01650654710829258,
0.015537621453404427,
-0.04671049490571022,
-0.10630365461111069,
-0.0030388657469302416,
-0.0959940031170845,
-0.0651170164346695,
0.09859216958284378,
-0.23084120452404022,
0.010988056659698486,
0.018370693549513817,
-0.14912982285022736,
-0.06576903909444809,
-0.08386145532131195,
0.11997393518686295,
-0.0308395866304636,
0.09688711166381836,
-0.06734858453273773,
-0.006016111467033625,
0.0578765831887722,
0.013958713971078396,
-0.10973385721445084,
-0.0230063796043396,
-0.017639480531215668,
0.08289236575365067,
0.03515308350324631,
0.013295409269630909,
0.13698048889636993,
0.14016863703727722,
-0.04371058940887451,
-0.09000620990991592,
-0.00795576348900795,
0.0712314322590828,
-0.12107443809509277,
-0.09176245331764221,
-0.164673313498497,
0.003001587698236108,
0.11637479811906815,
0.027867581695318222,
-0.15618550777435303,
0.0052528977394104,
0.11549731343984604,
0.018803250044584274,
-0.06391898542642593,
-0.03690292313694954,
-0.047153621912002563,
0.07141900807619095,
0.01286186184734106,
-0.042861953377723694,
-0.03370532765984535,
-0.024805555120110512,
0.030265428125858307,
-0.007362759672105312,
0.01793620176613331,
0.012244198471307755,
-0.1322701871395111,
-0.010874060913920403,
-0.019594844430685043,
0.027994057163596153,
0.07760568708181381,
-0.01833030767738819,
-0.12925060093402863,
0.02143157832324505,
-0.07132463157176971,
0.0506715252995491,
0.09427622705698013,
-0.006608251016587019,
0.00974259153008461,
-0.05032770335674286,
-0.029190145432949066,
-0.030761433765292168,
-0.13552214205265045,
-0.1140001118183136
] |
null | null |
transformers
|
# densenet169
Implementation of DenseNet proposed in [Densely Connected Convolutional
Networks](https://arxiv.org/abs/1608.06993)
Create a default model
``` {.sourceCode .}
DenseNet.densenet121()
DenseNet.densenet161()
DenseNet.densenet169()
DenseNet.densenet201()
```
Examples:
``` {.sourceCode .}
# change activation
DenseNet.densenet121(activation = nn.SELU)
# change number of classes (default is 1000)
DenseNet.densenet121(n_classes=100)
# pass a different block
DenseNet.densenet121(block=...)
# change the initial convolution
model = DenseNet.densenet121()
model.encoder.gate.conv1 = nn.Conv2d(3, 64, kernel_size=3)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = DenseNet.densenet121()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14]), torch.Size([1, 512, 7, 7]), torch.Size([1, 1024, 7, 7])]
```
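As a quick sanity check, the snippet below sketches a full forward pass from a random tensor to class probabilities. It is a minimal example only: the `from glasses.models import DenseNet` import path is an assumption (adjust it to wherever `DenseNet` lives in your install), while the 1000-class default head follows from the options listed above.
``` {.sourceCode .}
import torch

# assumed import path for the glasses library; adjust if needed
from glasses.models import DenseNet

model = DenseNet.densenet169().eval()
x = torch.randn(1, 3, 224, 224)        # dummy ImageNet-sized input
with torch.no_grad():
    logits = model(x)                  # (1, 1000) with the default head
    probs = torch.softmax(logits, dim=-1)
print(probs.argmax(dim=-1))
```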
|
{}
| null |
glasses/densenet169
|
[
"transformers",
"pytorch",
"arxiv:1608.06993",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1608.06993"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us
|
# densenet169
Implementation of DenseNet proposed in Densely Connected Convolutional
Networks
Create a default models
Examples:
|
[
"# densenet169\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us \n",
"# densenet169\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
30,
31
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us \n# densenet169\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
-0.051083873957395554,
-0.1171654611825943,
-0.004883099347352982,
0.0028192538302391768,
-0.016958655789494514,
-0.0019582463428378105,
0.043211430311203,
0.017064420506358147,
-0.0570816695690155,
0.001964234048500657,
0.22589632868766785,
0.07909025996923447,
-0.062122881412506104,
0.15192224085330963,
-0.05313350632786751,
-0.18126103281974792,
0.04823744297027588,
0.12158722430467606,
-0.0719999372959137,
0.03239964321255684,
0.10714936256408691,
-0.05046714097261429,
0.08277498185634613,
0.018256589770317078,
-0.15370027720928192,
-0.005430193152278662,
-0.0390695221722126,
-0.0020168025512248278,
0.08655572682619095,
0.06405532360076904,
0.20974361896514893,
0.0050114830955863,
-0.008101153187453747,
-0.21192464232444763,
0.021191179752349854,
0.022240832448005676,
-0.056247077882289886,
0.13187256455421448,
0.08139359205961227,
0.06471437960863113,
-0.05579826980829239,
-0.01182129979133606,
-0.05047333240509033,
-0.014323055744171143,
-0.016664115712046623,
-0.06972210109233856,
-0.0016834426205605268,
0.10818220674991608,
-0.031172223389148712,
0.024601882323622704,
0.032750312238931656,
0.15797695517539978,
-0.07646986842155457,
0.045518938452005386,
-0.03540181741118431,
-0.27882468700408936,
-0.025784894824028015,
0.30224066972732544,
-0.07094012200832367,
-0.028748080134391785,
0.028094302862882614,
-0.012672233395278454,
0.08065236359834671,
0.014850519597530365,
0.11560375988483429,
-0.021340887993574142,
-0.06366118043661118,
0.02027815394103527,
-0.15414923429489136,
0.0742446556687355,
0.011784352362155914,
0.027463527396321297,
0.11950661987066269,
-0.021016115322709084,
-0.21193115413188934,
0.003798782592639327,
0.005903507117182016,
-0.1000877171754837,
-0.03174333646893501,
0.061968378722667694,
0.04641234874725342,
-0.017124628648161888,
-0.06876121461391449,
0.0629282221198082,
-0.15838520228862762,
0.07681407779455185,
0.07286309450864792,
0.04313499853014946,
-0.11338751018047333,
0.08789418637752533,
-0.1526629626750946,
-0.10494104772806168,
0.13222487270832062,
-0.13690614700317383,
-0.042368125170469284,
-0.004085187800228596,
-0.0042150914669036865,
0.015856530517339706,
0.10101325809955597,
0.051257845014333725,
0.11145324259996414,
-0.04584477096796036,
0.0992559865117073,
0.10851571708917618,
-0.03818288445472717,
-0.08754211664199829,
-0.19471202790737152,
-0.035250432789325714,
0.018594132736325264,
-0.04398462921380997,
0.09874764829874039,
-0.003236549673601985,
-0.07931014895439148,
-0.03756043687462807,
-0.01006498746573925,
0.02097967639565468,
-0.017903029918670654,
0.0934172049164772,
-0.05995725467801094,
-0.024772698059678078,
0.04170078784227371,
0.01699252985417843,
0.013975092209875584,
-0.05793115869164467,
-0.0315621979534626,
0.12584151327610016,
0.04629470407962799,
0.06701915711164474,
-0.014334606006741524,
0.12732499837875366,
-0.10326491296291351,
-0.04460262879729271,
-0.045856501907110214,
-0.07632327079772949,
0.056202858686447144,
-0.015202301554381847,
0.011341637000441551,
-0.18105769157409668,
0.005771066527813673,
0.03958272561430931,
0.09468892216682434,
-0.03166695684194565,
-0.006671321578323841,
-0.06512656807899475,
-0.13345444202423096,
-0.029446152970194817,
-0.04249078035354614,
0.08727458119392395,
-0.05158703029155731,
-0.0406000055372715,
-0.010668213479220867,
-0.004088547546416521,
-0.14357179403305054,
-0.002259916625916958,
-0.0033508590422570705,
0.0019455953733995557,
-0.029208403080701828,
0.027091966941952705,
-0.13213308155536652,
0.03034650906920433,
-0.10887432098388672,
-0.05309982970356941,
-0.051914509385824203,
0.005168765317648649,
0.10514485090970993,
0.07582465559244156,
-0.13679170608520508,
-0.018333863466978073,
0.04386242479085922,
-0.08512050658464432,
-0.16595539450645447,
0.14790946245193481,
-0.03954668715596199,
0.020152047276496887,
0.02276989445090294,
0.0969659760594368,
0.16065439581871033,
-0.14484229683876038,
0.03287477791309357,
0.07769954949617386,
-0.09334312379360199,
-0.1341760754585266,
-0.026037707924842834,
0.10515531152486801,
-0.13924738764762878,
0.0024063815362751484,
0.006249226629734039,
0.1561523675918579,
-0.08717085421085358,
-0.04629548266530037,
-0.02387925051152706,
-0.08108633756637573,
0.09398660063743591,
0.012559154070913792,
0.06135118752717972,
0.03620486333966255,
-0.04753103479743004,
-0.08323664963245392,
0.04180451110005379,
-0.020349062979221344,
0.02151692844927311,
-0.024459725245833397,
0.05696234107017517,
-0.14162969589233398,
-0.01567472703754902,
-0.08112826943397522,
-0.19392099976539612,
0.03530256450176239,
0.1395336538553238,
-0.034909963607788086,
0.2322665899991989,
0.05769844725728035,
-0.039846066385507584,
-0.08875589817762375,
-0.016838276758790016,
0.1923963576555252,
0.04110480844974518,
-0.14969530701637268,
-0.10925859957933426,
-0.06207733228802681,
-0.08049999177455902,
-0.03930716961622238,
-0.14054171741008759,
-0.011224298737943172,
-0.041375089436769485,
0.19025468826293945,
-0.031556032598018646,
0.07847493886947632,
0.020860841497778893,
0.025552691891789436,
-0.032917559146881104,
-0.012164095416665077,
0.057758525013923645,
-0.04707251861691475,
-0.1623128205537796,
0.0885726809501648,
-0.024862676858901978,
0.26270315051078796,
0.06089303642511368,
-0.11411731690168381,
0.01093846932053566,
-0.09010046720504761,
-0.05789022892713547,
-0.022403068840503693,
0.03809940069913864,
0.01658632420003414,
0.1838892102241516,
-0.008285733871161938,
0.07440895587205887,
-0.04094387963414192,
-0.032674163579940796,
-0.004167700652033091,
-0.014926843345165253,
0.007009767461568117,
0.05456792190670967,
0.062208499759435654,
-0.16366714239120483,
0.11192784458398819,
0.011660462245345116,
-0.05423348397016525,
-0.05713028088212013,
0.014315544627606869,
0.03190518543124199,
-0.01212052721530199,
0.11220168322324753,
-0.007294013164937496,
0.037342168390750885,
-0.033843185752630234,
0.025941811501979828,
0.06253505498170853,
-0.04401512071490288,
0.08475711941719055,
-0.08385786414146423,
0.011913823895156384,
-0.004972721915692091,
0.010293773375451565,
0.08671208471059799,
0.09443381428718567,
-0.029134489595890045,
0.04411882162094116,
-0.043555617332458496,
-0.22781431674957275,
0.060309384018182755,
-0.026471225544810295,
-0.03337513655424118,
0.14310577511787415,
-0.12488899379968643,
-0.10838104039430618,
-0.04626697301864624,
-0.10496767610311508,
-0.0347675085067749,
-0.02829892560839653,
-0.022999556735157967,
-0.07455704361200333,
-0.06764771789312363,
0.014552383683621883,
0.055023062974214554,
0.061422914266586304,
0.07848239690065384,
-0.02594011463224888,
-0.011146151460707188,
-0.015401559881865978,
-0.10504739731550217,
0.005451960023492575,
-0.12866716086864471,
0.053231436759233475,
0.014493615366518497,
-0.11238992214202881,
0.028623241931200027,
0.1511281430721283,
0.042237766087055206,
-0.02061823569238186,
0.0008524511940777302,
0.0985495075583458,
0.033300865441560745,
0.10134612023830414,
0.05901964008808136,
-0.04361957684159279,
0.08429518342018127,
0.21574978530406952,
0.08062393963336945,
-0.01420009508728981,
0.012005751952528954,
-0.020960267633199692,
-0.028073739260435104,
-0.1975681185722351,
-0.20292779803276062,
-0.08683682978153229,
-0.03439704328775406,
0.05172932893037796,
0.024289418011903763,
0.03109588660299778,
0.14511780440807343,
0.00330866570584476,
-0.011697972193360329,
0.021521087735891342,
0.0552830770611763,
0.1808544099330902,
-0.018835769966244698,
0.07143596559762955,
-0.04291698336601257,
-0.13400326669216156,
0.04656919464468956,
0.0768166109919548,
0.14040477573871613,
-0.05924064293503761,
-0.028839973732829094,
0.07402627915143967,
0.2619035243988037,
0.09004925936460495,
0.11267538368701935,
-0.054076846688985825,
-0.027414213865995407,
-0.02101396769285202,
-0.05146486684679985,
0.07970584183931351,
0.09914197772741318,
0.04115980863571167,
-0.0913587436079979,
0.05719808116555214,
0.06375366449356079,
0.04308202490210533,
-0.10741057246923447,
0.12068607658147812,
-0.30458298325538635,
0.057322777807712555,
-0.030032500624656677,
-0.0257281344383955,
-0.03189348056912422,
0.05277042090892792,
0.2389964908361435,
0.02302209660410881,
0.12299856543540955,
-0.06381367146968842,
0.05912134051322937,
-0.016385624185204506,
0.02179292030632496,
0.021102245897054672,
0.12839443981647491,
0.042467210441827774,
0.04943641275167465,
-0.16774322092533112,
0.2884182631969452,
-0.006310444790869951,
-0.05272405967116356,
-0.06100350245833397,
0.018306046724319458,
0.0202487763017416,
0.01564490981400013,
0.069648876786232,
0.007225946057587862,
0.20590896904468536,
0.1254589855670929,
-0.10347732156515121,
0.01800706796348095,
0.018001161515712738,
0.08239682763814926,
-0.0726851224899292,
0.07719587534666061,
-0.08101651072502136,
0.08865177631378174,
0.019194023683667183,
-0.07244770973920822,
-0.09870414435863495,
0.034588105976581573,
0.2634010910987854,
-0.1116192638874054,
-0.04452861100435257,
-0.08328772336244583,
-0.027737775817513466,
0.23500502109527588,
0.030813371762633324,
-0.07926643639802933,
-0.05170309543609619,
0.030782578513026237,
0.23367151618003845,
-0.05543986335396767,
0.062010280787944794,
-0.05722929537296295,
0.055892135947942734,
-0.09348811209201813,
-0.15246741473674774,
0.13398608565330505,
-0.10081785917282104,
0.015194752253592014,
-0.08724090456962585,
0.17068739235401154,
0.008256270550191402,
-0.020256834104657173,
-0.004480038303881884,
-0.02874397486448288,
-0.08881030231714249,
-0.10309843719005585,
-0.11657635122537613,
0.07825465500354767,
-0.04849488288164139,
0.03181935101747513,
-0.020279088988900185,
-0.025616955012083054,
0.022382210940122604,
0.09642338752746582,
0.19716159999370575,
0.05153430998325348,
-0.09581376612186432,
-0.00284791411831975,
0.13257063925266266,
-0.014999094419181347,
-0.20011286437511444,
-0.08861418068408966,
0.07124360650777817,
-0.040916942059993744,
-0.1257234364748001,
-0.053219158202409744,
0.18671004474163055,
0.052768077701330185,
0.007331536151468754,
-0.01967284083366394,
-0.13834141194820404,
-0.09330587089061737,
0.11304090172052383,
-0.010212347842752934,
0.3001573979854584,
0.04595998302102089,
-0.044117238372564316,
0.0033759637735784054,
-0.04646514728665352,
0.06952967494726181,
0.04012801870703697,
0.030653003603219986,
0.01272581610828638,
0.03781869262456894,
0.010989737696945667,
-0.05293240398168564,
0.07102111726999283,
0.1673894226551056,
0.018296338617801666,
-0.025219375267624855,
-0.11971054971218109,
-0.04688074812293053,
-0.029435589909553528,
0.05964646860957146,
0.06459305435419083,
0.014340398833155632,
-0.0847979485988617,
-0.008424809202551842,
-0.07733664661645889,
0.04479021579027176,
0.02771284244954586,
-0.05365254357457161,
-0.13056680560112,
0.002920867409557104,
0.143461212515831,
0.012773752212524414,
0.23454424738883972,
0.10574419796466827,
0.14746695756912231,
0.11491106450557709,
0.07804152369499207,
-0.1750195175409317,
-0.029101505875587463,
0.008996007032692432,
-0.04071745276451111,
0.06310908496379852,
-0.103036068379879,
-0.02919757179915905,
0.2059553861618042,
-0.05677179619669914,
-0.09674062579870224,
0.09242352098226547,
-0.04327617213129997,
-0.10331696271896362,
0.07748310267925262,
-0.1541985422372818,
-0.03799953684210777,
-0.012858748435974121,
0.05045796185731888,
0.03226304426789284,
0.1622498780488968,
0.1378069370985031,
-0.09052351862192154,
0.0012216151226311922,
0.06649572402238846,
0.022407347336411476,
-0.13488459587097168,
0.09022222459316254,
0.09810461103916168,
0.07269849628210068,
-0.09681755304336548,
0.13284669816493988,
-0.010027964599430561,
-0.041938360780477524,
-0.09769029170274734,
0.026350252330303192,
-0.13401490449905396,
-0.05847091227769852,
0.0662921592593193,
0.0830128937959671,
-0.07495355606079102,
-0.05961211398243904,
0.04394865408539772,
-0.0901079773902893,
0.02447916939854622,
0.25560396909713745,
0.10311629623174667,
0.09307026118040085,
-0.05386892333626747,
-0.06794144213199615,
-0.06087028980255127,
0.0005908205057494342,
0.08869802206754684,
0.08384563028812408,
-0.12273714691400528,
-0.04472390562295914,
-0.020970571786165237,
0.061368975788354874,
-0.08919525891542435,
0.026490185409784317,
-0.08006613701581955,
0.04852837696671486,
-0.07290063053369522,
-0.023617302998900414,
-0.11316157877445221,
-0.03307684510946274,
-0.004788889549672604,
-0.09785827994346619,
-0.012572893872857094,
0.034043729305267334,
-0.06403220444917679,
0.08574245870113373,
0.051413342356681824,
-0.011300170794129372,
-0.047760214656591415,
-0.07609089463949203,
-0.05388578772544861,
-0.011551864445209503,
0.12542475759983063,
-0.015745194628834724,
-0.10941854119300842,
-0.004316227976232767,
-0.22300685942173004,
-0.14916646480560303,
0.13923349976539612,
0.08203796297311783,
0.10008644312620163,
-0.08799910545349121,
0.04829929769039154,
0.028651786968111992,
0.02650747448205948,
-0.01738491840660572,
0.1507861316204071,
-0.017155097797513008,
0.00836626160889864,
-0.10065793991088867,
-0.01455788966268301,
-0.057402729988098145,
-0.05604490637779236,
0.005235620308667421,
0.10506424307823181,
0.062210120260715485,
-0.03181673586368561,
0.04381705820560455,
0.020674366503953934,
0.05136297270655632,
-0.027248632162809372,
-0.14730122685432434,
0.10115348547697067,
-0.03266279026865959,
0.011071635410189629,
-0.04124835506081581,
0.21025580167770386,
-0.1284930557012558,
-0.11801077425479889,
-0.028684355318546295,
-0.07284609973430634,
-0.05615434795618057,
0.03247782960534096,
0.10975764691829681,
0.026902129873633385,
-0.007510928902775049,
-0.05065445229411125,
0.05702681839466095,
0.08562713116407394,
0.02476849965751171,
0.051577627658843994,
-0.01851515844464302,
0.09761353582143784,
0.10419220477342606,
0.06282580643892288,
-0.08353552222251892,
0.01853867806494236,
-0.11409652233123779,
-0.0835939422249794,
0.022393029183149338,
-0.02412492036819458,
0.05758567899465561,
-0.02256820537149906,
0.02280341647565365,
-0.039483584463596344,
0.05395357310771942,
-0.06604418158531189,
-0.06164749711751938,
-0.15096943080425262,
-0.0063660419546067715,
-0.14063908159732819,
-0.02228795364499092,
-0.0832560807466507,
-0.18398894369602203,
0.09919340908527374,
0.09730255603790283,
0.018223358318209648,
0.12156189233064651,
-0.02229086495935917,
-0.020723732188344002,
0.06732163578271866,
0.0550273098051548,
0.04089624434709549,
-0.0012375483056530356,
-0.06967329233884811,
-0.1592216044664383,
0.08030441403388977,
-0.10132133215665817,
0.04377821832895279,
0.08101925998926163,
0.08408631384372711,
0.043471626937389374,
-0.03700685873627663,
-0.07057934254407883,
0.0053520044311881065,
0.024432839825749397,
0.1252973973751068,
-0.0021280760411173105,
-0.05781247839331627,
0.006918200757354498,
0.04320133477449417,
-0.06699938327074051,
-0.12430262565612793,
-0.08680014312267303,
0.177957221865654,
-0.06628087908029556,
0.08975827693939209,
-0.036442793905735016,
0.056581854820251465,
-0.12689967453479767,
0.24891328811645508,
0.34035614132881165,
-0.07998647540807724,
0.008401920087635517,
0.0755169466137886,
0.013445617631077766,
0.011015735566616058,
0.060315318405628204,
0.08051963150501251,
0.2839759886264801,
-0.09039661288261414,
-0.12204872816801071,
-0.01681922934949398,
0.015013270080089569,
-0.04712633788585663,
-0.1046154648065567,
-0.0014945176662877202,
-0.09709304571151733,
-0.0637606680393219,
0.1012498065829277,
-0.22951053082942963,
0.007103622425347567,
0.01937747560441494,
-0.1492585390806198,
-0.06548802554607391,
-0.08326906710863113,
0.11947015672922134,
-0.02986045554280281,
0.09634695947170258,
-0.06751915067434311,
-0.005898284260183573,
0.05624845251441002,
0.015206007286906242,
-0.11280696839094162,
-0.024848880246281624,
-0.01901102252304554,
0.08444453775882721,
0.03515336290001869,
0.014265300706028938,
0.1377224177122116,
0.13984215259552002,
-0.043122656643390656,
-0.08943586051464081,
-0.006536906585097313,
0.07154531031847,
-0.12144621461629868,
-0.09071791172027588,
-0.16348294913768768,
0.0035574862267822027,
0.1194772869348526,
0.02770514227449894,
-0.15730394423007965,
0.004956254735589027,
0.11585759371519089,
0.01972423493862152,
-0.06539636105298996,
-0.0359199084341526,
-0.04861189424991608,
0.07243914902210236,
0.014029423706233501,
-0.042097121477127075,
-0.03451986983418465,
-0.02387428469955921,
0.029120666906237602,
-0.007969356141984463,
0.01815187744796276,
0.013195683248341084,
-0.12936580181121826,
-0.009860847145318985,
-0.016903921961784363,
0.028001874685287476,
0.07697505503892899,
-0.016982804983854294,
-0.12870831787586212,
0.02154056914150715,
-0.07173708826303482,
0.05011717602610588,
0.09136907011270523,
-0.008398232981562614,
0.010148261673748493,
-0.05055761709809303,
-0.02930246666073799,
-0.03210942819714546,
-0.13571874797344208,
-0.1135156899690628
] |
null | null |
transformers
|
# densenet201
Implementation of DenseNet proposed in [Densely Connected Convolutional
Networks](https://arxiv.org/abs/1608.06993)
Create a default model
``` {.sourceCode .}
DenseNet.densenet121()
DenseNet.densenet161()
DenseNet.densenet169()
DenseNet.densenet201()
```
Examples:
``` {.sourceCode .}
# change activation
DenseNet.densenet121(activation = nn.SELU)
# change number of classes (default is 1000)
DenseNet.densenet121(n_classes=100)
# pass a different block
DenseNet.densenet121(block=...)
# change the initial convolution
model = DenseNet.densenet121()
model.encoder.gate.conv1 = nn.Conv2d(3, 64, kernel_size=3)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = DenseNet.densenet121()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14]), torch.Size([1, 512, 7, 7]), torch.Size([1, 1024, 7, 7])]
```
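Building on the `n_classes` option above, the sketch below shows one fine-tuning step on random placeholder data. The `from glasses.models import DenseNet` import path is assumed; everything else uses only the constructor arguments documented here plus standard PyTorch.
``` {.sourceCode .}
import torch
from torch import nn

# assumed import path for the glasses library; adjust if needed
from glasses.models import DenseNet

model = DenseNet.densenet201(n_classes=10)   # e.g. a 10-class fine-tuning target

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

x = torch.randn(8, 3, 224, 224)              # placeholder batch
y = torch.randint(0, 10, (8,))               # placeholder labels

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```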
|
{}
| null |
glasses/densenet201
|
[
"transformers",
"pytorch",
"arxiv:1608.06993",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1608.06993"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us
|
# densenet201
Implementation of DenseNet proposed in Densely Connected Convolutional
Networks
Create a default models
Examples:
|
[
"# densenet201\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us \n",
"# densenet201\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
30,
31
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1608.06993 #endpoints_compatible #region-us \n# densenet201\nImplementation of DenseNet proposed in Densely Connected Convolutional\nNetworks\n\n Create a default models\n\n \n\n Examples:"
] |
[
-0.0470874086022377,
-0.11464595049619675,
-0.0049463543109595776,
0.0009678402566350996,
-0.018274566158652306,
-0.0009933023247867823,
0.0480819009244442,
0.015345470048487186,
-0.061478570103645325,
0.0013068460393697023,
0.2265351116657257,
0.07671534270048141,
-0.06389712542295456,
0.15719923377037048,
-0.05238110572099686,
-0.1818779557943344,
0.04865610599517822,
0.12107113003730774,
-0.07069545239210129,
0.033223118633031845,
0.10858332365751266,
-0.050635598599910736,
0.08395592868328094,
0.017825555056333542,
-0.15547765791416168,
-0.005186522379517555,
-0.04098088666796684,
-0.0033910823985934258,
0.08697398751974106,
0.06245413422584534,
0.21438229084014893,
0.005046308506280184,
-0.003939866088330746,
-0.21391528844833374,
0.020648254081606865,
0.021451154723763466,
-0.05646665766835213,
0.13188646733760834,
0.07879118621349335,
0.06438907235860825,
-0.060619793832302094,
-0.013120187446475029,
-0.04950721934437752,
-0.014659001491963863,
-0.01558657269924879,
-0.06753885746002197,
-0.0003701042733155191,
0.108774833381176,
-0.03424658998847008,
0.02368675358593464,
0.03293905779719353,
0.1580003947019577,
-0.07257353514432907,
0.043769627809524536,
-0.03065650910139084,
-0.28096240758895874,
-0.025275317952036858,
0.3013519048690796,
-0.06943009048700333,
-0.025468837469816208,
0.02895475924015045,
-0.012757392600178719,
0.08329307287931442,
0.01581501215696335,
0.12011697143316269,
-0.02146574854850769,
-0.056431856006383896,
0.02119155041873455,
-0.15382032096385956,
0.07433958351612091,
0.01536079216748476,
0.02740134485065937,
0.1181396096944809,
-0.021288694813847542,
-0.2133614420890808,
-0.004899380262941122,
0.008635691367089748,
-0.1028149351477623,
-0.030520718544721603,
0.06102997437119484,
0.04388460889458656,
-0.01847180165350437,
-0.07113581895828247,
0.06317973136901855,
-0.15823881328105927,
0.08079545199871063,
0.07399573177099228,
0.04241490364074707,
-0.11540205031633377,
0.08891342580318451,
-0.14903563261032104,
-0.1038082018494606,
0.1326327621936798,
-0.138407364487648,
-0.04317191615700722,
-0.005093062296509743,
-0.002864245558157563,
0.014046527445316315,
0.10151928663253784,
0.050476543605327606,
0.11289197206497192,
-0.04571361839771271,
0.10027241706848145,
0.10826694220304489,
-0.0334601029753685,
-0.0870351642370224,
-0.1922573447227478,
-0.03791246935725212,
0.015741320326924324,
-0.044633783400058746,
0.10024122893810272,
-0.0005786540568806231,
-0.0795777216553688,
-0.03643372282385826,
-0.006468814332038164,
0.019343195483088493,
-0.01951371133327484,
0.09307543933391571,
-0.05966181308031082,
-0.024771392345428467,
0.035989824682474136,
0.015423010103404522,
0.013230158016085625,
-0.06031505763530731,
-0.03131144121289253,
0.12981361150741577,
0.043856535106897354,
0.0673026293516159,
-0.012740569189190865,
0.12841100990772247,
-0.10445046424865723,
-0.04320020601153374,
-0.044613517820835114,
-0.07563506811857224,
0.05405630171298981,
-0.01605524867773056,
0.009132171049714088,
-0.1809682548046112,
0.008114244788885117,
0.039791569113731384,
0.09867174923419952,
-0.03048737719655037,
-0.008638781495392323,
-0.06893279403448105,
-0.13328014314174652,
-0.030390292406082153,
-0.04278198629617691,
0.08369899541139603,
-0.05072491988539696,
-0.04085804149508476,
-0.012184186838567257,
-0.003938933834433556,
-0.14810481667518616,
-0.003342399140819907,
-0.002680133329704404,
0.003479469334706664,
-0.02185720205307007,
0.026893340051174164,
-0.12902097404003143,
0.02748984284698963,
-0.11004576086997986,
-0.052934423089027405,
-0.05394845828413963,
0.005265362560749054,
0.10395096987485886,
0.07422216236591339,
-0.14347821474075317,
-0.019055722281336784,
0.050160035490989685,
-0.0865754783153534,
-0.16457632184028625,
0.1456407904624939,
-0.03975047171115875,
0.022494884207844734,
0.02406623214483261,
0.09492486715316772,
0.15262751281261444,
-0.1491153985261917,
0.03361155837774277,
0.07302849739789963,
-0.09663473069667816,
-0.13263659179210663,
-0.0248173289000988,
0.10467441380023956,
-0.13856203854084015,
0.0010299774585291743,
0.010200144723057747,
0.15444344282150269,
-0.086815245449543,
-0.04696738347411156,
-0.02422490157186985,
-0.08121080696582794,
0.09184671938419342,
0.009775949642062187,
0.061245668679475784,
0.03507457301020622,
-0.048690833151340485,
-0.07838758826255798,
0.04103885963559151,
-0.022170700132846832,
0.022270163521170616,
-0.02405393309891224,
0.05191332846879959,
-0.14289580285549164,
-0.016106314957141876,
-0.07991351187229156,
-0.1962282806634903,
0.035507895052433014,
0.1350753903388977,
-0.03493834286928177,
0.23635074496269226,
0.056996144354343414,
-0.039619576185941696,
-0.08988822251558304,
-0.01701582409441471,
0.19076719880104065,
0.041328538209199905,
-0.1480039358139038,
-0.10600682348012924,
-0.060837242752313614,
-0.08239003270864487,
-0.03784680739045143,
-0.13773220777511597,
-0.011176333762705326,
-0.04339862987399101,
0.18872623145580292,
-0.02974158152937889,
0.07730020582675934,
0.01977895013988018,
0.024480968713760376,
-0.034140173345804214,
-0.011641950346529484,
0.0548161156475544,
-0.04782775789499283,
-0.1632927805185318,
0.086207315325737,
-0.02311388961970806,
0.2604653537273407,
0.05967223644256592,
-0.10799462348222733,
0.009581604972481728,
-0.09581732749938965,
-0.05958923324942589,
-0.020154686644673347,
0.035102907568216324,
0.01600050739943981,
0.1843990534543991,
-0.006223603151738644,
0.07325727492570877,
-0.039148442447185516,
-0.030212875455617905,
-0.005103721283376217,
-0.014190223067998886,
0.004444492980837822,
0.0508125014603138,
0.06213431805372238,
-0.165399968624115,
0.11091551929712296,
0.012625172734260559,
-0.055059920996427536,
-0.05877535045146942,
0.014385678805410862,
0.031079908832907677,
-0.014014274813234806,
0.11074592918157578,
-0.008767195977270603,
0.03907192125916481,
-0.026304613798856735,
0.024961253628134727,
0.06178102642297745,
-0.044392794370651245,
0.08313728868961334,
-0.08358879387378693,
0.010890631936490536,
-0.006017039529979229,
0.009668976068496704,
0.08764474093914032,
0.0937655046582222,
-0.0302499420940876,
0.04306509718298912,
-0.044088974595069885,
-0.2254735678434372,
0.05978096276521683,
-0.025649504736065865,
-0.03362658992409706,
0.1408269703388214,
-0.12518614530563354,
-0.10592850297689438,
-0.042441532015800476,
-0.09948717802762985,
-0.031499508768320084,
-0.027685467153787613,
-0.0222441628575325,
-0.07368364185094833,
-0.06596090644598007,
0.013332247734069824,
0.0541643425822258,
0.06337403506040573,
0.08005524426698685,
-0.024515269324183464,
-0.01162633579224348,
-0.01801617257297039,
-0.10672410577535629,
0.0036321424413472414,
-0.12913300096988678,
0.05127513036131859,
0.01587665267288685,
-0.11465120315551758,
0.025971494615077972,
0.15134112536907196,
0.04231918975710869,
-0.017124509438872337,
-0.0003257284697610885,
0.09408575296401978,
0.03455740585923195,
0.10042949765920639,
0.05195879936218262,
-0.0417686328291893,
0.08497995138168335,
0.21986141800880432,
0.07984306663274765,
-0.013043520972132683,
0.014256389811635017,
-0.020526345819234848,
-0.02792871929705143,
-0.1974949836730957,
-0.202129527926445,
-0.08336273580789566,
-0.03681737184524536,
0.05397209897637367,
0.024925345554947853,
0.03419937565922737,
0.1449751853942871,
0.005224461667239666,
-0.013306205160915852,
0.023067215457558632,
0.05673317238688469,
0.17593729496002197,
-0.017504440620541573,
0.06975378096103668,
-0.04266338795423508,
-0.1357891857624054,
0.04579276591539383,
0.07740825414657593,
0.14685948193073273,
-0.05630188807845116,
-0.03044426441192627,
0.07547570765018463,
0.2545319199562073,
0.08967535197734833,
0.1122390478849411,
-0.04772220551967621,
-0.026375707238912582,
-0.022193629294633865,
-0.05240041762590408,
0.08052133768796921,
0.10120780020952225,
0.0433950237929821,
-0.0909103974699974,
0.060336288064718246,
0.06847812980413437,
0.04448239877820015,
-0.10558880865573883,
0.12171433120965958,
-0.30510738492012024,
0.06075147166848183,
-0.029595518484711647,
-0.025376906618475914,
-0.030541522428393364,
0.05400902032852173,
0.24012044072151184,
0.023557770997285843,
0.12204473465681076,
-0.06456726789474487,
0.057049524039030075,
-0.018590856343507767,
0.02129330299794674,
0.02420472726225853,
0.1273224949836731,
0.04195941239595413,
0.045521512627601624,
-0.16969798505306244,
0.29255411028862,
-0.007467694114893675,
-0.053474895656108856,
-0.060129888355731964,
0.01717119663953781,
0.01815815083682537,
0.014005552977323532,
0.07190109044313431,
0.006053337827324867,
0.20903940498828888,
0.13135360181331635,
-0.10594836622476578,
0.020653577521443367,
0.01933637447655201,
0.08282304555177689,
-0.07289611548185349,
0.07601523399353027,
-0.08065112680196762,
0.08927322924137115,
0.022760365158319473,
-0.07685300707817078,
-0.0971648097038269,
0.03446279093623161,
0.2661586105823517,
-0.10879349708557129,
-0.04649600386619568,
-0.0860903263092041,
-0.02503802254796028,
0.2331925332546234,
0.034375548362731934,
-0.07832638919353485,
-0.05145720764994621,
0.03174690157175064,
0.23342850804328918,
-0.055375754833221436,
0.06168026477098465,
-0.05724690854549408,
0.055571261793375015,
-0.09366626292467117,
-0.14990472793579102,
0.13222378492355347,
-0.09631425887346268,
0.01772383227944374,
-0.08842423558235168,
0.17179173231124878,
0.010335924103856087,
-0.020312810316681862,
-0.003387525910511613,
-0.02707212045788765,
-0.08530505001544952,
-0.10172546654939651,
-0.11605236679315567,
0.08222926408052444,
-0.0485907606780529,
0.028989681974053383,
-0.023091556504368782,
-0.03019162267446518,
0.023804891854524612,
0.09604176878929138,
0.20082660019397736,
0.053260840475559235,
-0.09567105770111084,
-0.0007669178303331137,
0.12761487066745758,
-0.01481065433472395,
-0.19694189727306366,
-0.08815258741378784,
0.0742776095867157,
-0.04128449410200119,
-0.1275128871202469,
-0.05564279481768608,
0.18665838241577148,
0.05405619367957115,
0.008221217431128025,
-0.022795647382736206,
-0.13948439061641693,
-0.09432478994131088,
0.11040449887514114,
-0.009390008635818958,
0.2977595329284668,
0.04616585373878479,
-0.04392899200320244,
0.00045186179340817034,
-0.04968104884028435,
0.07253201305866241,
0.042894382029771805,
0.031301744282245636,
0.014765972271561623,
0.0403665192425251,
0.010581010021269321,
-0.052665725350379944,
0.06921329349279404,
0.1683369278907776,
0.01673811674118042,
-0.023968415334820747,
-0.1264626383781433,
-0.047509241849184036,
-0.02983408235013485,
0.06082035228610039,
0.06404286623001099,
0.01206466555595398,
-0.07631348818540573,
-0.008721277117729187,
-0.07618460059165955,
0.04461349546909332,
0.02913021296262741,
-0.05557838827371597,
-0.1307074874639511,
0.0032499125227332115,
0.14642265439033508,
0.012969755567610264,
0.23903532326221466,
0.10585165023803711,
0.14125393331050873,
0.10701953619718552,
0.07718686759471893,
-0.17324231564998627,
-0.027750568464398384,
0.008423062972724438,
-0.04186411201953888,
0.0607471764087677,
-0.10403547435998917,
-0.027728337794542313,
0.20681145787239075,
-0.05792514234781265,
-0.0935969203710556,
0.09345096349716187,
-0.0407809242606163,
-0.10214321315288544,
0.07818271219730377,
-0.15425977110862732,
-0.039150603115558624,
-0.012615671381354332,
0.051586173474788666,
0.0330178402364254,
0.16633303463459015,
0.13677968084812164,
-0.08990973979234695,
0.0004025954403914511,
0.06548852473497391,
0.022080345079302788,
-0.1364208608865738,
0.08716077357530594,
0.09564922004938126,
0.0711677074432373,
-0.09372199326753616,
0.13424894213676453,
-0.009909815154969692,
-0.03826380893588066,
-0.09579542279243469,
0.024132873862981796,
-0.13269273936748505,
-0.0565330870449543,
0.06699047982692719,
0.08244545757770538,
-0.07399642467498779,
-0.0571829229593277,
0.04528040066361427,
-0.08741780370473862,
0.02158423140645027,
0.2531949281692505,
0.10386418551206589,
0.09338094294071198,
-0.05479515343904495,
-0.06814838945865631,
-0.06046215817332268,
0.000790567952208221,
0.08889075368642807,
0.08546924591064453,
-0.12240723520517349,
-0.042004942893981934,
-0.023332009091973305,
0.058751098811626434,
-0.08967277407646179,
0.024386558681726456,
-0.07814406603574753,
0.04748750105500221,
-0.07563655823469162,
-0.023872071877121925,
-0.11445794999599457,
-0.035079117864370346,
-0.003039949107915163,
-0.09700994193553925,
-0.014217287302017212,
0.033230483531951904,
-0.06382618099451065,
0.08451850712299347,
0.052202824503183365,
-0.010555914603173733,
-0.04708953574299812,
-0.07567748427391052,
-0.056098733097314835,
-0.011255782097578049,
0.12470421195030212,
-0.016743507236242294,
-0.10913458466529846,
-0.006156028248369694,
-0.22184798121452332,
-0.14879585802555084,
0.14091730117797852,
0.08096959441900253,
0.09969613701105118,
-0.0898820236325264,
0.047517042607069016,
0.030234243720769882,
0.0289465319365263,
-0.017490725964307785,
0.1527855545282364,
-0.015501103363931179,
0.009346450679004192,
-0.1020638644695282,
-0.017199242487549782,
-0.05920931696891785,
-0.05741327628493309,
0.005065435077995062,
0.10356246680021286,
0.06331170350313187,
-0.030956445261836052,
0.04405596852302551,
0.02284112013876438,
0.052496008574962616,
-0.025874238461256027,
-0.14406347274780273,
0.10040785372257233,
-0.03467215225100517,
0.008047844283282757,
-0.04275871440768242,
0.21074163913726807,
-0.129108726978302,
-0.12454923242330551,
-0.02745666727423668,
-0.07465507835149765,
-0.05874834954738617,
0.032914452254772186,
0.1143726035952568,
0.027539074420928955,
-0.010131632909178734,
-0.05224253982305527,
0.055335793644189835,
0.08486893773078918,
0.024866953492164612,
0.05453355982899666,
-0.01700311154127121,
0.09336215257644653,
0.10511288791894913,
0.060258202254772186,
-0.08627530187368393,
0.020582333207130432,
-0.11298918724060059,
-0.08204828947782516,
0.020625626668334007,
-0.021309567615389824,
0.06679654866456985,
-0.02628643810749054,
0.022755693644285202,
-0.0439639687538147,
0.053783025592565536,
-0.065984345972538,
-0.0617622509598732,
-0.15433824062347412,
-0.007027088664472103,
-0.1386563628911972,
-0.021868471056222916,
-0.08366230130195618,
-0.18451470136642456,
0.09731291979551315,
0.09865617752075195,
0.019803892821073532,
0.12014967948198318,
-0.025232186540961266,
-0.021677091717720032,
0.06580150127410889,
0.05689578130841255,
0.042821161448955536,
-0.004602111876010895,
-0.06758330017328262,
-0.1613268256187439,
0.08160919696092606,
-0.10164748877286911,
0.04307917505502701,
0.08236601203680038,
0.08175242692232132,
0.042971670627593994,
-0.03482072427868843,
-0.07175208628177643,
0.004778709728270769,
0.023053189739584923,
0.12088236957788467,
-0.00204848637804389,
-0.059944428503513336,
0.008187229745090008,
0.040991634130477905,
-0.06730148941278458,
-0.1296594887971878,
-0.08861268311738968,
0.17679764330387115,
-0.065322145819664,
0.092467300593853,
-0.036571938544511795,
0.05700301751494408,
-0.1277029663324356,
0.2432902455329895,
0.3439669609069824,
-0.07765056192874908,
0.007067115046083927,
0.07724768668413162,
0.013217806816101074,
0.008901994675397873,
0.05618220567703247,
0.07834111899137497,
0.28251907229423523,
-0.09046147018671036,
-0.11881799250841141,
-0.017456863075494766,
0.017114760354161263,
-0.04587884619832039,
-0.10496131330728531,
-0.004203777760267258,
-0.09631497412919998,
-0.06549064069986343,
0.09975255280733109,
-0.2271147519350052,
0.010794511996209621,
0.018438145518302917,
-0.14686837792396545,
-0.06585990637540817,
-0.08018650114536285,
0.12119605392217636,
-0.031081706285476685,
0.09717513620853424,
-0.06676801294088364,
-0.006625534500926733,
0.06265702843666077,
0.014625261537730694,
-0.1137002632021904,
-0.022139109671115875,
-0.01710370182991028,
0.07575101405382156,
0.03664644807577133,
0.015026143752038479,
0.14202630519866943,
0.1396878957748413,
-0.04411909729242325,
-0.08772756159305573,
-0.004888077266514301,
0.07166331261396408,
-0.12188906222581863,
-0.09245290607213974,
-0.16551539301872253,
0.0024041482247412205,
0.12011047452688217,
0.0285163763910532,
-0.1500069797039032,
0.0042870319448411465,
0.12105989456176758,
0.019703732803463936,
-0.06527198851108551,
-0.03707519918680191,
-0.0488547645509243,
0.07384125888347626,
0.010267925448715687,
-0.041600301861763,
-0.03165303170681,
-0.025065036490559578,
0.030537258833646774,
-0.004108354914933443,
0.015827052295207977,
0.013599734753370285,
-0.130062997341156,
-0.010604399256408215,
-0.01773163676261902,
0.026606343686580658,
0.08084801584482193,
-0.017415175214409828,
-0.13176237046718597,
0.02216910570859909,
-0.07060734182596207,
0.048549942672252655,
0.09312566369771957,
-0.005605952348560095,
0.009614210575819016,
-0.0508231446146965,
-0.02860700897872448,
-0.03253823518753052,
-0.13695374131202698,
-0.11439339071512222
] |
null | null |
transformers
|
# ResNet
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
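A common follow-up to the feature-extraction pattern above is to freeze the encoder and train only the head. The sketch below does exactly that and reports the resulting parameter counts; the `from glasses.models import ResNet` import path is an assumption, while `model.encoder` itself is the attribute already used in the examples.
``` python
# assumed import path for the glasses library; adjust if needed
from glasses.models import ResNet

model = ResNet.resnet50()

# freeze the backbone; only the remaining (head) parameters stay trainable
for p in model.encoder.parameters():
    p.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable:,} / {total:,}")
```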
|
{}
| null |
glasses/dummy
|
[
"transformers",
"pytorch",
"arxiv:1512.03385",
"arxiv:1812.01187",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1512.03385 #arxiv-1812.01187 #endpoints_compatible #region-us
|
# ResNet
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# ResNet\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1512.03385 #arxiv-1812.01187 #endpoints_compatible #region-us \n",
"# ResNet\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
39,
24
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1512.03385 #arxiv-1812.01187 #endpoints_compatible #region-us \n# ResNet\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.09616639465093613,
-0.06366479396820068,
-0.0037322191055864096,
0.05138205736875534,
0.10659891366958618,
0.021586144343018532,
-0.009634475223720074,
0.07054006308317184,
-0.053948886692523956,
0.04731278866529465,
0.1037297248840332,
0.08295340090990067,
-0.0058242822997272015,
-0.04304485023021698,
-0.05572299286723137,
-0.23913611471652985,
0.06364098936319351,
0.08260917663574219,
-0.0041536977514624596,
0.06929973512887955,
0.05711069330573082,
-0.08343339711427689,
0.10536167025566101,
-0.01534983143210411,
-0.23553140461444855,
-0.0008716770098544657,
-0.0015674196183681488,
-0.06291821599006653,
0.11587695777416229,
0.02531363256275654,
0.1433437615633011,
-0.005697180982679129,
0.08062248677015305,
-0.09028321504592896,
0.032286085188388824,
0.05324514955282211,
-0.055810634046792984,
0.05216443911194801,
0.08513733744621277,
-0.024412643164396286,
0.03505827113986015,
0.06152614206075668,
-0.04949970915913582,
-0.04106217250227928,
-0.051160868257284164,
-0.022320285439491272,
0.003970127087086439,
0.1425614356994629,
0.0618370957672596,
0.09957648068666458,
0.011353893205523491,
0.10124125331640244,
-0.09590128809213638,
0.11621695011854172,
0.22650742530822754,
-0.2877153754234314,
-0.05333174392580986,
0.12962009012699127,
-0.018461626023054123,
-0.05911590903997421,
-0.07185162603855133,
0.04889311268925667,
0.025916794314980507,
0.04601820930838585,
-0.17162609100341797,
-0.005087184254080057,
-0.012779563665390015,
-0.02003314718604088,
-0.12083566188812256,
-0.004452147986739874,
0.042793259024620056,
0.049353163689374924,
0.07589907199144363,
0.007548857014626265,
-0.18082308769226074,
-0.02956576645374298,
-0.09480594098567963,
-0.008974917232990265,
-0.008822361007332802,
0.04200572893023491,
-0.12411284446716309,
-0.035819657146930695,
-0.0542692169547081,
-0.03718498349189758,
-0.14949841797351837,
0.00035568850580602884,
0.04786805063486099,
0.11708948016166687,
-0.12814345955848694,
0.019677553325891495,
0.009213728830218315,
-0.01861814223229885,
0.0032554352656006813,
-0.1049971953034401,
-0.015009131282567978,
0.02546936646103859,
-0.0866856798529625,
-0.08173765242099762,
0.04706958681344986,
0.169330433011055,
-0.06645245850086212,
0.021708253771066666,
-0.014596919529139996,
0.1354973167181015,
-0.092161625623703,
0.06524650007486343,
-0.22519497573375702,
0.13533498346805573,
0.0030418639071285725,
-0.10303604602813721,
0.00828594621270895,
-0.02859899215400219,
-0.07747437059879303,
-0.03164482116699219,
0.13175752758979797,
-0.035118017345666885,
-0.049196142703294754,
0.06387704610824585,
-0.008523649536073208,
-0.04166751354932785,
0.036404117941856384,
-0.03709346801042557,
-0.035524070262908936,
-0.0036350248847156763,
-0.08825824409723282,
0.11459409445524216,
0.08952172100543976,
-0.020030781626701355,
-0.09247057884931564,
0.027215268462896347,
-0.021534817293286324,
-0.0005505894660018384,
0.009312539361417294,
-0.06682882457971573,
0.005303868558257818,
-0.011325676925480366,
0.06586373597383499,
-0.23614996671676636,
-0.08204628527164459,
0.0000747989397495985,
0.07170955091714859,
-0.05585205927491188,
0.05664948374032974,
-0.032947998493909836,
-0.12499848753213882,
0.005688948556780815,
0.02867663837969303,
0.002569950418546796,
-0.029144879430532455,
0.03157763555645943,
-0.06381957232952118,
0.15274909138679504,
-0.07331380993127823,
0.01883196085691452,
-0.06868655234575272,
-0.03596195951104164,
-0.09619539231061935,
0.10507222265005112,
-0.06251030415296555,
0.09574922919273376,
-0.11913549154996872,
-0.06121145933866501,
-0.0973372533917427,
0.0018160815816372633,
0.029805617406964302,
0.09138412028551102,
-0.08663304895162582,
-0.044879984110593796,
0.19928184151649475,
-0.11909475177526474,
-0.08380326628684998,
0.09367016702890396,
-0.01744966208934784,
-0.10007224977016449,
0.010876941494643688,
0.17531220614910126,
-0.0244562029838562,
-0.03576638177037239,
-0.04192519560456276,
0.003198040649294853,
-0.15272191166877747,
-0.06346306949853897,
-0.06974166631698608,
0.15619108080863953,
0.030275864526629448,
-0.03408117592334747,
-0.028254369273781776,
0.08099199831485748,
-0.12453390657901764,
-0.10478387773036957,
-0.04384445771574974,
-0.09010497480630875,
0.019377050921320915,
0.04687240719795227,
0.12048951536417007,
0.04958990216255188,
-0.05682099610567093,
-0.18854448199272156,
0.03711370378732681,
-0.040553152561187744,
0.10992804914712906,
-0.1016993448138237,
0.16345857083797455,
-0.0324053131043911,
0.023020165041089058,
-0.24396753311157227,
-0.036164067685604095,
0.001099988236092031,
0.033006057143211365,
-0.02438049018383026,
0.031112706288695335,
0.06862418353557587,
-0.03928405046463013,
-0.0370740108191967,
-0.06982580572366714,
0.0579024963080883,
-0.023036137223243713,
-0.01346115954220295,
-0.07706062495708466,
0.015331270173192024,
-0.06971108168363571,
-0.025232503190636635,
-0.14091938734054565,
-0.05557587742805481,
0.03171567618846893,
0.136163592338562,
-0.013488548807799816,
0.07727973163127899,
-0.04401315003633499,
-0.03547320142388344,
-0.012745033018290997,
-0.0703873485326767,
0.07882826030254364,
-0.01460378710180521,
-0.011510140262544155,
0.053760867565870285,
0.043415460735559464,
0.15966254472732544,
0.1359671652317047,
-0.29837125539779663,
-0.10110662877559662,
0.10063248872756958,
-0.029598083347082138,
0.0075759282335639,
0.009420227259397507,
0.11691375076770782,
0.14380717277526855,
-0.01608603075146675,
0.04739191383123398,
-0.026110723614692688,
0.10502426326274872,
0.022296136245131493,
-0.04719935730099678,
-0.022579681128263474,
0.06073072925209999,
0.03806349262595177,
-0.27490732073783875,
0.05868559330701828,
0.03820537403225899,
-0.1381387710571289,
-0.022569075226783752,
-0.01115906611084938,
-0.055527035146951675,
0.07982605695724487,
0.08695974200963974,
-0.0029335685539990664,
0.054909348487854004,
0.030321035534143448,
-0.056420356035232544,
0.024967951700091362,
-0.11099997907876968,
0.03611764684319496,
-0.11957498639822006,
0.01978095807135105,
-0.05205049738287926,
0.0224534310400486,
0.020146183669567108,
0.04041709750890732,
0.01003020629286766,
0.08644474297761917,
0.011071660555899143,
-0.19169823825359344,
0.02648504637181759,
-0.05994715541601181,
-0.006003072019666433,
0.20096056163311005,
-0.014781941659748554,
-0.2528081238269806,
-0.013159460388123989,
-0.06434469670057297,
-0.04610476642847061,
0.014993774704635143,
0.02400091663002968,
-0.1690298616886139,
-0.04073570296168327,
0.02738729678094387,
0.028085622936487198,
0.06585867702960968,
0.03221980109810829,
0.008873585611581802,
0.008155830204486847,
-0.005330168176442385,
-0.03792063519358635,
0.005196568556129932,
-0.19364729523658752,
-0.11086932569742203,
0.12803220748901367,
-0.10683499276638031,
0.0727882981300354,
0.10985487699508667,
-0.011087493039667606,
0.027579177170991898,
0.021416692063212395,
0.1650094985961914,
-0.08658349514007568,
0.001803247956559062,
0.06770754605531693,
0.008463691920042038,
0.060786325484514236,
0.087828129529953,
0.011576122604310513,
-0.10527554154396057,
-0.0023850491270422935,
0.025533633306622505,
-0.0471671000123024,
-0.0932692214846611,
-0.12378356605768204,
-0.03554900363087654,
-0.020865077152848244,
0.1392531394958496,
0.07811623066663742,
0.04299869015812874,
0.09809346497058868,
0.03773089125752449,
-0.03198246657848358,
0.04472929984331131,
0.06039099767804146,
0.11695236712694168,
0.021177023649215698,
0.11293401569128036,
-0.06092534959316254,
-0.09304522722959518,
0.02215699665248394,
0.06662129610776901,
0.2503420114517212,
-0.03325411677360535,
-0.12929852306842804,
0.034634701907634735,
0.20973743498325348,
0.14032338559627533,
0.11195618659257889,
-0.053508032113313675,
-0.0446770116686821,
0.015411756001412868,
0.023029012605547905,
-0.013527204282581806,
0.04641374945640564,
0.07235120236873627,
-0.07753536105155945,
0.018948160111904144,
0.039883971214294434,
0.006641671061515808,
0.23265179991722107,
0.13058383762836456,
-0.3275323808193207,
-0.026314686983823776,
-0.04888926073908806,
0.009060313925147057,
-0.046013087034225464,
0.12998785078525543,
0.01667068339884281,
-0.04744471237063408,
0.05186162143945694,
-0.13498079776763916,
0.08925706893205643,
-0.08000063896179199,
0.027066657319664955,
0.0007050245185382664,
-0.07834101468324661,
0.06579329818487167,
0.01880345121026039,
-0.15778401494026184,
0.19509968161582947,
-0.041605036705732346,
-0.06095254421234131,
-0.03143259137868881,
-0.05542684718966484,
0.08103534579277039,
0.11073995381593704,
0.2163456678390503,
0.015206702053546906,
-0.023822758346796036,
0.02683872915804386,
0.012682470493018627,
0.02246783673763275,
0.12912426888942719,
0.05702856183052063,
-0.03565071150660515,
0.02266528643667698,
-0.05012273043394089,
0.027500445023179054,
0.11448082327842712,
0.02174338325858116,
-0.08640487492084503,
-0.03521240875124931,
0.02685510739684105,
-0.03928445652127266,
-0.00970480591058731,
-0.053410209715366364,
-0.0196926798671484,
0.11782960593700409,
0.03888550400733948,
0.000791321275755763,
-0.08791199326515198,
0.0704430341720581,
0.09081615507602692,
-0.07240357249975204,
0.09658049046993256,
-0.04402809217572212,
0.04021677374839783,
-0.02433166466653347,
-0.22330059111118317,
0.19713091850280762,
-0.10196847468614578,
-0.010259950533509254,
0.026182159781455994,
-0.018575577065348625,
-0.007777910679578781,
-0.04176381230354309,
0.03852911666035652,
0.03962760046124458,
-0.154759019613266,
-0.0038064068648964167,
0.07512474060058594,
-0.05109056457877159,
-0.05209963023662567,
0.11168338358402252,
-0.03765421360731125,
-0.005651270970702171,
-0.07825334370136261,
0.1671111285686493,
0.15769889950752258,
-0.02511957287788391,
-0.07747594267129898,
-0.005477683153003454,
0.10828018188476562,
0.054838068783283234,
-0.3625635802745819,
-0.071600541472435,
-0.032428231090307236,
-0.030581897124648094,
-0.09672317653894424,
-0.11674997210502625,
0.17704439163208008,
-0.057948943227529526,
-0.026801254600286484,
0.047275360673666,
-0.11357663571834564,
-0.03833182528614998,
0.25215184688568115,
0.0690564289689064,
0.3235005736351013,
-0.07480263710021973,
0.006578272674232721,
-0.03670022636651993,
0.0511147640645504,
0.02499515935778618,
0.023564422503113747,
0.10711825639009476,
-0.0005448069423437119,
0.1309168040752411,
-0.012865539640188217,
-0.021053098142147064,
-0.03511054813861847,
0.08789539337158203,
0.06238444894552231,
-0.10399191081523895,
0.030277999117970467,
0.03433924913406372,
-0.01336060743778944,
0.0037412536330521107,
0.20008961856365204,
0.11669732630252838,
0.012370551005005836,
0.014542870223522186,
-0.10157157480716705,
0.05524391680955887,
0.104908786714077,
0.0012534648412838578,
-0.10693530738353729,
0.05290684849023819,
0.0868673324584961,
0.07412654906511307,
0.2506482005119324,
0.09582418203353882,
0.10936985164880753,
0.07545501738786697,
-0.020809166133403778,
-0.07045572996139526,
-0.09366432577371597,
-0.08007951825857162,
-0.03247310221195221,
0.13897334039211273,
-0.0851539894938469,
0.044494710862636566,
0.10430929064750671,
0.011564386077225208,
-0.011123903095722198,
0.1253771185874939,
-0.013523777946829796,
0.00007070678839227185,
0.08741416037082672,
-0.08468732982873917,
-0.054008934646844864,
0.0024577388539910316,
0.023133397102355957,
0.1214664950966835,
0.15753883123397827,
0.13810713589191437,
-0.08928141742944717,
0.01581190526485443,
0.005592189729213715,
0.06723731011152267,
-0.060160957276821136,
0.07067473232746124,
0.13102371990680695,
0.007646622136235237,
-0.14499303698539734,
0.17384225130081177,
-0.009134200401604176,
-0.15489234030246735,
-0.019230904057621956,
-0.0430375337600708,
-0.1217782199382782,
-0.1061612069606781,
-0.049318306148052216,
0.009697653353214264,
-0.23965798318386078,
-0.04195836931467056,
-0.00257253460586071,
-0.09004853665828705,
0.09919212013483047,
0.09002063423395157,
0.11944286525249481,
-0.0013454792788252234,
-0.06208103895187378,
-0.03452495485544205,
-0.07308915257453918,
-0.011999952606856823,
0.040148455649614334,
0.07525607943534851,
-0.16470380127429962,
-0.17379215359687805,
0.011770646087825298,
0.1832388937473297,
-0.10081958770751953,
-0.01901462860405445,
-0.12386397272348404,
0.08843978494405746,
-0.08819127082824707,
0.03508936986327171,
-0.013828903436660767,
0.010643631219863892,
-0.05563155189156532,
-0.04750439152121544,
-0.0838933140039444,
0.026323050260543823,
-0.09638624638319016,
0.007307535968720913,
0.0025148573331534863,
-0.07064785808324814,
-0.10205885767936707,
-0.0036746980622410774,
0.0004545443516690284,
-0.0033063979353755713,
0.12240234762430191,
0.11536829173564911,
-0.09778837114572525,
0.12104161083698273,
-0.12688417732715607,
-0.2539132535457611,
0.14661380648612976,
0.011033882386982441,
-0.009231960400938988,
0.07249196618795395,
0.026276908814907074,
0.017272265627980232,
-0.001079774578101933,
-0.010369636118412018,
0.025508636608719826,
-0.01764269731938839,
0.016553910449147224,
-0.22716856002807617,
-0.005806738045066595,
-0.036009471863508224,
0.02123752050101757,
0.06139025837182999,
0.1250770539045334,
0.06566081196069717,
-0.08076547831296921,
0.012497694231569767,
-0.004178235772997141,
0.01405277755111456,
0.004700651858001947,
-0.10142960399389267,
0.09955787658691406,
-0.059505969285964966,
0.039005961269140244,
-0.08107014745473862,
0.16751740872859955,
-0.07013847678899765,
-0.04317577928304672,
0.022722819820046425,
0.19938738644123077,
-0.18842896819114685,
0.08512367308139801,
0.12884214520454407,
0.1152644008398056,
-0.021650347858667374,
0.03236263990402222,
0.07901591062545776,
0.08805978298187256,
0.2019713670015335,
0.03225326910614967,
0.21013660728931427,
-0.008131392300128937,
0.12226073443889618,
0.07050228118896484,
-0.0023360522463917732,
0.052982766181230545,
0.05396782234311104,
-0.10959552973508835,
0.06313527375459671,
0.003909396007657051,
0.0034602144733071327,
0.13745799660682678,
0.0735032856464386,
0.018869882449507713,
0.002401195000857115,
-0.020885029807686806,
-0.08683834969997406,
-0.11803468316793442,
-0.10098493099212646,
-0.14222751557826996,
0.05021996051073074,
-0.015355239622294903,
-0.05628221109509468,
0.2072771042585373,
0.09360400587320328,
-0.02101852558553219,
0.12540660798549652,
-0.04815549775958061,
-0.06232065334916115,
0.17239010334014893,
-0.005154091399163008,
-0.10518171638250351,
0.13748005032539368,
-0.05297958850860596,
0.03314472734928131,
0.057131268084049225,
-0.011303236708045006,
0.001412559999153018,
0.006177163682878017,
0.1295699030160904,
-0.038002315908670425,
-0.06904566287994385,
-0.03361576050519943,
0.05422547459602356,
-0.07377170026302338,
-0.08924497663974762,
-0.02439011074602604,
-0.004112503491342068,
0.015669450163841248,
0.26361364126205444,
-0.11519166082143784,
-0.03956569358706474,
-0.09168118983507156,
-0.03702652454376221,
-0.008218398317694664,
0.10316119343042374,
-0.0893259197473526,
-0.03464678302407265,
-0.05597321689128876,
0.2172318547964096,
0.21798594295978546,
-0.18943066895008087,
0.0030119726434350014,
0.10964611917734146,
0.021330608054995537,
-0.036226436495780945,
0.014681376516819,
0.14544600248336792,
0.13675609230995178,
-0.06903103739023209,
-0.12196600437164307,
-0.028807656839489937,
-0.05713662877678871,
-0.07782092690467834,
0.008308938704431057,
0.08386057615280151,
-0.02823054790496826,
-0.17064912617206573,
0.12138491868972778,
-0.21265347301959991,
-0.02967873215675354,
0.09312994033098221,
-0.1480749398469925,
-0.11778515577316284,
-0.04605138674378395,
0.06611345708370209,
0.0844729021191597,
0.09218201786279678,
-0.06134823337197304,
-0.009951614774763584,
0.015241128392517567,
0.049815237522125244,
-0.16508835554122925,
-0.04498249292373657,
-0.06281755864620209,
-0.0078077069483697414,
0.07460979372262955,
0.0035026620607823133,
-0.04684630036354065,
0.04779211804270744,
0.01854811981320381,
-0.012193878181278706,
-0.002248608274385333,
-0.03543119505047798,
-0.16973876953125,
0.01616964116692543,
0.1358521431684494,
-0.031988248229026794,
-0.00933578610420227,
0.06028946861624718,
-0.24986061453819275,
-0.02156110480427742,
0.04570363834500313,
0.03851936012506485,
-0.09015276283025742,
-0.004490310791879892,
-0.06526032090187073,
0.06967474520206451,
0.06411376595497131,
0.018326347693800926,
0.021268298849463463,
-0.049520451575517654,
-0.026317117735743523,
0.029106764122843742,
-0.04576254263520241,
-0.00640319986268878,
-0.08154943585395813,
-0.06505836546421051,
-0.19577236473560333,
0.02853178232908249,
-0.04132523760199547,
-0.038306526839733124,
-0.020721210166811943,
-0.007531433831900358,
-0.06790506094694138,
0.09864114224910736,
0.09792070090770721,
-0.025514377281069756,
0.013281364925205708,
-0.01059822365641594,
0.07926242053508759,
0.04989178478717804,
-0.051241371780633926,
-0.11247623711824417
] |
null | null |
transformers
|
# eca_resnet26t
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in [Bag of Tricks for Image Classification with Convolutional Neural Networks](https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000 )
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this activates the forward hooks and tells the model you'd like to collect the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
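The snippet below is a minimal inference sketch rather than an official recipe: it assumes the `ResNet` class above behaves like a standard `torch.nn.Module` that returns logits over the 1000 ImageNet classes and that the usual ImageNet normalization applies; `ResNet.resnet26()` is used as a stand-in constructor and `cat.jpg` is a placeholder file name.
``` python
# Minimal inference sketch, under the assumptions stated above.
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = ResNet.resnet26()  # stand-in; see the constructors listed above
model.eval()

img = Image.open("cat.jpg")        # placeholder input image
x = preprocess(img).unsqueeze(0)   # add the batch dimension -> (1, 3, 224, 224)

with torch.no_grad():
    probs = model(x).softmax(dim=-1)

print(probs.topk(5))
```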
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/eca_resnet26t
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# eca_resnet26t
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# eca_resnet26t\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# eca_resnet26t\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
29
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# eca_resnet26t\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.10457070171833038,
0.029430776834487915,
-0.0028053056448698044,
0.0883975401520729,
0.09627515822649002,
0.010581794194877148,
-0.03246023505926132,
0.07576985657215118,
-0.07312431931495667,
-0.007466910406947136,
0.1127457320690155,
0.10824551433324814,
-0.0015766889555379748,
-0.03975651413202286,
-0.06539358198642731,
-0.23350664973258972,
0.08232293277978897,
0.07957950979471207,
-0.006315643899142742,
0.055250659584999084,
0.08797156065702438,
-0.08669257164001465,
0.10958514362573624,
0.0033926276955753565,
-0.15765498578548431,
0.0198005773127079,
-0.026967667043209076,
-0.07477791607379913,
0.08713716268539429,
0.025719720870256424,
0.09796079993247986,
0.0057305581867694855,
0.10381416976451874,
-0.11277484893798828,
0.028886448591947556,
0.04709121212363243,
-0.08338604122400284,
0.050263967365026474,
0.08722009509801865,
-0.03043767437338829,
0.0214325450360775,
0.04535980895161629,
-0.05569998919963837,
-0.0597236342728138,
0.003433309495449066,
-0.04899241775274277,
0.012037662789225578,
0.14421863853931427,
0.11517725139856339,
0.09471197426319122,
0.027831804007291794,
0.08146046847105026,
-0.11967800557613373,
0.08371871709823608,
0.1772053837776184,
-0.2727671265602112,
-0.048440415412187576,
0.07380592823028564,
-0.040088869631290436,
-0.06352916359901428,
-0.06611426919698715,
0.019990652799606323,
0.04217151552438736,
0.04622535780072212,
-0.10745038092136383,
0.002187055768445134,
-0.08758925646543503,
-0.008185510523617268,
-0.08717206120491028,
-0.028870854526758194,
0.09009508043527603,
0.10181812942028046,
0.056959863752126694,
0.024573829025030136,
-0.16058863699436188,
0.01169031485915184,
-0.0886860117316246,
0.03547212481498718,
0.03480205312371254,
0.04947671294212341,
-0.08556729555130005,
0.007858311757445335,
-0.07632704824209213,
-0.03464769199490547,
-0.1524270623922348,
-0.02521669492125511,
0.06053881719708443,
0.13010405004024506,
-0.12110460549592972,
0.031748633831739426,
0.05024104192852974,
-0.04559389501810074,
-0.01639491692185402,
-0.07195907086133957,
0.032818011939525604,
0.04831250011920929,
-0.06571341305971146,
0.008120262995362282,
0.07817667722702026,
0.2526457905769348,
-0.0575823113322258,
-0.002964134095236659,
-0.01771547459065914,
0.1365654617547989,
-0.0856563001871109,
0.004541204310953617,
-0.22927223145961761,
0.11358312517404556,
0.04729635640978813,
-0.08047395944595337,
0.02857809327542782,
-0.019755570217967033,
-0.08688779920339584,
-0.022706281393766403,
0.10747348517179489,
-0.04046187922358513,
-0.030273480340838432,
0.05635569617152214,
-0.019806750118732452,
-0.040838923305273056,
0.1368095874786377,
-0.037845827639102936,
-0.027822604402899742,
-0.004454907961189747,
-0.10842697322368622,
0.07999575138092041,
0.10854221135377884,
0.0034855669364333153,
-0.08773790299892426,
0.02774592861533165,
-0.018220284953713417,
-0.015618713572621346,
0.0022512031719088554,
-0.03964855149388313,
0.05051446706056595,
-0.015237882733345032,
0.07583023607730865,
-0.22873710095882416,
-0.14876236021518707,
0.010053515434265137,
0.10532962530851364,
-0.03579511493444443,
-0.00034523208159953356,
-0.020880211144685745,
-0.11840302497148514,
0.006858814973384142,
0.0011768124531954527,
0.025146862491965294,
-0.07647869735956192,
0.001849617576226592,
-0.13248597085475922,
0.13305416703224182,
-0.11580085009336472,
0.03617505729198456,
-0.09066014736890793,
-0.04479385167360306,
-0.06548610329627991,
0.07006150484085083,
-0.07390766590833664,
0.0789070650935173,
-0.13009744882583618,
-0.06891889125108719,
-0.0576421394944191,
-0.017078323289752007,
0.008213132619857788,
0.06020146608352661,
-0.08191998302936554,
-0.010296535678207874,
0.11628740280866623,
-0.1067388579249382,
-0.10213591903448105,
0.08431457728147507,
-0.031345002353191376,
-0.10997148603200912,
-0.0005601410521194339,
0.1355292648077011,
-0.03824728727340698,
-0.0030447551980614662,
-0.05583719164133072,
0.02083570696413517,
-0.10949753224849701,
-0.12532126903533936,
-0.018056022003293037,
0.13789260387420654,
0.020674938336014748,
-0.014277772977948189,
-0.09317674487829208,
0.0913732573390007,
-0.08602529764175415,
-0.10950642079114914,
-0.02769794873893261,
-0.08837490528821945,
-0.0285960603505373,
0.02525912970304489,
0.0957668200135231,
0.04942753165960312,
-0.04809383302927017,
-0.15647992491722107,
0.06732172518968582,
-0.05722498521208763,
0.11709029972553253,
-0.09335478395223618,
0.14666414260864258,
-0.0380595363676548,
0.030975593253970146,
-0.21820701658725739,
0.01888270489871502,
0.005664709489792585,
0.030677655711770058,
-0.034861817955970764,
-0.0671318769454956,
0.024201566353440285,
-0.04996408149600029,
-0.03019092045724392,
-0.0752854272723198,
0.043826811015605927,
-0.01642325334250927,
-0.006099840626120567,
-0.13043108582496643,
-0.0063094752840697765,
-0.044324152171611786,
0.017197469249367714,
-0.1283702254295349,
-0.05731325224041939,
0.03813593089580536,
0.1305987536907196,
-0.026584530249238014,
0.1040419489145279,
-0.05139511078596115,
-0.061856891959905624,
-0.0455857515335083,
-0.04801085591316223,
0.06979761272668839,
0.005226305220276117,
-0.016011742874979973,
0.06904126703739166,
0.08983705192804337,
0.1467168927192688,
0.15552067756652832,
-0.2856258451938629,
-0.03142703324556351,
0.08633608371019363,
-0.03319796174764633,
-0.01054908987134695,
-0.021794607862830162,
0.12157154828310013,
0.09129570424556732,
-0.03231040760874748,
0.0805860161781311,
-0.06813284754753113,
0.0933227613568306,
0.009376841597259045,
-0.03323158994317055,
0.001171683892607689,
0.050243064761161804,
0.11695694178342819,
-0.24835355579853058,
0.08833204954862595,
0.1352226585149765,
-0.17683689296245575,
-0.05140363425016403,
-0.02466019243001938,
-0.06721216440200806,
0.10367460548877716,
0.042836744338274,
-0.019531067460775375,
0.07351680845022202,
0.0391564667224884,
-0.01697111874818802,
0.06548447906970978,
-0.08711355179548264,
0.030585184693336487,
-0.13522614538669586,
0.02671266347169876,
-0.0419432707130909,
0.012189288623631,
0.01025563944131136,
-0.0033599103335291147,
-0.01447366364300251,
0.10743555426597595,
-0.024038121104240417,
-0.24757079780101776,
0.049230873584747314,
-0.05712820589542389,
-0.03109735995531082,
0.22394415736198425,
-0.03619995340704918,
-0.21742035448551178,
-0.005949896760284901,
-0.026296067982912064,
-0.07578127831220627,
-0.00185601064004004,
0.045828089118003845,
-0.12033922225236893,
-0.07450010627508163,
-0.02378353662788868,
-0.08940302580595016,
0.10885141044855118,
0.03472479432821274,
0.029108749702572823,
0.015306978486478329,
0.01941242627799511,
-0.05990567058324814,
0.010394112206995487,
-0.13793595135211945,
-0.09376603364944458,
0.14455145597457886,
-0.08850719034671783,
0.0995735228061676,
0.08648181706666946,
-0.01976657100021839,
0.0041780024766922,
0.02271971106529236,
0.14126670360565186,
-0.050004199147224426,
0.028275707736611366,
0.08235486596822739,
-0.021385880187153816,
0.053285326808691025,
0.08978604525327682,
0.007095993962138891,
-0.08023771643638611,
0.00003416787149035372,
0.02806701324880123,
-0.037734296172857285,
-0.12196870148181915,
-0.12557218968868256,
-0.030596895143389702,
-0.021292926743626595,
0.16456906497478485,
0.10443305224180222,
0.04723380133509636,
0.10535255819559097,
0.0013704816810786724,
-0.01015821285545826,
0.04798338562250137,
0.0648968443274498,
0.10371963679790497,
0.03896481543779373,
0.1100812554359436,
-0.06828468292951584,
-0.07430411130189896,
0.04265784099698067,
0.13340266048908234,
0.26095739006996155,
-0.010688716545701027,
-0.11630053073167801,
0.05544956400990486,
0.21271255612373352,
0.09507223963737488,
0.12078836560249329,
-0.0660683736205101,
-0.009563466534018517,
-0.01509803719818592,
0.002113629598170519,
0.029767317697405815,
0.06064654886722565,
-0.02653304487466812,
-0.05503535643219948,
0.013664690777659416,
0.032739683985710144,
0.03958241641521454,
0.23120231926441193,
0.08552104979753494,
-0.3428786098957062,
0.02961129881441593,
-0.028803909197449684,
0.008071774616837502,
-0.014067359268665314,
0.1406853049993515,
-0.01543740089982748,
-0.03036135621368885,
0.093637615442276,
-0.10464349389076233,
0.11331968754529953,
-0.05618251487612724,
0.00517159653827548,
0.048443395644426346,
-0.10908731818199158,
0.07845279574394226,
0.04390314966440201,
-0.13541501760482788,
0.1875445544719696,
-0.049991365522146225,
-0.06497108191251755,
-0.06937754154205322,
-0.046824485063552856,
0.10710783302783966,
0.15229766070842743,
0.24434426426887512,
0.012219798751175404,
-0.05026889592409134,
0.07720808684825897,
-0.018420040607452393,
0.05069682002067566,
0.04817299172282219,
0.04525842145085335,
-0.05143751576542854,
0.0456346832215786,
-0.04930970072746277,
0.046536635607481,
0.11984080821275711,
-0.04144466668367386,
-0.09795152395963669,
-0.07020790874958038,
0.0877707377076149,
-0.05602454021573067,
-0.025529325008392334,
-0.07375231385231018,
-0.07092810422182083,
0.0787568986415863,
0.01122068427503109,
-0.02777463011443615,
-0.07742757350206375,
0.0731123834848404,
0.060010962188243866,
-0.071244016289711,
0.10479087382555008,
-0.027384720742702484,
0.04363591969013214,
-0.01865950971841812,
-0.22222693264484406,
0.17793434858322144,
-0.12702549993991852,
-0.008026357740163803,
0.022578489035367966,
-0.072667695581913,
-0.004252216778695583,
-0.0317828506231308,
0.05817944556474686,
0.03592277318239212,
-0.1358446180820465,
-0.026796765625476837,
0.02869626134634018,
-0.03773656114935875,
0.020007450133562088,
0.09535008668899536,
-0.0646081268787384,
-0.08395344018936157,
-0.03908449411392212,
0.14122474193572998,
0.14151176810264587,
-0.00029248982900753617,
-0.09162415564060211,
-0.015167584642767906,
0.1457478404045105,
0.02605263702571392,
-0.3775632977485657,
-0.09773769229650497,
-0.028999974951148033,
-0.05632798746228218,
-0.09354251623153687,
-0.1293414980173111,
0.2309338003396988,
0.010213150642812252,
-0.06122275069355965,
0.007254638709127903,
-0.12365885078907013,
-0.047853272408246994,
0.26369282603263855,
0.05950915068387985,
0.2765624523162842,
-0.0780777558684349,
0.03604063019156456,
-0.08038333058357239,
0.012374652549624443,
0.10077693313360214,
-0.023006621748209,
0.06472385674715042,
-0.012132911942899227,
0.04949255287647247,
-0.053913358598947525,
-0.0080203702673316,
-0.002294649835675955,
0.13434219360351562,
0.09755348414182663,
-0.08129727095365524,
0.07604892551898956,
0.05843017250299454,
-0.015297151170670986,
0.057777199894189835,
0.07407542318105698,
0.09594760835170746,
-0.032530464231967926,
0.020492935553193092,
-0.08481311053037643,
0.058975011110305786,
0.10097707808017731,
-0.004027234856039286,
-0.11328849196434021,
0.03809956833720207,
0.10636353492736816,
0.08290805667638779,
0.28482335805892944,
0.13836351037025452,
0.01267250720411539,
0.0698290467262268,
0.005068878177553415,
-0.1114509254693985,
-0.08333971351385117,
-0.09004037827253342,
-0.04133410379290581,
0.15851417183876038,
-0.14942103624343872,
0.05667712539434433,
0.10516682267189026,
-0.013689064420759678,
0.005632312037050724,
0.11222158372402191,
0.010802607983350754,
-0.02422284334897995,
0.10987148433923721,
-0.056878555566072464,
0.008481521159410477,
0.013690236955881119,
0.07450971752405167,
0.11215405911207199,
0.14556534588336945,
0.11217106878757477,
-0.10375448316335678,
-0.005804500076919794,
0.01657351478934288,
0.07976828515529633,
-0.04446546360850334,
0.06714416295289993,
0.15743322670459747,
-0.03215787559747696,
-0.138727605342865,
0.21057170629501343,
0.011249064467847347,
-0.1731969118118286,
-0.028499605134129524,
-0.06403861194849014,
-0.13248763978481293,
-0.1288985162973404,
-0.01897459290921688,
0.00023963424609974027,
-0.2679016888141632,
-0.09749604016542435,
-0.006916843354701996,
-0.07611414045095444,
0.11934066563844681,
0.010567253455519676,
0.1412506252527237,
-0.029690079391002655,
-0.022911760956048965,
-0.06061264127492905,
-0.030850745737552643,
0.020336559042334557,
0.02369382046163082,
0.08355164527893066,
-0.16720479726791382,
-0.23564881086349487,
0.021508201956748962,
0.15193556249141693,
-0.07209276407957077,
0.022415734827518463,
-0.07476162165403366,
0.08940091729164124,
-0.11445934325456619,
0.07631979137659073,
-0.036056410521268845,
0.020703887566924095,
-0.04431981220841408,
-0.02259823866188526,
-0.08163444697856903,
0.05632396414875984,
-0.09769795835018158,
-0.026146383956074715,
0.006624156143516302,
-0.012985645793378353,
-0.11256127804517746,
-0.037635426968336105,
-0.006668888032436371,
-0.007200654596090317,
0.1301761269569397,
0.07529646158218384,
-0.10662037879228592,
0.08529918640851974,
-0.1470099687576294,
-0.25830286741256714,
0.16286489367485046,
0.045059215277433395,
-0.008547664619982243,
0.06266174465417862,
0.05212802812457085,
0.07052534073591232,
-0.04234246164560318,
-0.032808639109134674,
0.0019850055687129498,
-0.04780755564570427,
-0.02682577818632126,
-0.22721166908740997,
-0.00502537377178669,
-0.025266043841838837,
-0.003250321140512824,
0.07485883682966232,
0.10751103609800339,
0.1212277039885521,
-0.044386837631464005,
-0.014337152242660522,
-0.0418364517390728,
0.004997629206627607,
-0.0036349156871438026,
-0.11485478281974792,
-0.020791243761777878,
-0.035530105233192444,
0.017166854813694954,
-0.09280504286289215,
0.17131182551383972,
-0.018595563247799873,
-0.08511806279420853,
0.021905088797211647,
0.21004226803779602,
-0.13717569410800934,
0.08264046162366867,
0.18533748388290405,
0.0659904032945633,
-0.00015308109868783504,
0.016691768541932106,
0.028482861816883087,
0.11761131137609482,
0.17468033730983734,
0.0025782035663723946,
0.2188120037317276,
0.059162601828575134,
0.13565868139266968,
0.08518169075250626,
-0.021879635751247406,
0.021210359409451485,
0.053378764539957047,
-0.06488477438688278,
0.10345219820737839,
0.0351238027215004,
0.0062081641517579556,
0.1512349545955658,
0.028957610949873924,
-0.00942903570830822,
0.008763180114328861,
-0.0034260598476976156,
-0.09734269231557846,
-0.17360679805278778,
-0.10990148782730103,
-0.1542084515094757,
0.0503755658864975,
-0.04864387586712837,
-0.07913187891244888,
0.22693583369255066,
0.07701683789491653,
-0.03346841782331467,
0.04382824897766113,
-0.08328352123498917,
-0.057806167751550674,
0.14913900196552277,
-0.004003296606242657,
-0.10124521702528,
0.139497309923172,
-0.049460045993328094,
0.067264124751091,
0.04849392920732498,
-0.021852903068065643,
-0.012594071216881275,
0.06799886375665665,
0.167276993393898,
-0.03427550941705704,
-0.027929022908210754,
-0.03472483903169632,
0.018589945510029793,
-0.04289715737104416,
-0.01754000410437584,
-0.02083701081573963,
0.02715718373656273,
0.05858466774225235,
0.21915245056152344,
-0.10494092106819153,
-0.0668976828455925,
-0.1100371778011322,
-0.0402563102543354,
0.03155387192964554,
0.08063114434480667,
-0.03843359276652336,
-0.043459702283144,
-0.0375259667634964,
0.18253526091575623,
0.21079492568969727,
-0.17206116020679474,
-0.00497428746894002,
0.07085427641868591,
0.011233996599912643,
-0.039125148206949234,
0.02116912417113781,
0.17907239496707916,
0.11309102922677994,
-0.05918646231293678,
-0.12954950332641602,
-0.05631009861826897,
-0.03742195665836334,
-0.11555642634630203,
0.017056144773960114,
0.05465805158019066,
-0.08390240371227264,
-0.12759387493133545,
0.10684279352426529,
-0.1427055448293686,
-0.01776047796010971,
0.07276035845279694,
-0.07831970602273941,
-0.11626562476158142,
-0.038331929594278336,
0.08591422438621521,
0.09776626527309418,
0.018956027925014496,
-0.08941759169101715,
0.02279064618051052,
0.03539672866463661,
0.030613506212830544,
-0.16037501394748688,
-0.030044864863157272,
-0.04863125458359718,
-0.015129721723496914,
0.14977023005485535,
0.03496572747826576,
0.020254887640476227,
0.03933752328157425,
0.03579875826835632,
-0.0588880255818367,
0.015069598332047462,
-0.04581454396247864,
-0.08605387806892395,
0.046730171889066696,
0.060006532818078995,
-0.03790431097149849,
-0.02698800526559353,
0.05817560479044914,
-0.17499452829360962,
-0.0263885036110878,
0.07927028089761734,
0.0021388763561844826,
-0.11210069805383682,
0.01410139910876751,
-0.07243452221155167,
0.0479755625128746,
0.019084768369793892,
0.0005086975870653987,
-0.011444268748164177,
-0.04439272731542587,
-0.024083541706204414,
0.016945796087384224,
-0.1152496263384819,
0.026762492954730988,
-0.022127095609903336,
-0.029971862211823463,
-0.22055502235889435,
0.06124623119831085,
0.01929929479956627,
-0.016657106578350067,
-0.04010902717709541,
-0.045878756791353226,
-0.09173838794231415,
0.07076196372509003,
0.11392749845981598,
-0.04051242396235466,
0.0013557482743635774,
0.019910195842385292,
0.059265587478876114,
-0.0027651707641780376,
-0.02537750080227852,
-0.12470041960477829
] |
null | null |
transformers
|
# efficientnet_b0
Implementation of EfficientNet proposed in [EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks](https://arxiv.org/abs/1905.11946)

The basic architecture is similar to MobileNetV2, since it was found by
using [Progressive Neural Architecture
Search](https://arxiv.org/abs/1905.11946).
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):

Then, the architecture is scaled up from `efficientnet_b0` to
`efficientnet_b7` using compound scaling.

``` python
EfficientNet.efficientnet_b0()
EfficientNet.efficientnet_b1()
EfficientNet.efficientnet_b2()
EfficientNet.efficientnet_b3()
EfficientNet.efficientnet_b4()
EfficientNet.efficientnet_b5()
EfficientNet.efficientnet_b6()
EfficientNet.efficientnet_b7()
EfficientNet.efficientnet_b8()
EfficientNet.efficientnet_l2()
```
Examples:
``` python
EfficientNet.efficientnet_b0(activation = nn.SELU)
# change number of classes (default is 1000 )
EfficientNet.efficientnet_b0(n_classes=100)
# pass a different block
EfficientNet.efficientnet_b0(block=...)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = EfficientNet.efficientnet_b0()
# first call .features; this activates the forward hooks and tells the model you'd like to collect the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 24, 56, 56]), torch.Size([1, 40, 28, 28]), torch.Size([1, 80, 14, 14])]
```
|
{}
| null |
glasses/efficientnet_b0
|
[
"transformers",
"pytorch",
"arxiv:1905.11946",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1905.11946"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us
|
# efficientnet_b0
Implementation of EfficientNet proposed in EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks
!image
The basic architecture is similar to MobileNetV2 as was computed by
using Progressive Neural Architecture
Search .
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):
!image
Then, the architecture is scaled up from
[-efficientnet\_b0]{.title-ref} to [-efficientnet\_b7]{.title-ref}
using compound scaling.
!image
Examples:
|
[
"# efficientnet_b0\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n",
"# efficientnet_b0\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
29,
144
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n# efficientnet_b0\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
-0.033114928752183914,
-0.015605971217155457,
-0.0025784901808947325,
0.0167230274528265,
0.038678426295518875,
-0.03705884888768196,
-0.03407173976302147,
0.033724259585142136,
-0.15034173429012299,
0.014691023156046867,
0.08621065318584442,
0.10710324347019196,
-0.03293218836188316,
0.11030368506908417,
0.015585108660161495,
-0.16462908685207367,
-0.0055694784969091415,
0.11996421217918396,
-0.07353699207305908,
0.047367170453071594,
0.12548476457595825,
-0.07966729253530502,
0.096774622797966,
0.06877121329307556,
-0.10949564725160599,
0.05781679227948189,
-0.08478505164384842,
-0.01980690099298954,
0.13614532351493835,
-0.028826482594013214,
0.09736792743206024,
-0.03548577055335045,
-0.033734261989593506,
-0.16243121027946472,
0.0010332274250686169,
0.06682725995779037,
-0.026355864480137825,
0.09936267137527466,
0.06075192987918854,
0.11851672828197479,
0.21016310155391693,
-0.009989170357584953,
-0.0905037447810173,
-0.058579061180353165,
-0.010708516463637352,
-0.09465588629245758,
0.018997637555003166,
0.060661450028419495,
0.031583111733198166,
0.04092903062701225,
0.026422077789902687,
0.1648302972316742,
-0.10567606985569,
0.09552905708551407,
-0.017865417525172234,
-0.32407474517822266,
-0.04601066932082176,
0.0720648244023323,
-0.05026005208492279,
0.01579955592751503,
0.023633962497115135,
0.032677292823791504,
0.061271484941244125,
0.057541098445653915,
0.1451970636844635,
-0.00928537454456091,
-0.1222924143075943,
0.01518709771335125,
-0.10833941400051117,
0.004563125316053629,
0.18194817006587982,
0.05378270521759987,
0.08254684507846832,
-0.07025115936994553,
-0.2151416838169098,
-0.029453380033373833,
0.015146644786000252,
-0.1991647481918335,
-0.011727317236363888,
0.0022259855177253485,
-0.06018131971359253,
-0.08121877163648605,
-0.08128146827220917,
0.00500861182808876,
-0.13220220804214478,
0.09326446801424026,
0.03119853138923645,
-0.010173231363296509,
-0.04739690199494362,
0.133416548371315,
-0.03147970139980316,
-0.07905956357717514,
0.07588917016983032,
-0.061542510986328125,
-0.0053123608231544495,
0.02974277175962925,
0.02288929373025894,
-0.15342585742473602,
0.05021413788199425,
0.03706803172826767,
0.08852452039718628,
-0.06075452268123627,
0.10110388696193695,
0.07749147713184357,
-0.030683819204568863,
0.12723655998706818,
-0.16806872189044952,
-0.19827552139759064,
0.011442835442721844,
0.01186542771756649,
0.05304498225450516,
0.018914008513092995,
-0.07511559128761292,
-0.04684469848871231,
0.11181239038705826,
-0.017270132899284363,
-0.011559338308870792,
0.09445574134588242,
-0.04692371189594269,
-0.11263994872570038,
0.23762260377407074,
-0.022010331973433495,
0.019352681934833527,
-0.030779603868722916,
-0.030563198029994965,
0.0054665603674948215,
0.029138540849089622,
0.0003806450986303389,
-0.1052904725074768,
0.01691949926316738,
-0.08107415586709976,
-0.1450764238834381,
-0.06080010533332825,
-0.0767533928155899,
-0.005827188491821289,
-0.0061124819330871105,
0.003986813127994537,
-0.2116316705942154,
-0.10727058351039886,
0.09453237801790237,
0.0958646759390831,
-0.01468111202120781,
-0.028312675654888153,
-0.036633651703596115,
-0.10997803509235382,
-0.10422706604003906,
-0.0276934877038002,
0.12609036266803741,
-0.042918719351291656,
0.0015480504371225834,
-0.01556325051933527,
0.03099721670150757,
-0.1438286304473877,
0.015909815207123756,
-0.10946222394704819,
0.018430765718221664,
-0.1518549919128418,
0.05069585144519806,
-0.04685074836015701,
-0.007943247444927692,
-0.067202128469944,
-0.023909980431199074,
-0.19656576216220856,
-0.00875874049961567,
0.11821568757295609,
0.08329213410615921,
-0.02899158000946045,
-0.08786704391241074,
-0.022733338177204132,
-0.07625807821750641,
-0.17689761519432068,
0.1554359346628189,
0.008278349414467812,
0.11292339861392975,
0.05868687480688095,
0.19805285334587097,
0.12527517974376678,
-0.03877939283847809,
-0.06523573398590088,
0.025869645178318024,
-0.11233310401439667,
-0.12108258903026581,
-0.07746228575706482,
0.1115993782877922,
0.048072416335344315,
0.042154230177402496,
-0.0025487886741757393,
0.08143147081136703,
-0.05903429538011551,
-0.0680423378944397,
-0.021983450278639793,
-0.06080261617898941,
-0.01320107001811266,
0.012011220678687096,
0.07125309109687805,
0.045437268912792206,
-0.05005975067615509,
-0.18043343722820282,
0.0670291855931282,
-0.0801236554980278,
0.02672315388917923,
-0.0758417546749115,
0.0144143495708704,
-0.09230908751487732,
0.03349047899246216,
-0.07245729863643646,
-0.02776610106229782,
0.11053695529699326,
0.013240672647953033,
-0.0329243540763855,
0.1506654918193817,
0.02563576214015484,
-0.05060221254825592,
-0.10076849162578583,
-0.02102508768439293,
0.018862001597881317,
0.04266824573278427,
-0.10479412972927094,
-0.12487394362688065,
-0.04696483537554741,
-0.023478610441088676,
0.0483156256377697,
-0.11321309953927994,
-0.03263833746314049,
0.0566302090883255,
0.24286139011383057,
0.018837902694940567,
0.03410619497299194,
-0.0771624967455864,
-0.020633982494473457,
-0.09754202514886856,
-0.018192512914538383,
0.0202122051268816,
-0.002929715206846595,
-0.04633043706417084,
0.054171815514564514,
0.07966536283493042,
0.1258450597524643,
0.05643188953399658,
-0.14869779348373413,
0.05201684683561325,
0.04089997336268425,
-0.05267580598592758,
-0.0865587368607521,
0.03445517644286156,
-0.10368268191814423,
0.0857609361410141,
-0.03565013408660889,
0.121464304625988,
-0.07457109540700912,
-0.047613393515348434,
0.02257954142987728,
0.026533398777246475,
0.047572411596775055,
0.03473825380206108,
0.116939015686512,
-0.2993123233318329,
0.037243641912937164,
0.09073645621538162,
-0.09850745648145676,
0.09604647010564804,
0.05317971482872963,
-0.021584121510386467,
0.0007880981429480016,
0.030045686289668083,
-0.01453369576483965,
0.17263180017471313,
-0.1568029522895813,
-0.01061142049729824,
0.040741216391325,
-0.0003407634503673762,
0.1928202509880066,
-0.07993479073047638,
0.0904134139418602,
0.002701449440792203,
0.022823527455329895,
0.2804569602012634,
-0.002027164213359356,
-0.031618643552064896,
0.06323924660682678,
-0.05278777703642845,
-0.26955434679985046,
0.004788790829479694,
-0.038732774555683136,
-0.027357039973139763,
0.14557264745235443,
-0.009075818583369255,
-0.17830690741539001,
-0.07822293043136597,
-0.0027503948658704758,
-0.04519969969987869,
-0.0010580861708149314,
-0.041816841810941696,
0.010494726710021496,
-0.070726677775383,
-0.03676937520503998,
-0.07983054965734482,
0.059767045080661774,
0.05960698798298836,
-0.00514968391507864,
0.05325149744749069,
0.04444460943341255,
-0.10456157475709915,
0.01004305575042963,
-0.1029888316988945,
0.013497188687324524,
0.026341328397393227,
-0.11937889456748962,
0.12819282710552216,
0.15446442365646362,
-0.03205982223153114,
-0.01264783926308155,
0.02909422665834427,
0.10766752064228058,
-0.05184168741106987,
0.07643505930900574,
-0.07792072743177414,
0.054873935878276825,
0.02201557345688343,
0.13043783605098724,
0.004414720926433802,
-0.01561746746301651,
0.032962772995233536,
-0.013499148190021515,
-0.00039514453965239227,
-0.14181770384311676,
-0.18201954662799835,
-0.04680820554494858,
0.020140428096055984,
0.07969145476818085,
0.0622875839471817,
0.09797274321317673,
0.06536036729812622,
-0.008826770819723606,
0.03156285732984543,
0.009273547679185867,
0.055019836872816086,
0.23107479512691498,
0.026164494454860687,
0.07096380740404129,
-0.04127541929483414,
-0.21140283346176147,
0.04752111807465553,
0.09750450402498245,
0.19653360545635223,
-0.05039512366056442,
0.09611399471759796,
0.02601883001625538,
0.11207819730043411,
0.012082039378583431,
0.11071927100419998,
-0.06635630130767822,
-0.006296850740909576,
-0.06546121090650558,
-0.03729432448744774,
0.060707125812768936,
0.07429510354995728,
-0.011734669096767902,
-0.05014875903725624,
0.07619477808475494,
0.08587384223937988,
0.11276070773601532,
0.09524977207183838,
0.10729004442691803,
-0.24882228672504425,
0.05798177048563957,
-0.04422324523329735,
-0.09204041212797165,
-0.05240987241268158,
0.052157726138830185,
0.22196269035339355,
0.006701950915157795,
0.04439517855644226,
-0.04272977262735367,
0.05668448284268379,
-0.08439407497644424,
0.03872521594166756,
0.024524128064513206,
0.08562886714935303,
0.059812095016241074,
0.09365866333246231,
-0.08871760964393616,
0.19558030366897583,
0.004320588894188404,
0.0248879287391901,
-0.07406161725521088,
-0.04018109664320946,
-0.023409921675920486,
0.046584323048591614,
0.02786824479699135,
-0.00948400516062975,
0.256071001291275,
0.12863928079605103,
-0.12244390696287155,
0.0497807115316391,
0.031074268743395805,
0.0518224723637104,
0.07137195765972137,
0.024655383080244064,
-0.07438527047634125,
0.01682325452566147,
0.12391653656959534,
0.023380540311336517,
-0.10108665376901627,
0.007358978036791086,
0.17417123913764954,
-0.12085605412721634,
-0.0567757785320282,
-0.028399758040905,
-0.05956631526350975,
0.26631104946136475,
0.01158389262855053,
-0.0015359226381406188,
-0.008736173622310162,
0.016253702342510223,
0.11870244145393372,
-0.0370144322514534,
0.03772200644016266,
-0.03677321597933769,
0.03955046460032463,
0.009800208732485771,
-0.08473781496286392,
0.08379273861646652,
-0.08969604223966599,
0.003269163193181157,
-0.06840591877698898,
0.03967013210058212,
0.01724393106997013,
-0.013932842761278152,
0.06973428279161453,
-0.018539687618613243,
-0.13989953696727753,
-0.08429951965808868,
-0.11846999078989029,
0.004814630374312401,
-0.13248464465141296,
0.08159264922142029,
-0.059579573571681976,
-0.004786017816513777,
-0.02779233455657959,
0.04872587323188782,
0.1014314740896225,
0.08718013763427734,
-0.07604068517684937,
-0.1052510216832161,
0.11710543185472488,
-0.0026279017329216003,
-0.25137224793434143,
-0.13984648883342743,
0.043041475117206573,
0.03614658862352371,
0.04571503400802612,
-0.01654299721121788,
0.20697782933712006,
0.06595230847597122,
-0.03536354750394821,
-0.010296495631337166,
-0.10151198506355286,
-0.060980796813964844,
0.09683284908533096,
-0.00969575997442007,
0.11750518530607224,
-0.009600861929357052,
0.004844961687922478,
0.02443770319223404,
0.02536303550004959,
0.08067121356725693,
-0.09258367121219635,
0.10746221989393234,
-0.00641079805791378,
-0.05824325978755951,
0.00427553616464138,
-0.03277471661567688,
0.033662017434835434,
0.061620794236660004,
0.026716405525803566,
0.08673124015331268,
-0.062084704637527466,
0.1426752358675003,
0.007814265787601471,
0.054585471749305725,
0.1195843368768692,
0.0471511073410511,
-0.1226804256439209,
-0.027257362380623817,
-0.11647898703813553,
0.07620260119438171,
0.05067383870482445,
-0.07764048129320145,
-0.15905602276325226,
0.02343679778277874,
0.13529638946056366,
0.08805946260690689,
0.2189757525920868,
0.12442117929458618,
-0.010370190255343914,
0.27132296562194824,
0.085035040974617,
-0.1928206831216812,
0.06132735684514046,
-0.023790692910552025,
0.006801766809076071,
0.04628249257802963,
0.024077612906694412,
0.038342732936143875,
0.13892434537410736,
-0.015002097003161907,
-0.034107234328985214,
0.023044677451252937,
-0.08950801193714142,
-0.11776699870824814,
0.15087580680847168,
-0.15203596651554108,
-0.06873906403779984,
-0.06893397867679596,
0.054339539259672165,
0.0049551487900316715,
0.21969158947467804,
0.10378344357013702,
-0.11124810576438904,
0.019261687994003296,
0.04751341789960861,
-0.06982219964265823,
-0.09548920392990112,
0.11372727900743484,
0.12204564362764359,
0.03860562667250633,
-0.03333348035812378,
0.13008993864059448,
0.08069876581430435,
-0.034413665533065796,
-0.07851383835077286,
-0.0386241152882576,
-0.118890181183815,
0.00485576968640089,
-0.0901493951678276,
0.021897118538618088,
-0.014108968898653984,
-0.10988210886716843,
-0.08048678189516068,
0.008324540220201015,
0.03606366738677025,
-0.06267095357179642,
0.12453233450651169,
0.1017729789018631,
-0.08460699021816254,
0.050628941506147385,
-0.09022840857505798,
0.11371805518865585,
0.029824964702129364,
0.10607393085956573,
-0.16444529592990875,
-0.06360837072134018,
0.0002735448069870472,
0.0737866535782814,
-0.079881951212883,
0.009898077696561813,
0.038750629872083664,
0.03348493576049805,
-0.02260650135576725,
-0.09390248358249664,
-0.03709886968135834,
-0.07711176574230194,
0.004343750420957804,
-0.06442852318286896,
-0.13363800942897797,
0.023105459287762642,
-0.037001047283411026,
0.04762816056609154,
0.04176878184080124,
-0.02069397270679474,
-0.03335310146212578,
-0.0020209942013025284,
-0.0222619716078043,
-0.06887131184339523,
0.10023137181997299,
-0.04833393916487694,
-0.020402858033776283,
0.022293973714113235,
-0.0015392819186672568,
-0.0960811898112297,
0.11422748863697052,
0.1364356428384781,
0.08066684007644653,
-0.006526406854391098,
-0.008176454342901707,
0.08050021529197693,
0.025133078917860985,
-0.06142016127705574,
-0.0522615946829319,
-0.05828116461634636,
-0.010179318487644196,
-0.10848104953765869,
-0.032990697771310806,
-0.051951345056295395,
0.02275938168168068,
-0.03744424879550934,
0.07913335412740707,
0.100137859582901,
0.004662823863327503,
-0.12633724510669708,
-0.10150419920682907,
0.0597863495349884,
0.002717657247558236,
-0.12301918119192123,
0.032483842223882675,
0.017619948834180832,
-0.018193062394857407,
-0.03846907988190651,
0.23697854578495026,
-0.046711526811122894,
-0.14907270669937134,
-0.02875014953315258,
-0.08112820982933044,
-0.14100484549999237,
-0.0032876331824809313,
0.09556323289871216,
0.048505786806344986,
0.0020398558117449284,
0.003549930639564991,
0.0009912055684253573,
0.07005295902490616,
0.2613040506839752,
0.06536678224802017,
0.0030139980372041464,
-0.004705286119133234,
0.12002360075712204,
0.0383150577545166,
-0.01986546255648136,
-0.10133861750364304,
0.06257620453834534,
-0.04769804701209068,
0.0901271402835846,
0.014844100922346115,
0.02068507671356201,
0.15948358178138733,
-0.03951392322778702,
0.0036708153784275055,
0.0765436515212059,
-0.03558872640132904,
-0.033625781536102295,
-0.107340969145298,
-0.07893886417150497,
-0.12993454933166504,
-0.038435615599155426,
-0.13257382810115814,
-0.12115027010440826,
0.05880945175886154,
0.04930572584271431,
-0.10371775180101395,
0.2278100848197937,
0.07309484481811523,
-0.05150908604264259,
0.1065991073846817,
-0.013857400044798851,
0.025880051776766777,
-0.02766229957342148,
-0.014345310628414154,
-0.1409282088279724,
0.15269896388053894,
0.02683030068874359,
-0.004684626590460539,
-0.00901646725833416,
0.031167905777692795,
0.03836088255047798,
-0.013567151501774788,
-0.07740870118141174,
-0.06296800822019577,
-0.022943805903196335,
-0.08295492827892303,
-0.014313692227005959,
-0.0877251848578453,
0.02177421934902668,
0.15648214519023895,
-0.06754137575626373,
-0.14919881522655487,
-0.08222787827253342,
0.16760684549808502,
0.022709356620907784,
0.09025710821151733,
0.02384452149271965,
-0.04299303889274597,
-0.13350509107112885,
0.21767018735408783,
0.2237902134656906,
-0.1524917036294937,
-0.026538962498307228,
0.10314448177814484,
-0.0009644318488426507,
-0.027111656963825226,
-0.039854105561971664,
0.04276939854025841,
0.3001727759838104,
-0.017231449484825134,
-0.034774377942085266,
-0.0785990059375763,
0.007817374542355537,
-0.026656921952962875,
-0.06437461078166962,
0.1224995106458664,
-0.02345106191933155,
-0.11476876586675644,
0.02337360568344593,
-0.08313577622175217,
-0.06145745515823364,
-0.11234568804502487,
-0.17254792153835297,
-0.10972029715776443,
-0.027539633214473724,
0.02601981908082962,
-0.05999530479311943,
0.06999412178993225,
-0.08600182831287384,
0.03746529296040535,
-0.06381938606500626,
-0.016906794160604477,
-0.0550396591424942,
0.08640894293785095,
0.021732421591877937,
-0.06831161677837372,
0.12761473655700684,
-0.03290596604347229,
0.04660294950008392,
0.12949997186660767,
-0.04627841338515282,
-0.06164192408323288,
0.03270069882273674,
0.03525794297456741,
-0.044986993074417114,
0.016306553035974503,
-0.09410686790943146,
0.016947178170084953,
0.05380811169743538,
0.024987002834677696,
-0.05244230106472969,
-0.011415394954383373,
0.19651520252227783,
0.052310530096292496,
-0.08463437855243683,
-0.05099029839038849,
-0.04303795471787453,
0.08537645637989044,
0.0958903580904007,
-0.03433205559849739,
0.016659660264849663,
-0.08769945800304413,
0.01938549056649208,
0.08282705396413803,
0.13770872354507446,
0.03175007551908493,
-0.1894073337316513,
-0.041935186833143234,
-0.03042370267212391,
-0.030146606266498566,
-0.003196135861799121,
-0.014655740931630135,
-0.05703458935022354,
-0.028561724349856377,
-0.08137288689613342,
0.054748088121414185,
0.10284264385700226,
-0.005564292427152395,
-0.0020160803105682135,
0.0984329953789711,
-0.13434985280036926,
-0.011027191765606403,
-0.19990964233875275,
-0.12925319373607635
] |
null | null |
transformers
|
# efficientnet_b2
Implementation of EfficientNet proposed in [EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks](https://arxiv.org/abs/1905.11946)

The basic architecture is similar to MobileNetV2, since it was found by
using [Progressive Neural Architecture
Search](https://arxiv.org/abs/1905.11946).
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):

Then, the architecture is scaled up from `efficientnet_b0` to
`efficientnet_b7` using compound scaling.

``` python
EfficientNet.efficientnet_b0()
EfficientNet.efficientnet_b1()
EfficientNet.efficientnet_b2()
EfficientNet.efficientnet_b3()
EfficientNet.efficientnet_b4()
EfficientNet.efficientnet_b5()
EfficientNet.efficientnet_b6()
EfficientNet.efficientnet_b7()
EfficientNet.efficientnet_b8()
EfficientNet.efficientnet_l2()
```
Examples:
``` python
EfficientNet.efficientnet_b0(activation = nn.SELU)
# change number of classes (default is 1000 )
EfficientNet.efficientnet_b0(n_classes=100)
# pass a different block
EfficientNet.efficientnet_b0(block=...)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = EfficientNet.efficientnet_b0()
# first call .features; this activates the forward hooks and tells the model you'd like to collect the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 24, 56, 56]), torch.Size([1, 40, 28, 28]), torch.Size([1, 80, 14, 14])]
```
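For reference, a bare-bones version of the forward-hook mechanism that the `.features` helper relies on can be written with plain PyTorch. Which submodules to hook (every direct child of `model.encoder` below) is an assumption made for illustration, so the shapes you get may differ from the helper's output.
``` python
# Minimal forward-hook sketch, assuming `model.encoder` is an nn.Module whose
# direct children are the stages worth tapping. Illustrative only.
import torch

model = EfficientNet.efficientnet_b2()
model.eval()

features = []
handles = []

def save_output(_module, _inputs, output):
    # keep a detached copy of each hooked stage's output
    features.append(output.detach())

for child in model.encoder.children():
    handles.append(child.register_forward_hook(save_output))

with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))

print([f.shape for f in features])

for handle in handles:  # remove the hooks once you are done
    handle.remove()
```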
|
{}
| null |
glasses/efficientnet_b2
|
[
"transformers",
"pytorch",
"arxiv:1905.11946",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1905.11946"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us
|
# efficientnet_b2
Implementation of EfficientNet proposed in EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks
!image
The basic architecture is similar to MobileNetV2 as was computed by
using Progressive Neural Architecture
Search .
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):
!image
Then, the architecture is scaled up from
[-efficientnet\_b0]{.title-ref} to [-efficientnet\_b7]{.title-ref}
using compound scaling.
!image
Examples:
|
[
"# efficientnet_b2\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n",
"# efficientnet_b2\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
29,
144
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n# efficientnet_b2\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
-0.030356522649526596,
-0.011883880943059921,
-0.0028559232596307993,
0.01743420399725437,
0.03268202766776085,
-0.038581717759370804,
-0.03107329085469246,
0.03603076934814453,
-0.1457444727420807,
0.0117693692445755,
0.08181089907884598,
0.09996844083070755,
-0.030948830768465996,
0.1168336346745491,
0.016107730567455292,
-0.16504402458667755,
-0.00827665813267231,
0.11964063346385956,
-0.07405880838632584,
0.051999349147081375,
0.12559853494167328,
-0.07758817076683044,
0.09076133370399475,
0.06831773370504379,
-0.11337140947580338,
0.05960022658109665,
-0.0779588371515274,
-0.017915643751621246,
0.13814514875411987,
-0.027944931760430336,
0.09941732883453369,
-0.030776681378483772,
-0.030736269429326057,
-0.16496220231056213,
0.0011644602054730058,
0.06139256805181503,
-0.025858934968709946,
0.0997505858540535,
0.06006806716322899,
0.1180356964468956,
0.20964869856834412,
-0.00805944949388504,
-0.09127306193113327,
-0.05933545157313347,
-0.010699410922825336,
-0.09754736721515656,
0.021310754120349884,
0.06998425722122192,
0.03240848332643509,
0.030560167506337166,
0.026921754702925682,
0.16086094081401825,
-0.10764230787754059,
0.094302237033844,
-0.013844339177012444,
-0.3254677653312683,
-0.04189930111169815,
0.06466668099164963,
-0.04576537385582924,
0.019318094477057457,
0.02732512354850769,
0.031496867537498474,
0.058583974838256836,
0.055003561079502106,
0.14280343055725098,
-0.009888426400721073,
-0.1258101761341095,
0.017489440739154816,
-0.10650325566530228,
0.00048222101759165525,
0.1771809458732605,
0.05150952190160751,
0.08155333250761032,
-0.06840589642524719,
-0.2165403813123703,
-0.02387377992272377,
0.0169240552932024,
-0.19757379591464996,
-0.010737019591033459,
0.002488392638042569,
-0.06154170632362366,
-0.0858808383345604,
-0.0828656405210495,
-0.0015756383072584867,
-0.12971185147762299,
0.09515747427940369,
0.02839025668799877,
-0.007655273657292128,
-0.05304902791976929,
0.13745814561843872,
-0.02666563168168068,
-0.07766817510128021,
0.07590018212795258,
-0.06265142560005188,
-0.004025714006274939,
0.0342458076775074,
0.021511834114789963,
-0.14758668839931488,
0.04912370443344116,
0.035478346049785614,
0.09169361740350723,
-0.06431222707033157,
0.09521481394767761,
0.07964584976434708,
-0.02873747982084751,
0.12510496377944946,
-0.162913516163826,
-0.19849662482738495,
0.01689774915575981,
0.012837051413953304,
0.05597546324133873,
0.018985500559210777,
-0.07979989796876907,
-0.051868364214897156,
0.11675168573856354,
-0.014688887633383274,
-0.010929140262305737,
0.08835293352603912,
-0.05380687490105629,
-0.11445258557796478,
0.23300929367542267,
-0.02405662275850773,
0.022202815860509872,
-0.026564201340079308,
-0.03327871114015579,
-0.0047469427809119225,
0.03590675815939903,
-0.0009697076166048646,
-0.10450618714094162,
0.013336531817913055,
-0.07855252176523209,
-0.14565150439739227,
-0.05914859473705292,
-0.0770338922739029,
-0.0050061228685081005,
-0.003454937832430005,
0.006124301813542843,
-0.20973224937915802,
-0.10885356366634369,
0.10039684176445007,
0.09299078583717346,
-0.01292756199836731,
-0.031172512099146843,
-0.03470616042613983,
-0.1052832081913948,
-0.10569951683282852,
-0.03170508146286011,
0.1262444704771042,
-0.03984398394823074,
0.005249409005045891,
-0.015454140491783619,
0.03208233043551445,
-0.14845679700374603,
0.011990093626081944,
-0.10941483825445175,
0.025165967643260956,
-0.14839644730091095,
0.05017447471618652,
-0.04279983788728714,
-0.015975195914506912,
-0.06469572335481644,
-0.02114771492779255,
-0.19121764600276947,
-0.00869724340736866,
0.11872825771570206,
0.08015060424804688,
-0.031433433294296265,
-0.09041143208742142,
-0.014070793986320496,
-0.07842423766851425,
-0.1749250739812851,
0.1515386402606964,
0.0065184179693460464,
0.11422719061374664,
0.060662489384412766,
0.19955192506313324,
0.12133946269750595,
-0.03499465063214302,
-0.06365256011486053,
0.02558092027902603,
-0.11096953600645065,
-0.12069433927536011,
-0.07503696531057358,
0.11695010215044022,
0.050794731825590134,
0.04018131271004677,
-0.004417395684868097,
0.08248166739940643,
-0.05725714564323425,
-0.06867750734090805,
-0.022913841530680656,
-0.059930846095085144,
-0.006113532464951277,
0.00980265624821186,
0.06972425431013107,
0.04449684917926788,
-0.05053042247891426,
-0.16640014946460724,
0.0678630918264389,
-0.08039650321006775,
0.027691641822457314,
-0.07775701582431793,
0.015549158677458763,
-0.09385082125663757,
0.033733077347278595,
-0.07255247980356216,
-0.02417292259633541,
0.11074857413768768,
0.009030237793922424,
-0.032755449414253235,
0.15464061498641968,
0.02656911313533783,
-0.048585399985313416,
-0.10135217756032944,
-0.01913478970527649,
0.011393887922167778,
0.04291088134050369,
-0.10444281250238419,
-0.12676569819450378,
-0.05208868533372879,
-0.024967215955257416,
0.05095226317644119,
-0.11390036344528198,
-0.03242397680878639,
0.06223497539758682,
0.24666179716587067,
0.02044614963233471,
0.03457092493772507,
-0.07852771133184433,
-0.026235731318593025,
-0.09631600975990295,
-0.017602138221263885,
0.021741265431046486,
-0.002682541962713003,
-0.04885413125157356,
0.05418585240840912,
0.0802631825208664,
0.12352238595485687,
0.05689385533332825,
-0.13916538655757904,
0.05724717676639557,
0.03640325739979744,
-0.05541129410266876,
-0.08708477765321732,
0.03675055131316185,
-0.10622227191925049,
0.08247271180152893,
-0.03354000672698021,
0.1214979812502861,
-0.08061721920967102,
-0.04988156259059906,
0.022160859778523445,
0.024458911269903183,
0.048903800547122955,
0.030006902292370796,
0.11367039382457733,
-0.30646634101867676,
0.0359482541680336,
0.09520845115184784,
-0.09307315945625305,
0.10163872689008713,
0.05406726151704788,
-0.024391723796725273,
-0.0036398381926119328,
0.03143889084458351,
-0.016430508345365524,
0.1787189543247223,
-0.15500693023204803,
-0.009724941104650497,
0.04011589288711548,
-0.005170516204088926,
0.19296570122241974,
-0.08191896975040436,
0.09009344130754471,
0.001964408904314041,
0.018948916345834732,
0.2744787931442261,
-0.009054679423570633,
-0.030303483828902245,
0.06555269658565521,
-0.04999924078583717,
-0.2659338414669037,
0.004918653052300215,
-0.04162134975194931,
-0.03092065081000328,
0.14074012637138367,
-0.007528022397309542,
-0.17889444530010223,
-0.07793979346752167,
-0.0006111115217208862,
-0.04259489104151726,
0.0014685139758512378,
-0.04389510303735733,
0.01759733445942402,
-0.06370315700769424,
-0.03529869392514229,
-0.07577451318502426,
0.06295628845691681,
0.061175599694252014,
-0.008737037889659405,
0.058984577655792236,
0.04758363962173462,
-0.10517337173223495,
0.010652489960193634,
-0.10298550128936768,
0.014761100523173809,
0.02374459058046341,
-0.12084263563156128,
0.12242110073566437,
0.15379182994365692,
-0.03416336327791214,
-0.013667015358805656,
0.030078254640102386,
0.10437557101249695,
-0.047417882829904556,
0.07581034302711487,
-0.08028917014598846,
0.04572775214910507,
0.020718269050121307,
0.13291384279727936,
0.002594498684629798,
-0.013027277775108814,
0.03311470150947571,
-0.016980700194835663,
0.0004443640646059066,
-0.14191488921642303,
-0.17972125113010406,
-0.04348956421017647,
0.02135579288005829,
0.07734797149896622,
0.06144067645072937,
0.08920438587665558,
0.06091037392616272,
-0.009603569284081459,
0.031805966049432755,
0.012769936583936214,
0.0537092462182045,
0.22494345903396606,
0.023381100967526436,
0.07087250053882599,
-0.038787394762039185,
-0.20911206305027008,
0.04959946870803833,
0.10625030845403671,
0.19527573883533478,
-0.05204209312796593,
0.09471588581800461,
0.024357393383979797,
0.1097380742430687,
0.016356388106942177,
0.10981277376413345,
-0.06509711593389511,
-0.003044690703973174,
-0.06539253145456314,
-0.0372665598988533,
0.05683211237192154,
0.07645288854837418,
-0.007665828336030245,
-0.044405315071344376,
0.07591143995523453,
0.08031459152698517,
0.11037129163742065,
0.09561333805322647,
0.11174819618463516,
-0.24364742636680603,
0.05665550008416176,
-0.035494886338710785,
-0.09431073814630508,
-0.05386152118444443,
0.048912521451711655,
0.22905774414539337,
0.004085693042725325,
0.04028337448835373,
-0.039855893701314926,
0.05656895413994789,
-0.08536084741353989,
0.03750711679458618,
0.016852138563990593,
0.0753951147198677,
0.061154887080192566,
0.09443136304616928,
-0.08168123662471771,
0.1937013566493988,
0.006462146528065205,
0.022087184712290764,
-0.0723419189453125,
-0.0391976535320282,
-0.02608802728354931,
0.043658506125211716,
0.028492281213402748,
-0.008888693526387215,
0.25596699118614197,
0.12981964647769928,
-0.12502239644527435,
0.05151553452014923,
0.02940468303859234,
0.048297397792339325,
0.0718582421541214,
0.02281046472489834,
-0.07367830723524094,
0.016435915604233742,
0.11910299211740494,
0.025556957349181175,
-0.09676869958639145,
0.007724291644990444,
0.1680733561515808,
-0.1264902502298355,
-0.06051339954137802,
-0.027616344392299652,
-0.04484299570322037,
0.2757539749145508,
0.005494779907166958,
-0.005632334854453802,
-0.005962992087006569,
0.024285661056637764,
0.11701292544603348,
-0.03586756810545921,
0.03665807843208313,
-0.03531908616423607,
0.04389868304133415,
0.009323449805378914,
-0.08680833876132965,
0.08595948666334152,
-0.08596309274435043,
-0.0025013540871441364,
-0.07097692787647247,
0.04724676162004471,
0.01730465702712536,
-0.011672807857394218,
0.07113996148109436,
-0.021714836359024048,
-0.13359121978282928,
-0.08102814853191376,
-0.11938102543354034,
0.007544378284364939,
-0.13468164205551147,
0.07806944102048874,
-0.0624067448079586,
-0.004067753441631794,
-0.026000337675213814,
0.05027000606060028,
0.10568710416555405,
0.09490054100751877,
-0.07892706990242004,
-0.09457039833068848,
0.11524225026369095,
-0.003735798643901944,
-0.248570516705513,
-0.14240683615207672,
0.0448007732629776,
0.04087447747588158,
0.04874980077147484,
-0.015123155899345875,
0.20091941952705383,
0.06393144279718399,
-0.038847606629133224,
-0.012574155814945698,
-0.10425759106874466,
-0.06260982900857925,
0.09268227219581604,
-0.006294553633779287,
0.11626040935516357,
-0.006927702575922012,
0.0012198281474411488,
0.02158896066248417,
0.02383713237941265,
0.08436453342437744,
-0.09726975858211517,
0.10402563959360123,
-0.00897441990673542,
-0.059075985103845596,
0.008010158315300941,
-0.030123567208647728,
0.03441668301820755,
0.05409436300396919,
0.023606428876519203,
0.08403884619474411,
-0.06547096371650696,
0.14867942035198212,
0.008321685716509819,
0.04878467693924904,
0.11752041429281235,
0.051375798881053925,
-0.12533847987651825,
-0.029120024293661118,
-0.11701672524213791,
0.07298512011766434,
0.04879556968808174,
-0.07915283739566803,
-0.15696710348129272,
0.02294880710542202,
0.13260039687156677,
0.09039822965860367,
0.21176674962043762,
0.12379566580057144,
-0.0171434935182333,
0.2791973948478699,
0.08677756041288376,
-0.1838405579328537,
0.06816993653774261,
-0.024320200085639954,
0.005945098120719194,
0.0449797697365284,
0.02246057614684105,
0.043027475476264954,
0.1383262276649475,
-0.013024901039898396,
-0.035331640392541885,
0.023369763046503067,
-0.09288699179887772,
-0.11593768745660782,
0.15279562771320343,
-0.15488338470458984,
-0.0714685469865799,
-0.06971540302038193,
0.05614717677235603,
0.0008877470390871167,
0.2165355086326599,
0.10902195423841476,
-0.1074385792016983,
0.014334173873066902,
0.047453977167606354,
-0.06863155215978622,
-0.10068979859352112,
0.1112200990319252,
0.12293928861618042,
0.038683775812387466,
-0.03154537081718445,
0.12898428738117218,
0.08017788082361221,
-0.027743760496377945,
-0.08072145283222198,
-0.039860982447862625,
-0.11586620658636093,
0.00834218692034483,
-0.08751004189252853,
0.020975319668650627,
-0.009594838134944439,
-0.10888509452342987,
-0.0801469162106514,
0.007859445177018642,
0.03389245271682739,
-0.05888564512133598,
0.12484166771173477,
0.10599662363529205,
-0.08192501962184906,
0.0508703738451004,
-0.08782977610826492,
0.11866197735071182,
0.030539700761437416,
0.10322172194719315,
-0.16320458054542542,
-0.06267301738262177,
-0.001269293250516057,
0.07578157633543015,
-0.08057604730129242,
0.01010422594845295,
0.03685001656413078,
0.0346548892557621,
-0.0321878083050251,
-0.09342597424983978,
-0.034967679530382156,
-0.07727459818124771,
0.002582829911261797,
-0.06576385349035263,
-0.13505810499191284,
0.020435888320207596,
-0.03808324784040451,
0.04760217294096947,
0.03834734484553337,
-0.024265803396701813,
-0.03760111331939697,
-0.004976390860974789,
-0.01797300949692726,
-0.07169631123542786,
0.1055474579334259,
-0.045376986265182495,
-0.014739573001861572,
0.02486172318458557,
0.0017052365001291037,
-0.09733863919973373,
0.11429397761821747,
0.13570845127105713,
0.07913465797901154,
-0.006936521735042334,
-0.0056944070383906364,
0.08114105463027954,
0.02308621071279049,
-0.06161271035671234,
-0.04067578911781311,
-0.05711161717772484,
-0.01399512030184269,
-0.1105322316288948,
-0.03572007268667221,
-0.05364613234996796,
0.022951651364564896,
-0.040956273674964905,
0.08510489016771317,
0.09670307487249374,
0.006607700139284134,
-0.126902773976326,
-0.10251607745885849,
0.05852709338068962,
0.0010153576731681824,
-0.1177097037434578,
0.0349399670958519,
0.0188419409096241,
-0.02316364273428917,
-0.03758930787444115,
0.24005135893821716,
-0.049389634281396866,
-0.1478940099477768,
-0.028475550934672356,
-0.08530326932668686,
-0.13748276233673096,
-0.0004683512379415333,
0.10111377388238907,
0.04518526419997215,
0.0024284319952130318,
0.0038248940836638212,
-0.0009665657416917384,
0.06571513414382935,
0.2557680904865265,
0.06574059277772903,
0.004894030746072531,
-0.0037493312265723944,
0.11976217478513718,
0.04393748193979263,
-0.015049687586724758,
-0.10605889558792114,
0.06386114656925201,
-0.044716015458106995,
0.08452145010232925,
0.012837347574532032,
0.029812969267368317,
0.16248971223831177,
-0.043161921203136444,
0.0003190379065927118,
0.08171543478965759,
-0.036353446543216705,
-0.03470270335674286,
-0.10602273046970367,
-0.07905223220586777,
-0.1266753077507019,
-0.03858460485935211,
-0.13179409503936768,
-0.12084770202636719,
0.05077815800905228,
0.04702889546751976,
-0.10650651156902313,
0.21974453330039978,
0.0774649903178215,
-0.0530662015080452,
0.10427156835794449,
-0.0110723115503788,
0.02797785773873329,
-0.02293044701218605,
-0.010799550451338291,
-0.13744406402111053,
0.15778270363807678,
0.02469821460545063,
-0.003431378398090601,
-0.0073865437880158424,
0.025914769619703293,
0.037550169974565506,
-0.012218943797051907,
-0.07923752069473267,
-0.06510477513074875,
-0.02128073386847973,
-0.08398222178220749,
-0.012715144082903862,
-0.08922512084245682,
0.02236294187605381,
0.15908730030059814,
-0.06726481765508652,
-0.14909252524375916,
-0.07919592410326004,
0.16619741916656494,
0.018025493249297142,
0.09204038977622986,
0.026613160967826843,
-0.04486430808901787,
-0.1358993649482727,
0.21016152203083038,
0.2217760533094406,
-0.1552448868751526,
-0.0246588122099638,
0.10431847721338272,
0.0005736567545682192,
-0.026397405192255974,
-0.03909281641244888,
0.03444381058216095,
0.30373817682266235,
-0.015186778269708157,
-0.036760419607162476,
-0.07632558047771454,
0.007154602557420731,
-0.022034531459212303,
-0.05652327835559845,
0.1165848821401596,
-0.024280240759253502,
-0.11864273995161057,
0.02015461027622223,
-0.07859363406896591,
-0.06255298852920532,
-0.11338881403207779,
-0.1774911880493164,
-0.10969878733158112,
-0.023180123418569565,
0.028407355770468712,
-0.06172093376517296,
0.06976920366287231,
-0.08682851493358612,
0.03658680245280266,
-0.060795024037361145,
-0.015826987102627754,
-0.058765411376953125,
0.09109330922365189,
0.024765480309724808,
-0.07105489820241928,
0.1319429576396942,
-0.033880192786455154,
0.05246169492602348,
0.12702815234661102,
-0.04591568559408188,
-0.0621798075735569,
0.02981833554804325,
0.03617759048938751,
-0.04616836458444595,
0.015038508921861649,
-0.08647071570158005,
0.016055133193731308,
0.053218480199575424,
0.026224328204989433,
-0.04863149672746658,
-0.013540768064558506,
0.19428065419197083,
0.04876350238919258,
-0.08841057866811752,
-0.05335327610373497,
-0.043085888028144836,
0.0818098708987236,
0.09026041626930237,
-0.0385100394487381,
0.01698075421154499,
-0.08892146497964859,
0.01987304538488388,
0.08431880921125412,
0.14485090970993042,
0.03624160587787628,
-0.18471050262451172,
-0.04243452101945877,
-0.027602003887295723,
-0.028769154101610184,
-0.003998524509370327,
-0.01430937834084034,
-0.057762403041124344,
-0.030384402722120285,
-0.08337121456861496,
0.060709014534950256,
0.10262216627597809,
-0.004849528893828392,
-0.0019356397679075599,
0.0911296010017395,
-0.13479578495025635,
-0.014323735609650612,
-0.20119112730026245,
-0.130545973777771
] |
null | null |
transformers
|
# efficientnet_b3
Implementation of EfficientNet proposed in [EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks](https://arxiv.org/abs/1905.11946)

The basic architecture is similar to MobileNetV2, as it was found using
[Progressive Neural Architecture
Search](https://arxiv.org/abs/1905.11946).
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):

Then, the architecture is scaled up from
`efficientnet_b0` to `efficientnet_b7`
using compound scaling.

``` python
EfficientNet.efficientnet_b0()
EfficientNet.efficientnet_b1()
EfficientNet.efficientnet_b2()
EfficientNet.efficientnet_b3()
EfficientNet.efficientnet_b4()
EfficientNet.efficientnet_b5()
EfficientNet.efficientnet_b6()
EfficientNet.efficientnet_b7()
EfficientNet.efficientnet_b8()
EfficientNet.efficientnet_l2()
```
Examples:
``` python
EfficientNet.efficientnet_b0(activation = nn.SELU)
# change the number of classes (default is 1000)
EfficientNet.efficientnet_b0(n_classes=100)
# pass a different block
EfficientNet.efficientnet_b0(block=...)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = EfficientNet.efficientnet_b0()
# first call .features; this registers the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 24, 56, 56]), torch.Size([1, 40, 28, 28]), torch.Size([1, 80, 14, 14])]
```
|
{}
| null |
glasses/efficientnet_b3
|
[
"transformers",
"pytorch",
"arxiv:1905.11946",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1905.11946"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us
|
# efficientnet_b3
Implementation of EfficientNet proposed in EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks
!image
The basic architecture is similar to MobileNetV2, as it was found using
Progressive Neural Architecture
Search.
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):
!image
Then, the architecture is scaled up from
efficientnet\_b0 to efficientnet\_b7
using compound scaling.
!image
Examples:
|
[
"# efficientnet_b3\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n",
"# efficientnet_b3\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
29,
144
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n# efficientnet_b3\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
-0.029966391623020172,
-0.018517551943659782,
-0.0026498865336179733,
0.017939981073141098,
0.03787081316113472,
-0.03755519911646843,
-0.031122809275984764,
0.03494604676961899,
-0.15207627415657043,
0.011634355410933495,
0.08436252921819687,
0.10715970396995544,
-0.032733067870140076,
0.11367399245500565,
0.016989905387163162,
-0.16481377184391022,
-0.003990046214312315,
0.11778246611356735,
-0.07092837989330292,
0.050533756613731384,
0.1256229430437088,
-0.0777399018406868,
0.09506425261497498,
0.0666697546839714,
-0.10764522850513458,
0.06019316241145134,
-0.0852309912443161,
-0.019351862370967865,
0.13510476052761078,
-0.029132883995771408,
0.09629585593938828,
-0.032434266060590744,
-0.03282742202281952,
-0.16402573883533478,
0.0009890240617096424,
0.06331700086593628,
-0.027291251346468925,
0.09778395295143127,
0.06283175200223923,
0.11745863407850266,
0.20813943445682526,
-0.008245845325291157,
-0.08778829127550125,
-0.05843834578990936,
-0.011466694064438343,
-0.08918420970439911,
0.017476171255111694,
0.06530606746673584,
0.0346892774105072,
0.03876583278179169,
0.026911988854408264,
0.16476279497146606,
-0.110178641974926,
0.09617576003074646,
-0.010014200583100319,
-0.324764609336853,
-0.0456569641828537,
0.07486691325902939,
-0.04680401831865311,
0.019009102135896683,
0.027272241190075874,
0.032722990959882736,
0.05976327136158943,
0.0599709190428257,
0.14550204575061798,
-0.009805848821997643,
-0.12051235139369965,
0.016035646200180054,
-0.1101587787270546,
0.002588821342214942,
0.18287533521652222,
0.05348379164934158,
0.08176497370004654,
-0.07295175641775131,
-0.21141313016414642,
-0.029995497316122055,
0.016905443742871284,
-0.19927476346492767,
-0.01208452694118023,
-0.0002654205309227109,
-0.062123917043209076,
-0.08299281448125839,
-0.08224813640117645,
0.003146543400362134,
-0.12823611497879028,
0.09519322961568832,
0.031222214922308922,
-0.009927564300596714,
-0.0466756671667099,
0.13549599051475525,
-0.025136301293969154,
-0.07618148624897003,
0.07536409050226212,
-0.0657186433672905,
-0.00448062177747488,
0.029800154268741608,
0.02230590023100376,
-0.1509290337562561,
0.048822127282619476,
0.037934910506010056,
0.08883354812860489,
-0.059542953968048096,
0.10165125131607056,
0.07753200829029083,
-0.02806886099278927,
0.12459851801395416,
-0.16799437999725342,
-0.19658297300338745,
0.012150597758591175,
0.01563986949622631,
0.05488401651382446,
0.019131142646074295,
-0.07598841190338135,
-0.04509097710251808,
0.11495102941989899,
-0.012185247614979744,
-0.013351204805076122,
0.0940210148692131,
-0.04984401911497116,
-0.11641738563776016,
0.23959940671920776,
-0.023366425186395645,
0.017114218324422836,
-0.029482057318091393,
-0.031018933281302452,
0.00482453778386116,
0.031269896775484085,
-0.0006476357812061906,
-0.10432147234678268,
0.015362497419118881,
-0.07876832783222198,
-0.14491349458694458,
-0.05884344130754471,
-0.07582545280456543,
-0.006434421055018902,
-0.004810187965631485,
0.0036751641891896725,
-0.21197059750556946,
-0.10902737826108932,
0.09668558090925217,
0.09576427936553955,
-0.015496009960770607,
-0.029818739742040634,
-0.03527936711907387,
-0.10848736017942429,
-0.10604380071163177,
-0.02990005537867546,
0.12608784437179565,
-0.04258698225021362,
0.0013218746753409505,
-0.014206561259925365,
0.030387545004487038,
-0.1454918384552002,
0.014116792008280754,
-0.10838213562965393,
0.020571336150169373,
-0.15297062695026398,
0.049359116703271866,
-0.04745516926050186,
-0.010132388211786747,
-0.06659843027591705,
-0.02562323585152626,
-0.19155704975128174,
-0.009100104682147503,
0.11745487153530121,
0.08086875081062317,
-0.027419408783316612,
-0.08680728822946548,
-0.026196684688329697,
-0.07658488303422928,
-0.17722290754318237,
0.15346203744411469,
0.009272326715290546,
0.10947498679161072,
0.05895882472395897,
0.1964506357908249,
0.12185792624950409,
-0.039372481405735016,
-0.06512346863746643,
0.024809066206216812,
-0.11097070574760437,
-0.12172583490610123,
-0.0757058635354042,
0.11393462121486664,
0.048774413764476776,
0.04293085262179375,
-0.007227105088531971,
0.08194659650325775,
-0.05800817906856537,
-0.06815317273139954,
-0.020046692341566086,
-0.06043286621570587,
-0.011509409174323082,
0.009737454354763031,
0.0703183189034462,
0.04261898621916771,
-0.049492496997117996,
-0.1773584634065628,
0.06698039919137955,
-0.08220376074314117,
0.027467967942357063,
-0.0763062834739685,
0.0182788223028183,
-0.09308519959449768,
0.032762277871370316,
-0.07119137793779373,
-0.030955826863646507,
0.11050810664892197,
0.015683986246585846,
-0.03427708148956299,
0.1511407196521759,
0.02585519105195999,
-0.049428340047597885,
-0.1028943806886673,
-0.022325269877910614,
0.016033310443162918,
0.04316249489784241,
-0.10213851183652878,
-0.12442618608474731,
-0.05154310166835785,
-0.02546423301100731,
0.04449667036533356,
-0.10814719647169113,
-0.03341514989733696,
0.05708012357354164,
0.24286778271198273,
0.017990082502365112,
0.033609483391046524,
-0.07524917274713516,
-0.0223136804997921,
-0.09694734215736389,
-0.018089167773723602,
0.022217951714992523,
-0.003481846069917083,
-0.04600351303815842,
0.05400770157575607,
0.07836107909679413,
0.12694406509399414,
0.05614223703742027,
-0.14383606612682343,
0.05224934220314026,
0.036145661026239395,
-0.05493415147066116,
-0.08610975742340088,
0.036630865186452866,
-0.10612279921770096,
0.08432693034410477,
-0.0343875028192997,
0.12036203593015671,
-0.07723349332809448,
-0.05010693892836571,
0.020996632054448128,
0.027365287765860558,
0.044711634516716,
0.030650410801172256,
0.11864541471004486,
-0.30018851161003113,
0.03530287370085716,
0.0946282297372818,
-0.09357425570487976,
0.09736029803752899,
0.055002499371767044,
-0.021547770127654076,
0.0012258816277608275,
0.03222612291574478,
-0.01600201055407524,
0.1737043559551239,
-0.1524232178926468,
-0.010600066743791103,
0.03847517818212509,
0.0011011537862941623,
0.1938924342393875,
-0.07765444368124008,
0.08800305426120758,
0.0012847925536334515,
0.020491674542427063,
0.2831457257270813,
-0.0033493938390165567,
-0.03199033811688423,
0.06301338970661163,
-0.05006088688969612,
-0.27101877331733704,
0.006642778869718313,
-0.040072619915008545,
-0.02960123121738434,
0.14419913291931152,
-0.00772837782278657,
-0.1810794323682785,
-0.07736292481422424,
-0.007894898764789104,
-0.04888765141367912,
-0.0018602355848997831,
-0.04249131307005882,
0.01171244028955698,
-0.06997796893119812,
-0.039761241525411606,
-0.07813722640275955,
0.058688368648290634,
0.05905129760503769,
-0.003932158928364515,
0.054316215217113495,
0.04533321410417557,
-0.1050974428653717,
0.009416286833584309,
-0.10356900840997696,
0.0115771209821105,
0.02565106935799122,
-0.12203848361968994,
0.12645049393177032,
0.15185989439487457,
-0.028583114966750145,
-0.012441874481737614,
0.02722485549747944,
0.10897397994995117,
-0.04722024127840996,
0.07335353642702103,
-0.07830987870693207,
0.05342629924416542,
0.020902536809444427,
0.13185672461986542,
0.0034083120990544558,
-0.016810551285743713,
0.03508708253502846,
-0.013308740220963955,
-0.00014972846838645637,
-0.1423262059688568,
-0.18175527453422546,
-0.045402463525533676,
0.02060728333890438,
0.07718150317668915,
0.06388645619153976,
0.0944155752658844,
0.06348437815904617,
-0.008228258229792118,
0.02665138430893421,
0.012738565914332867,
0.05562230572104454,
0.2288876622915268,
0.02463681995868683,
0.07229208201169968,
-0.03911839425563812,
-0.2096349000930786,
0.04867083951830864,
0.10395847260951996,
0.1955592930316925,
-0.05132363364100456,
0.09077576547861099,
0.028728870674967766,
0.10962442308664322,
0.011516900733113289,
0.11233095824718475,
-0.06553415209054947,
-0.00686260312795639,
-0.06647792458534241,
-0.03726496174931526,
0.06181315332651138,
0.07168593257665634,
-0.008545992895960808,
-0.04749809950590134,
0.07470888644456863,
0.08422372490167618,
0.11323851346969604,
0.0988006666302681,
0.10690650343894958,
-0.24773743748664856,
0.06126687675714493,
-0.04007719084620476,
-0.09320927411317825,
-0.048097021877765656,
0.05186948552727699,
0.2192273586988449,
0.0022774741519242525,
0.04447481036186218,
-0.041997164487838745,
0.05434529855847359,
-0.08440600335597992,
0.03739994764328003,
0.024780485779047012,
0.08239179104566574,
0.060692645609378815,
0.09355608373880386,
-0.08724667876958847,
0.19358481466770172,
0.004131191875785589,
0.024434903636574745,
-0.07230975478887558,
-0.03994116187095642,
-0.024116480723023415,
0.04467420652508736,
0.029927460476756096,
-0.010007169097661972,
0.26052871346473694,
0.12558095157146454,
-0.12572580575942993,
0.05127599835395813,
0.030852824449539185,
0.05240532010793686,
0.0723380446434021,
0.0238622035831213,
-0.07409534603357315,
0.015591656789183617,
0.1251676380634308,
0.02138013020157814,
-0.10078278183937073,
0.008419947698712349,
0.16821838915348053,
-0.12485858798027039,
-0.05794638395309448,
-0.030968226492404938,
-0.06099396571516991,
0.27326205372810364,
0.01717350259423256,
-0.004263146314769983,
-0.008689912967383862,
0.01454838365316391,
0.11786523461341858,
-0.035022079944610596,
0.03586754575371742,
-0.03670081868767738,
0.03834608197212219,
0.010051388293504715,
-0.08769015222787857,
0.08395440876483917,
-0.09118855744600296,
0.0008632597746327519,
-0.06765897572040558,
0.04434261843562126,
0.016954312101006508,
-0.013113456778228283,
0.07000065594911575,
-0.021024100482463837,
-0.13387054204940796,
-0.08441059291362762,
-0.11887460947036743,
0.006091990042477846,
-0.13297748565673828,
0.0813409686088562,
-0.06140844523906708,
-0.003169267438352108,
-0.028466325253248215,
0.04934849217534065,
0.10444587469100952,
0.0907176062464714,
-0.07512093335390091,
-0.100400410592556,
0.11543462425470352,
-0.0011295214062556624,
-0.2492954283952713,
-0.1419665366411209,
0.04394631087779999,
0.0357617624104023,
0.05009346082806587,
-0.014179314486682415,
0.20418404042720795,
0.06680016964673996,
-0.03649912774562836,
-0.014475181698799133,
-0.10402099788188934,
-0.060405269265174866,
0.09343535453081131,
-0.002820670371875167,
0.11294553428888321,
-0.009887410327792168,
0.005168503616005182,
0.02522011660039425,
0.02332918345928192,
0.0815950483083725,
-0.08936246484518051,
0.10887141525745392,
-0.00862407311797142,
-0.06287243962287903,
0.0038077582139521837,
-0.032971713691949844,
0.032859593629837036,
0.05734149366617203,
0.024573149159550667,
0.08424020558595657,
-0.0600527785718441,
0.1446198672056198,
0.007314704824239016,
0.05498187243938446,
0.11634648591279984,
0.04833703488111496,
-0.12293869256973267,
-0.02851969003677368,
-0.11924757808446884,
0.07248678058385849,
0.04905788600444794,
-0.07874254137277603,
-0.1576012372970581,
0.025340918451547623,
0.13483932614326477,
0.0891219824552536,
0.21542224287986755,
0.12322002649307251,
-0.013563957996666431,
0.27171656489372253,
0.08952386677265167,
-0.19194145500659943,
0.05984865501523018,
-0.0265438761562109,
0.00558775570243597,
0.04305262863636017,
0.029232801869511604,
0.0384424664080143,
0.1379697173833847,
-0.018589608371257782,
-0.03366994485259056,
0.024175986647605896,
-0.09147366881370544,
-0.11602013558149338,
0.15327899158000946,
-0.15384994447231293,
-0.0649794414639473,
-0.06944704055786133,
0.05437749624252319,
0.0012117738369852304,
0.22136515378952026,
0.10608421266078949,
-0.10720137506723404,
0.01612788997590542,
0.047874514013528824,
-0.06774125248193741,
-0.09700173139572144,
0.11367304623126984,
0.12127352505922318,
0.037827953696250916,
-0.032549019902944565,
0.13063572347164154,
0.07716969400644302,
-0.02764626406133175,
-0.07838781923055649,
-0.03430619463324547,
-0.11910982429981232,
0.006258034612983465,
-0.08961863070726395,
0.018246112391352654,
-0.01279996894299984,
-0.10723625868558884,
-0.08308552205562592,
0.007003485225141048,
0.03658740222454071,
-0.06360305100679398,
0.12594065070152283,
0.1013541966676712,
-0.08380354940891266,
0.04981396719813347,
-0.0874624103307724,
0.11340407282114029,
0.027817828580737114,
0.10537691414356232,
-0.16620799899101257,
-0.06881844997406006,
0.0012127859517931938,
0.07524862885475159,
-0.07976013422012329,
0.009553061798214912,
0.0362391322851181,
0.034215591847896576,
-0.026239408180117607,
-0.0974334254860878,
-0.03812703862786293,
-0.07648146897554398,
0.002733765635639429,
-0.06177273020148277,
-0.13518022000789642,
0.02338862232863903,
-0.0369720421731472,
0.04712939262390137,
0.04104461893439293,
-0.022292425855994225,
-0.03442632406949997,
-0.002100025536492467,
-0.02293497882783413,
-0.07117632776498795,
0.10332592576742172,
-0.04960126802325249,
-0.01867341808974743,
0.024367181584239006,
-0.0013203911948949099,
-0.09865453839302063,
0.11490140855312347,
0.13411425054073334,
0.07904328405857086,
-0.008173166774213314,
-0.00887296348810196,
0.08315543085336685,
0.024768859148025513,
-0.059497132897377014,
-0.04997941106557846,
-0.0578816793859005,
-0.01103775855153799,
-0.11006730049848557,
-0.03319431096315384,
-0.05145556107163429,
0.023435749113559723,
-0.03759285435080528,
0.08202581107616425,
0.1008087769150734,
0.0037587403785437346,
-0.12597985565662384,
-0.10078350454568863,
0.058527737855911255,
0.0016311021754518151,
-0.11985600739717484,
0.030511584132909775,
0.020324116572737694,
-0.020289039239287376,
-0.03839287534356117,
0.24020956456661224,
-0.04899356886744499,
-0.14857585728168488,
-0.027854040265083313,
-0.07980065792798996,
-0.13417568802833557,
-0.0028835588600486517,
0.09450183808803558,
0.04527958482503891,
0.0019210866885259748,
0.006148973945528269,
-0.00012301304377615452,
0.0685151144862175,
0.2548173666000366,
0.0688302293419838,
0.0022807009518146515,
-0.007155134342610836,
0.12050821632146835,
0.04213239997625351,
-0.018121546134352684,
-0.09909569472074509,
0.05861556529998779,
-0.04374609515070915,
0.08832455426454544,
0.013615179806947708,
0.02055468037724495,
0.15976473689079285,
-0.04130787029862404,
0.0008283894858323038,
0.07814821600914001,
-0.03552143648266792,
-0.032923322170972824,
-0.10796109586954117,
-0.07735425233840942,
-0.12946517765522003,
-0.037043068557977676,
-0.13395346701145172,
-0.12354418635368347,
0.05780278891324997,
0.05163034796714783,
-0.10321836918592453,
0.22552746534347534,
0.07728411257266998,
-0.050891585648059845,
0.10505253076553345,
-0.015061930753290653,
0.026913221925497055,
-0.023959219455718994,
-0.011334666050970554,
-0.1418486088514328,
0.15370512008666992,
0.026647314429283142,
-0.0051162815652787685,
-0.01123642735183239,
0.02853717841207981,
0.03698915243148804,
-0.014529705047607422,
-0.07817541807889938,
-0.06355109810829163,
-0.02356579154729843,
-0.0771423950791359,
-0.013330419547855854,
-0.0888831838965416,
0.022932101041078568,
0.1545553207397461,
-0.066379114985466,
-0.14862006902694702,
-0.08192913979291916,
0.16841678321361542,
0.020939605310559273,
0.09079746901988983,
0.025233086198568344,
-0.04200680926442146,
-0.13572748005390167,
0.2166474610567093,
0.2269304394721985,
-0.1542128473520279,
-0.025468355044722557,
0.10364893823862076,
0.00011117979738628492,
-0.026944922283291817,
-0.03778068721294403,
0.04045606777071953,
0.2999686002731323,
-0.017578456550836563,
-0.03426172211766243,
-0.07715397328138351,
0.0075816004537045956,
-0.022793849930167198,
-0.06595467031002045,
0.1232529878616333,
-0.021790001541376114,
-0.11515691876411438,
0.023263365030288696,
-0.08194756507873535,
-0.05311277508735657,
-0.11730700731277466,
-0.1721322238445282,
-0.10980654507875443,
-0.027435701340436935,
0.024088412523269653,
-0.061348721385002136,
0.07147856801748276,
-0.08511550724506378,
0.03731899335980415,
-0.06423049420118332,
-0.017076538875699043,
-0.05732958763837814,
0.08539456129074097,
0.026465825736522675,
-0.06322766095399857,
0.1297496110200882,
-0.032309092581272125,
0.049773361533880234,
0.12857069075107574,
-0.0482034906744957,
-0.06228337064385414,
0.03405341878533363,
0.03615075722336769,
-0.04406554996967316,
0.016752133145928383,
-0.09243129938840866,
0.017179148271679878,
0.05633174255490303,
0.023234179243445396,
-0.048492707312107086,
-0.010278327390551567,
0.19613154232501984,
0.05273323133587837,
-0.08562851697206497,
-0.04963144659996033,
-0.04447149485349655,
0.08535847067832947,
0.09505713731050491,
-0.036568090319633484,
0.01603204570710659,
-0.08888866752386093,
0.02089628018438816,
0.08283327519893646,
0.13777311146259308,
0.033056292682886124,
-0.1904941350221634,
-0.04246451333165169,
-0.03159359097480774,
-0.030801041051745415,
-0.006107655819505453,
-0.015576153993606567,
-0.05646464601159096,
-0.028681105002760887,
-0.08539919555187225,
0.05737270787358284,
0.10107872635126114,
-0.003928616177290678,
-0.0017726797377690673,
0.10446605831384659,
-0.13698500394821167,
-0.01335693709552288,
-0.2032170295715332,
-0.1284758746623993
] |
null | null |
transformers
|
# efficientnet_b6
Implementation of EfficientNet proposed in [EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks](https://arxiv.org/abs/1905.11946)

The basic architecture is similar to MobileNetV2, as it was found using
[Progressive Neural Architecture
Search](https://arxiv.org/abs/1905.11946).
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):

Then, the architecture is scaled up from
`efficientnet_b0` to `efficientnet_b7`
using compound scaling.

``` python
EfficientNet.efficientnet_b0()
EfficientNet.efficientnet_b1()
EfficientNet.efficientnet_b2()
EfficientNet.efficientnet_b3()
EfficientNet.efficientnet_b4()
EfficientNet.efficientnet_b5()
EfficientNet.efficientnet_b6()
EfficientNet.efficientnet_b7()
EfficientNet.efficientnet_b8()
EfficientNet.efficientnet_l2()
```
Examples:
``` python
EfficientNet.efficientnet_b0(activation = nn.SELU)
# change the number of classes (default is 1000)
EfficientNet.efficientnet_b0(n_classes=100)
# pass a different block
EfficientNet.efficientnet_b0(block=...)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = EfficientNet.efficientnet_b0()
# first call .features; this registers the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 24, 56, 56]), torch.Size([1, 40, 28, 28]), torch.Size([1, 80, 14, 14])]
```
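As a minimal usage sketch (the import path below is an assumption about the glasses package layout, and the model is randomly initialized unless you load weights yourself):
``` python
import torch
from glasses.models import EfficientNet  # assumed import path

model = EfficientNet.efficientnet_b6().eval()  # weights are randomly initialized here
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))  # (batch, n_classes)
probs = logits.softmax(dim=-1)
print(logits.shape, probs.topk(5).indices)  # top-5 class indices
```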
|
{}
| null |
glasses/efficientnet_b6
|
[
"transformers",
"pytorch",
"arxiv:1905.11946",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1905.11946"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us
|
# efficientnet_b6
Implementation of EfficientNet proposed in EfficientNet: Rethinking
Model Scaling for Convolutional Neural
Networks
!image
The basic architecture is similar to MobileNetV2, as it was found using
Progressive Neural Architecture
Search.
The following table shows the basic architecture
(EfficientNet-efficientnet\_b0):
!image
Then, the architecture is scaled up from
efficientnet\_b0 to efficientnet\_b7
using compound scaling.
!image
Examples:
|
[
"# efficientnet_b6\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n",
"# efficientnet_b6\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
29,
144
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1905.11946 #endpoints_compatible #region-us \n# efficientnet_b6\nImplementation of EfficientNet proposed in EfficientNet: Rethinking\nModel Scaling for Convolutional Neural\nNetworks\n\n !image\n\n The basic architecture is similar to MobileNetV2 as was computed by\n using Progressive Neural Architecture\n Search .\n\n The following table shows the basic architecture\n (EfficientNet-efficientnet\\_b0):\n\n !image\n\n Then, the architecture is scaled up from\n [-efficientnet\\_b0]{.title-ref} to [-efficientnet\\_b7]{.title-ref}\n using compound scaling.\n\n !image\n\n \n\n Examples:"
] |
[
-0.03075200878083706,
-0.016268230974674225,
-0.002567846328020096,
0.01931995339691639,
0.03952009975910187,
-0.03651915863156319,
-0.030899669975042343,
0.034680042415857315,
-0.143202006816864,
0.014398453757166862,
0.08386345952749252,
0.10860269516706467,
-0.0323331356048584,
0.10617953538894653,
0.01689203269779682,
-0.16353777050971985,
-0.004095837473869324,
0.11903314292430878,
-0.07355963438749313,
0.04868718236684799,
0.12311618775129318,
-0.079684317111969,
0.09596332162618637,
0.06633547693490982,
-0.11293445527553558,
0.060837723314762115,
-0.08368240296840668,
-0.01722564734518528,
0.13566458225250244,
-0.030611345544457436,
0.09700468182563782,
-0.03323153033852577,
-0.03286941349506378,
-0.15890595316886902,
0.0009613099391572177,
0.0664728507399559,
-0.026904655620455742,
0.09953419864177704,
0.06339409947395325,
0.1211257204413414,
0.21145418286323547,
-0.0055288150906562805,
-0.09201423823833466,
-0.05697617307305336,
-0.011166860349476337,
-0.0918976441025734,
0.017422396689653397,
0.05770162120461464,
0.03437945619225502,
0.0429423451423645,
0.025110255926847458,
0.16426913440227509,
-0.10946662724018097,
0.09410940110683441,
-0.018209367990493774,
-0.3261893391609192,
-0.04513992369174957,
0.07392995059490204,
-0.050339121371507645,
0.015892455354332924,
0.02091604471206665,
0.030431494116783142,
0.06089070439338684,
0.05854809656739235,
0.13834527134895325,
-0.008736166171729565,
-0.12530459463596344,
0.013387596234679222,
-0.10838677734136581,
0.0039039149414747953,
0.18137703835964203,
0.05595766380429268,
0.08315039426088333,
-0.06810890883207321,
-0.21512103080749512,
-0.0241809394210577,
0.016487980261445045,
-0.1986044943332672,
-0.011673113331198692,
0.0013571236049756408,
-0.061276666820049286,
-0.07976429164409637,
-0.08266211301088333,
0.001958887791261077,
-0.13104428350925446,
0.09228824824094772,
0.03190264105796814,
-0.009160840883851051,
-0.04676768556237221,
0.13291719555854797,
-0.028474656865000725,
-0.07834398746490479,
0.07518038153648376,
-0.06145813688635826,
-0.005027241073548794,
0.03237561136484146,
0.023229707032442093,
-0.15008169412612915,
0.050071824342012405,
0.030416464433073997,
0.08655218034982681,
-0.057018935680389404,
0.10110323131084442,
0.07727242261171341,
-0.033201657235622406,
0.12325583398342133,
-0.1674189269542694,
-0.191090926527977,
0.007827727124094963,
0.013307822868227959,
0.05156093090772629,
0.018774600699543953,
-0.07598963379859924,
-0.04627036675810814,
0.11200970411300659,
-0.01849397085607052,
-0.013060378842055798,
0.09514426440000534,
-0.04871341586112976,
-0.1128903403878212,
0.24033033847808838,
-0.022157030180096626,
0.018763354048132896,
-0.0287792831659317,
-0.03165852651000023,
0.0024471175856888294,
0.028387993574142456,
-0.0015052477829158306,
-0.10600791871547699,
0.015695955604314804,
-0.08232371509075165,
-0.1460375338792801,
-0.06078404188156128,
-0.07584136724472046,
-0.005499106366187334,
-0.008358445949852467,
0.00468065170571208,
-0.2091146558523178,
-0.1058872640132904,
0.09423226863145828,
0.09407663345336914,
-0.016409127041697502,
-0.028610585257411003,
-0.03702746704220772,
-0.11255321651697159,
-0.1036120057106018,
-0.0268696416169405,
0.12872374057769775,
-0.04225417971611023,
0.0014907498843967915,
-0.012541447766125202,
0.03166969120502472,
-0.14507852494716644,
0.01541832648217678,
-0.11104625463485718,
0.016966868191957474,
-0.1500684767961502,
0.04871424287557602,
-0.049458153545856476,
-0.010466129519045353,
-0.06920745223760605,
-0.024500971660017967,
-0.19653542339801788,
-0.009955210611224174,
0.11796733736991882,
0.08473338931798935,
-0.025984955951571465,
-0.08658984303474426,
-0.0236013513058424,
-0.07831387966871262,
-0.17561079561710358,
0.15349499881267548,
0.00951063260436058,
0.10891113430261612,
0.0567229762673378,
0.20068615674972534,
0.12500889599323273,
-0.03658106178045273,
-0.07160377502441406,
0.02445637248456478,
-0.11082412302494049,
-0.12240047752857208,
-0.07673074305057526,
0.11297928541898727,
0.05091819539666176,
0.042648423463106155,
-0.0018170690163969994,
0.07896839082241058,
-0.05959760397672653,
-0.0655408650636673,
-0.02120361663401127,
-0.058429233729839325,
-0.010510074906051159,
0.014244742691516876,
0.07061100006103516,
0.043455205857753754,
-0.05011295527219772,
-0.18571150302886963,
0.0666293054819107,
-0.07927238196134567,
0.02487735077738762,
-0.07788964360952377,
0.016760513186454773,
-0.09671380370855331,
0.03434966132044792,
-0.07245248556137085,
-0.032651420682668686,
0.11164737492799759,
0.017251549288630486,
-0.03193223103880882,
0.1543121635913849,
0.02748824842274189,
-0.05286954343318939,
-0.1019546389579773,
-0.024295352399349213,
0.016792921349406242,
0.04086652398109436,
-0.1051705926656723,
-0.12642014026641846,
-0.049157217144966125,
-0.025546418502926826,
0.044452328234910965,
-0.11189919710159302,
-0.033976633101701736,
0.061004817485809326,
0.24265287816524506,
0.018131492659449577,
0.03323861584067345,
-0.07597624510526657,
-0.02288963831961155,
-0.09743606299161911,
-0.017323818057775497,
0.02139226533472538,
-0.00474193599075079,
-0.04362152889370918,
0.05516798049211502,
0.08030948042869568,
0.1257266402244568,
0.05695101246237755,
-0.14295968413352966,
0.05327695235610008,
0.04188162460923195,
-0.05340263620018959,
-0.08641528338193893,
0.036189381033182144,
-0.10193947702646255,
0.08788247406482697,
-0.036057040095329285,
0.11831309646368027,
-0.07500796020030975,
-0.04666106402873993,
0.02473514713346958,
0.027713647112250328,
0.04736137390136719,
0.03579716384410858,
0.11884894222021103,
-0.30257269740104675,
0.038130298256874084,
0.09543177485466003,
-0.09318958222866058,
0.09443666785955429,
0.05336790531873703,
-0.022819623351097107,
0.0030344293918460608,
0.03197367861866951,
-0.016021491959691048,
0.1721828430891037,
-0.15479417145252228,
-0.008135205134749413,
0.039143145084381104,
0.0016447837697342038,
0.19233191013336182,
-0.08050365746021271,
0.09011710435152054,
-0.0005605294136330485,
0.02267099730670452,
0.27795758843421936,
-0.0031534177251160145,
-0.03276938199996948,
0.064254991710186,
-0.05185341089963913,
-0.2715589702129364,
0.004115331918001175,
-0.04014435037970543,
-0.02688644453883171,
0.1474073976278305,
-0.006096323486417532,
-0.1741296648979187,
-0.07667258381843567,
-0.004676239565014839,
-0.04598447307944298,
-0.0006343105924315751,
-0.042183928191661835,
0.0050864978693425655,
-0.07057487219572067,
-0.036635737866163254,
-0.08066082000732422,
0.05949411913752556,
0.05799689143896103,
-0.004067194648087025,
0.05286349356174469,
0.04529130086302757,
-0.10464571416378021,
0.010040603578090668,
-0.1053728237748146,
0.008128917776048183,
0.027156811207532883,
-0.11914891004562378,
0.12598466873168945,
0.1519782692193985,
-0.030853331089019775,
-0.013480276800692081,
0.029829340055584908,
0.10582457482814789,
-0.0515705905854702,
0.07515168935060501,
-0.07661419361829758,
0.05680442601442337,
0.021544724702835083,
0.13216575980186462,
0.0029211428482085466,
-0.015671193599700928,
0.033053938299417496,
-0.012159238569438457,
0.0004122159443795681,
-0.1460714042186737,
-0.18129393458366394,
-0.04682056978344917,
0.01983504369854927,
0.079520583152771,
0.0636727511882782,
0.092635378241539,
0.06632699817419052,
-0.008270302787423134,
0.026883432641625404,
0.00859636627137661,
0.05384628847241402,
0.23021383583545685,
0.026469582691788673,
0.07165256142616272,
-0.04237162321805954,
-0.20963674783706665,
0.04617661237716675,
0.10244028270244598,
0.1989038586616516,
-0.050277967005968094,
0.09292856603860855,
0.02603476122021675,
0.11241372674703598,
0.011538270860910416,
0.11342811584472656,
-0.06501355767250061,
-0.008422568440437317,
-0.06668821722269058,
-0.035307787358760834,
0.06227073073387146,
0.07282739877700806,
-0.013583862222731113,
-0.05030471086502075,
0.07388094067573547,
0.0873972475528717,
0.11083899438381195,
0.09568364173173904,
0.11158344894647598,
-0.25003454089164734,
0.057991039007902145,
-0.04365823045372963,
-0.09233134239912033,
-0.05113724619150162,
0.05275167524814606,
0.2225513607263565,
0.006515482906252146,
0.04429253563284874,
-0.04144793748855591,
0.05511404573917389,
-0.08222442865371704,
0.038270995020866394,
0.023058444261550903,
0.08253493905067444,
0.06044120714068413,
0.09599658846855164,
-0.08502335101366043,
0.19802352786064148,
0.004158782307058573,
0.023577332496643066,
-0.07298117876052856,
-0.040492285043001175,
-0.02466178871691227,
0.043355364352464676,
0.03201354295015335,
-0.009388820268213749,
0.25300076603889465,
0.1292107105255127,
-0.12254276126623154,
0.05391806364059448,
0.032635487616062164,
0.05305709317326546,
0.07187116146087646,
0.024682842195034027,
-0.07454051822423935,
0.017284687608480453,
0.12002784013748169,
0.024343516677618027,
-0.10135812312364578,
0.006922198459506035,
0.17134930193424225,
-0.12289571017026901,
-0.055839233100414276,
-0.029350202530622482,
-0.05489111319184303,
0.2694244980812073,
0.014113106764853,
-0.001003729528747499,
-0.007928379811346531,
0.016879340633749962,
0.11883030086755753,
-0.038156360387802124,
0.03439083322882652,
-0.03800079599022865,
0.03781832754611969,
0.011996697634458542,
-0.0860016793012619,
0.08702784031629562,
-0.08984915167093277,
0.00021350837778300047,
-0.06687624752521515,
0.04088640585541725,
0.015436033718287945,
-0.013378054834902287,
0.06966301053762436,
-0.01855394057929516,
-0.1375717967748642,
-0.08415593206882477,
-0.11190392076969147,
-0.0005230430397205055,
-0.13176673650741577,
0.0814698338508606,
-0.05894625559449196,
-0.0037773814983665943,
-0.02822571061551571,
0.04966692253947258,
0.09997230768203735,
0.08669394254684448,
-0.07478120177984238,
-0.10609287023544312,
0.11914686113595963,
-0.002659411868080497,
-0.25285130739212036,
-0.14024406671524048,
0.04362526535987854,
0.034867968410253525,
0.04559621959924698,
-0.017948193475604057,
0.20754778385162354,
0.06420484185218811,
-0.03632780537009239,
-0.009516763500869274,
-0.10053443908691406,
-0.05828089267015457,
0.09280921518802643,
-0.006892838515341282,
0.1160707101225853,
-0.007546895183622837,
0.004624119959771633,
0.02671694941818714,
0.024261245504021645,
0.07920272648334503,
-0.09092710167169571,
0.10881487280130386,
-0.008950243704020977,
-0.057145338505506516,
0.003548022359609604,
-0.03351741284132004,
0.030308157205581665,
0.06118977069854736,
0.026361459866166115,
0.0856795683503151,
-0.06326370686292648,
0.14228256046772003,
0.009284356608986855,
0.053707703948020935,
0.11780751496553421,
0.0485430471599102,
-0.12116070091724396,
-0.026198718696832657,
-0.11999065428972244,
0.0745052695274353,
0.050189871340990067,
-0.07576306909322739,
-0.1564672589302063,
0.023035036399960518,
0.133600652217865,
0.08823367208242416,
0.21726946532726288,
0.1250782608985901,
-0.006740070879459381,
0.2732212543487549,
0.08710059523582458,
-0.19115950167179108,
0.06269218772649765,
-0.025476686656475067,
0.007547764573246241,
0.04643448442220688,
0.02167188562452793,
0.0376439169049263,
0.1398412138223648,
-0.01627775840461254,
-0.03483733907341957,
0.026169082149863243,
-0.09099149703979492,
-0.11477958410978317,
0.1516886204481125,
-0.15294820070266724,
-0.07387566566467285,
-0.06791076809167862,
0.05648326873779297,
0.0021483113523572683,
0.22357690334320068,
0.10369143635034561,
-0.11074794828891754,
0.01850889064371586,
0.04734766110777855,
-0.06822258234024048,
-0.0948486402630806,
0.11498255282640457,
0.12110147625207901,
0.03931519761681557,
-0.033905744552612305,
0.13143125176429749,
0.07972612977027893,
-0.03720955550670624,
-0.07841873914003372,
-0.03598332405090332,
-0.11838509142398834,
0.004149405285716057,
-0.08972124010324478,
0.01848699152469635,
-0.010545571334660053,
-0.10778108239173889,
-0.08208153396844864,
0.007379238028079271,
0.03659382089972496,
-0.06527698785066605,
0.12416776269674301,
0.1002250611782074,
-0.08259204030036926,
0.04984133690595627,
-0.09193265438079834,
0.11276588588953018,
0.024887440726161003,
0.10711459070444107,
-0.1669798046350479,
-0.07209490239620209,
-0.00010680183186195791,
0.07535847276449203,
-0.07939411699771881,
0.010082138702273369,
0.038563717156648636,
0.032069671899080276,
-0.01860366202890873,
-0.09548287838697433,
-0.03574877232313156,
-0.07778956741094589,
0.0027947823982685804,
-0.06434529274702072,
-0.13402435183525085,
0.024619929492473602,
-0.03837171569466591,
0.04689006879925728,
0.04227829352021217,
-0.019749250262975693,
-0.03402552753686905,
-0.0012740952661260962,
-0.02206234261393547,
-0.06855140626430511,
0.10136714577674866,
-0.048009127378463745,
-0.018740549683570862,
0.026007628068327904,
-0.0036050511989742517,
-0.0979178175330162,
0.11656147986650467,
0.13571608066558838,
0.08062809705734253,
-0.00531139736995101,
-0.008674932643771172,
0.08035094290971756,
0.02603941410779953,
-0.0626710057258606,
-0.050787318497896194,
-0.05841071903705597,
-0.009758886881172657,
-0.11206777393817902,
-0.03463362902402878,
-0.05214926227927208,
0.024937231093645096,
-0.03631969913840294,
0.0801575630903244,
0.10144215077161789,
0.003876553848385811,
-0.1248532384634018,
-0.10261352360248566,
0.06023748964071274,
0.0028853334952145815,
-0.12274938076734543,
0.03116948902606964,
0.020138967782258987,
-0.018438225612044334,
-0.040070127695798874,
0.23114027082920074,
-0.0492992028594017,
-0.14985865354537964,
-0.02854081057012081,
-0.07923432439565659,
-0.14190496504306793,
-0.002535797655582428,
0.08964261412620544,
0.04770767688751221,
0.0002863180707208812,
0.007456029299646616,
0.0009484713664278388,
0.06812108308076859,
0.26167184114456177,
0.06636267900466919,
0.002794884145259857,
-0.005817597731947899,
0.12054617702960968,
0.041477713733911514,
-0.02014153264462948,
-0.10183000564575195,
0.06358800083398819,
-0.048702701926231384,
0.08939103037118912,
0.01433365885168314,
0.01911332458257675,
0.15974970161914825,
-0.03628682345151901,
0.0027556137647479773,
0.07352066040039062,
-0.03472968935966492,
-0.031105700880289078,
-0.1045321449637413,
-0.0797165185213089,
-0.1316741406917572,
-0.0363289974629879,
-0.13302893936634064,
-0.12071575224399567,
0.06195826828479767,
0.049834199249744415,
-0.10450505465269089,
0.22923140227794647,
0.07842246443033218,
-0.049823008477687836,
0.10607309639453888,
-0.015390358865261078,
0.02360551618039608,
-0.02318417839705944,
-0.010705321095883846,
-0.1390915811061859,
0.15359705686569214,
0.02645980939269066,
-0.0038812237326055765,
-0.012863549403846264,
0.031047506257891655,
0.03755985572934151,
-0.014122208580374718,
-0.07784069329500198,
-0.06116935983300209,
-0.023306582123041153,
-0.08157456666231155,
-0.01529536023736,
-0.08657234907150269,
0.02182702347636223,
0.15759295225143433,
-0.06989660859107971,
-0.14508476853370667,
-0.07944592833518982,
0.17288169264793396,
0.02188272401690483,
0.08988777548074722,
0.024150166660547256,
-0.0435253269970417,
-0.13280823826789856,
0.21393606066703796,
0.22317230701446533,
-0.1524299830198288,
-0.026083270087838173,
0.10351154208183289,
-0.00036249312688596547,
-0.02513471432030201,
-0.037129953503608704,
0.04112701863050461,
0.30088740587234497,
-0.01922301948070526,
-0.0349273718893528,
-0.07722125947475433,
0.008163850754499435,
-0.02279302105307579,
-0.06293035298585892,
0.12345647811889648,
-0.022897835820913315,
-0.11576753854751587,
0.02305673062801361,
-0.0808483213186264,
-0.05791311338543892,
-0.11489938199520111,
-0.17337383329868317,
-0.10905342549085617,
-0.02849249355494976,
0.026419393718242645,
-0.058700088411569595,
0.0732090100646019,
-0.0874885767698288,
0.03671662136912346,
-0.06844028830528259,
-0.01623232662677765,
-0.05844526365399361,
0.08634612709283829,
0.02089749649167061,
-0.07225573062896729,
0.129091277718544,
-0.03213150426745415,
0.04731948301196098,
0.12942703068256378,
-0.04725639894604683,
-0.06493879109621048,
0.03252934664487839,
0.034967221319675446,
-0.0466739684343338,
0.016193190589547157,
-0.09202015399932861,
0.018721429631114006,
0.05233195051550865,
0.02393754944205284,
-0.05355163663625717,
-0.010761870071291924,
0.19723065197467804,
0.053045064210891724,
-0.08524347096681595,
-0.04836190119385719,
-0.043501850217580795,
0.08549731224775314,
0.09757054597139359,
-0.03509347513318062,
0.015434316359460354,
-0.08864304423332214,
0.018078969791531563,
0.08202855288982391,
0.13716621696949005,
0.03207720071077347,
-0.18930834531784058,
-0.040915001183748245,
-0.029030557721853256,
-0.029906727373600006,
-0.0072624399326741695,
-0.011953188106417656,
-0.05680811032652855,
-0.029593128710985184,
-0.08294236660003662,
0.0550558939576149,
0.10222949087619781,
-0.0058708940632641315,
-0.0017081352416425943,
0.09660660475492477,
-0.13375961780548096,
-0.010829835198819637,
-0.20057876408100128,
-0.12962894141674042
] |
null | null |
transformers
|
# regnetx_002
Implementation of RegNet proposed in [Designing Network Design
Spaces](https://arxiv.org/abs/2003.13678)
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints
based on the best-performing models sampled from the current search
space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!).
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.

The paper is really well written and very interesting; I highly
recommend reading it.
``` python
RegNet.regnetx_002()
RegNet.regnetx_004()
RegNet.regnetx_006()
RegNet.regnetx_008()
RegNet.regnetx_016()
RegNet.regnetx_040()
RegNet.regnetx_064()
RegNet.regnetx_080()
RegNet.regnetx_120()
RegNet.regnetx_160()
RegNet.regnetx_320()
# Y variants (with SE)
RegNet.regnety_002()
# ...
RegNet.regnety_320()
```
You can easily customize your model.
Examples:
``` python
# change activation
RegNet.regnetx_004(activation = nn.SELU)
# change the number of classes (default is 1000)
RegNet.regnetx_004(n_classes=100)
# pass a different block
RegNet.regnetx_004(block=RegNetYBotteneckBlock)
# change the stem
model = RegNet.regnetx_004(stem=ResNetStemC)
# change the shortcut
model = RegNet.regnetx_004(block=partial(RegNetYBotteneckBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = RegNet.regnetx_004()
# first call .features; this registers the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 32, 112, 112]), torch.Size([1, 32, 56, 56]), torch.Size([1, 64, 28, 28]), torch.Size([1, 160, 14, 14])]
```
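Building on the feature-extraction example above, here is a small sketch of reusing the encoder as a backbone under a hypothetical classification head (the import path and the head are assumptions, not part of this model):
``` python
import torch
import torch.nn as nn
from glasses.models import RegNet  # assumed import path

backbone = RegNet.regnetx_002()
backbone.encoder.features                 # activate the forward hooks, as shown above
backbone(torch.randn(1, 3, 224, 224))
feats = backbone.encoder.features         # feature maps at decreasing resolution

# hypothetical head: global-average-pool the deepest feature map, classify into 10 classes
head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.LazyLinear(10))
print([f.shape for f in feats], head(feats[-1]).shape)
```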
|
{}
| null |
glasses/regnetx_002
|
[
"transformers",
"pytorch",
"arxiv:2003.13678",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2003.13678"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us
|
# regnetx_002
Implementation of RegNet proposed in Designing Network Design
Spaces
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints
based on the best-performing models sampled from the current search
space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!).
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.
!image
The paper is really well written and very interesting; I highly
recommend reading it.
Examples:
|
[
"# regnetx_002\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n",
"# regnetx_002\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
30,
168
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n# regnetx_002\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
0.012046673335134983,
0.02213221788406372,
-0.0025843351613730192,
0.01808190532028675,
-0.035390883684158325,
-0.13132107257843018,
-0.011316115036606789,
0.03665684163570404,
-0.14438119530677795,
-0.05740875378251076,
0.11733265966176987,
-0.026168769225478172,
-0.11097024381160736,
0.13589711487293243,
-0.08214926719665527,
-0.20542269945144653,
0.043501678854227066,
0.07403137534856796,
-0.05352909117937088,
0.11825255304574966,
0.1561916619539261,
-0.06232955679297447,
0.028555281460285187,
0.11036462336778641,
-0.07148232311010361,
0.021724214777350426,
-0.06889130175113678,
-0.03316297009587288,
0.022805238142609596,
0.008693167939782143,
0.1600242406129837,
0.04736216366291046,
-0.0201747864484787,
-0.09779926389455795,
0.00655911723151803,
0.07112523168325424,
-0.013074657879769802,
0.07753697782754898,
0.07324721664190292,
0.10479635000228882,
0.10814180225133896,
0.01978525146842003,
-0.09372067451477051,
-0.006218660157173872,
-0.06501209735870361,
-0.030673395842313766,
-0.0029129497706890106,
0.027661746367812157,
0.049218930304050446,
0.0662490651011467,
-0.05057360231876373,
0.1631776988506317,
-0.17973865568637848,
0.0776621624827385,
0.22397692501544952,
-0.377212256193161,
-0.0496583990752697,
0.07298526912927628,
0.16373282670974731,
-0.041456058621406555,
0.006521204486489296,
0.05522385239601135,
0.005359126720577478,
-0.03360976651310921,
0.17155681550502777,
-0.06528056412935257,
-0.1270943433046341,
0.00748354010283947,
-0.2088090479373932,
0.008974466472864151,
0.12744632363319397,
0.0569755956530571,
0.023888176307082176,
-0.05607401579618454,
-0.20968055725097656,
0.028766514733433723,
-0.02532060816884041,
-0.10135336965322495,
0.008585597388446331,
0.058111630380153656,
-0.0634048655629158,
-0.13302628695964813,
-0.10949064791202545,
0.041543539613485336,
-0.11176498979330063,
0.18709436058998108,
0.04149313643574715,
-0.013664960861206055,
-0.027172738686203957,
0.08590517193078995,
-0.07302963733673096,
-0.016255155205726624,
0.027281278744339943,
-0.134883850812912,
-0.13633926212787628,
-0.00004713023372460157,
-0.03411354497075081,
-0.08868903666734695,
-0.0023720157332718372,
0.16942152380943298,
0.06666073203086853,
-0.020846925675868988,
0.16220860183238983,
0.09823980182409286,
-0.028224939480423927,
0.08378813415765762,
-0.20287112891674042,
-0.07072647660970688,
0.08298581093549728,
-0.012858155183494091,
0.011423295363783836,
0.010246101766824722,
-0.05397545173764229,
-0.08276426792144775,
0.0925723984837532,
0.0015736967325210571,
0.038105811923742294,
0.08494600653648376,
-0.04351196810603142,
-0.03622066602110863,
0.07596960663795471,
-0.003704693168401718,
0.022120699286460876,
-0.037823665887117386,
-0.06126674264669418,
-0.08813715726137161,
0.050690971314907074,
0.03688737004995346,
-0.0999874621629715,
0.05844910815358162,
-0.0789058580994606,
-0.021250044927001,
-0.040451809763908386,
-0.13817332684993744,
-0.02944452501833439,
-0.08487418293952942,
-0.030289625748991966,
-0.17890506982803345,
-0.11840329319238663,
0.03263961896300316,
0.08987122774124146,
-0.0018144063651561737,
-0.00019397682626731694,
0.02136698178946972,
-0.1277395486831665,
-0.15011899173259735,
-0.02749623917043209,
0.06920404732227325,
-0.04210267588496208,
-0.032666582614183426,
0.04003193974494934,
0.06587440520524979,
-0.06796993315219879,
0.038424644619226456,
-0.1130366399884224,
0.02617875672876835,
-0.08458173274993896,
0.08909204602241516,
-0.06658094376325607,
0.049589887261390686,
-0.06718806177377701,
-0.06534479558467865,
-0.13083742558956146,
0.0014122631400823593,
0.09500133246183395,
0.08040560036897659,
-0.07619887590408325,
-0.011525949463248253,
0.033885013312101364,
-0.015182294882833958,
-0.054917264729738235,
0.1672392189502716,
-0.05271453782916069,
0.09949581325054169,
0.0628395676612854,
0.14031477272510529,
0.12716694176197052,
0.00506596639752388,
-0.018608806654810905,
0.03713348135352135,
-0.18747426569461823,
-0.15565072000026703,
-0.008883384987711906,
0.1372583955526352,
0.04968476668000221,
0.0032882182858884335,
-0.08773759007453918,
0.06664888560771942,
-0.06068367883563042,
-0.037696074694395065,
-0.03638613224029541,
-0.04689662903547287,
-0.005202458705753088,
0.037751249969005585,
0.08304081112146378,
0.035592347383499146,
-0.13079103827476501,
-0.06284179538488388,
0.0026996827218681574,
-0.032078683376312256,
0.06902159005403519,
-0.08340819180011749,
0.09982776641845703,
-0.09611213207244873,
0.010053631849586964,
-0.21122722327709198,
-0.024844609200954437,
0.022480303421616554,
-0.003232409479096532,
0.046988099813461304,
0.18538914620876312,
0.04555876925587654,
-0.07525800168514252,
-0.04000956937670708,
0.025848010554909706,
0.10785326361656189,
0.03115575574338436,
-0.09443282335996628,
-0.040112074464559555,
-0.0249593798071146,
-0.04976571723818779,
0.040627844631671906,
-0.06070313602685928,
-0.022276462987065315,
-0.03445480763912201,
0.13617967069149017,
0.01441704947501421,
0.036250412464141846,
0.04483022168278694,
-0.02716454491019249,
-0.05203399062156677,
-0.047869786620140076,
0.033583760261535645,
0.007306050043553114,
-0.06800168752670288,
0.07066776603460312,
0.01899201236665249,
0.17523743212223053,
0.018675629049539566,
-0.19697833061218262,
-0.014244061894714832,
0.1332191675901413,
-0.08521292358636856,
-0.00842661876231432,
-0.04225054010748863,
-0.006562882103025913,
0.10365864634513855,
-0.007502840366214514,
0.09762240946292877,
-0.12052606046199799,
-0.06581300497055054,
-0.0104013467207551,
-0.03620403632521629,
0.07277926802635193,
0.04102358594536781,
0.09630532562732697,
-0.3005679249763489,
0.04021183401346207,
-0.024134885519742966,
-0.10693471133708954,
-0.0057031866163015366,
0.07141935080289841,
-0.027159206569194794,
0.05190274119377136,
0.11762319505214691,
-0.028815321624279022,
0.022296685725450516,
-0.0002822339010890573,
-0.033731378614902496,
0.037379611283540726,
-0.0009907055646181107,
0.09805580228567123,
-0.1062062680721283,
0.0252081286162138,
0.014142322354018688,
-0.01608521305024624,
-0.021781761199235916,
-0.022435041144490242,
-0.05577269569039345,
0.05364186689257622,
0.033240754157304764,
-0.23738408088684082,
-0.0026788068935275078,
-0.0038140725810080767,
-0.02678089402616024,
0.16209322214126587,
0.051295120269060135,
-0.2273700088262558,
-0.03046572022140026,
0.0067380620166659355,
-0.0029717511497437954,
0.003742661327123642,
0.042266760021448135,
-0.04177050665020943,
-0.10338082909584045,
-0.06512509286403656,
-0.127190962433815,
-0.029115481302142143,
0.026245348155498505,
-0.01517247874289751,
0.051335737109184265,
0.05717017501592636,
-0.08778239041566849,
0.03417811542749405,
-0.14205437898635864,
0.1748162806034088,
0.06633619964122772,
-0.09329760074615479,
0.16270680725574493,
0.09168906509876251,
-0.07305917143821716,
-0.023802215233445168,
-0.012914755381643772,
0.2226857990026474,
-0.01461954414844513,
0.07334612309932709,
-0.06629512459039688,
-0.03815844655036926,
0.0622921884059906,
0.11559351533651352,
-0.0008759835036471486,
-0.024998096749186516,
0.006535295397043228,
0.05230249837040901,
-0.037479836493730545,
-0.17526137828826904,
-0.07319357246160507,
-0.12543879449367523,
-0.08856748044490814,
0.006131549831479788,
0.03793206810951233,
-0.01064252108335495,
0.04735206440091133,
-0.008530836552381516,
0.01378290168941021,
0.07193487882614136,
0.09135805070400238,
0.2022775560617447,
0.03222170099616051,
0.13673120737075806,
-0.036935266107320786,
-0.19776396453380585,
0.06530003249645233,
-0.04281771183013916,
0.3430599570274353,
-0.07411056756973267,
0.1359725296497345,
0.0413249209523201,
0.0337444543838501,
0.08392751216888428,
0.10357142984867096,
-0.03669755160808563,
-0.02616281248629093,
-0.04880094900727272,
-0.011461756192147732,
0.09455556422472,
0.021896766498684883,
0.0023559078108519316,
-0.18449367582798004,
0.13619795441627502,
0.0776795744895935,
0.09521299600601196,
0.10586054623126984,
0.10740051418542862,
-0.21587209403514862,
0.07767356187105179,
-0.014304007403552532,
-0.11618318408727646,
-0.044560544192790985,
0.01631001941859722,
0.03276776894927025,
0.04769634082913399,
0.031500015407800674,
-0.08588971942663193,
0.15012884140014648,
-0.04741961508989334,
0.042126189917325974,
-0.05668450519442558,
0.10645809769630432,
0.01840168796479702,
0.07912424206733704,
-0.08004453033208847,
0.23416218161582947,
-0.009457245469093323,
0.02780156023800373,
-0.02778824232518673,
-0.030143974348902702,
-0.012599056586623192,
0.02998187206685543,
0.09702841192483902,
0.003756230929866433,
0.08019758015871048,
0.11049089580774307,
-0.10176090896129608,
0.02309156022965908,
0.07092420011758804,
0.10063760727643967,
0.07537078112363815,
-0.014279413037002087,
-0.08551742881536484,
0.08693372458219528,
-0.013246098533272743,
-0.04219328239560127,
-0.09309394657611847,
0.07164427638053894,
0.12171907722949982,
-0.16446952521800995,
-0.0191056989133358,
-0.01306156162172556,
-0.02668478898704052,
0.3340734839439392,
0.1300865113735199,
-0.08504574000835419,
-0.07686713337898254,
0.0358005054295063,
0.18246635794639587,
-0.017546743154525757,
0.08496139198541641,
-0.058763470500707626,
-0.04167221486568451,
-0.008518999442458153,
-0.062039464712142944,
0.11966907232999802,
-0.0965862050652504,
0.0347658209502697,
-0.039297111332416534,
0.03276870772242546,
0.02473517693579197,
0.016632024198770523,
0.03337636962532997,
-0.008445888757705688,
-0.18252883851528168,
-0.10435402393341064,
-0.1338605433702469,
-0.1749059408903122,
-0.08751668781042099,
0.012393682263791561,
-0.015707816928625107,
0.08465499430894852,
-0.03198764845728874,
0.10532508790493011,
0.06407761573791504,
0.16728655993938446,
-0.0983952134847641,
-0.016682984307408333,
0.1546536386013031,
0.04617416858673096,
-0.14829973876476288,
-0.11509793251752853,
-0.00013589209993369877,
-0.03344929218292236,
-0.03806028142571449,
0.008489105850458145,
0.08220026642084122,
-0.022949067875742912,
-0.02759653702378273,
0.03961220011115074,
-0.05708692595362663,
-0.054466813802719116,
0.14334015548229218,
-0.062165647745132446,
0.35797956585884094,
-0.034065332263708115,
-0.014378203079104424,
0.0034520109184086323,
-0.058567970991134644,
0.01730910688638687,
-0.009752638638019562,
0.1001405119895935,
0.016894644126296043,
-0.08967684954404831,
-0.0072198593989014626,
-0.10062475502490997,
0.012668694369494915,
0.03575146198272705,
0.038540951907634735,
-0.01080339029431343,
-0.009792139753699303,
0.12460190057754517,
-0.03506750985980034,
0.12548001110553741,
0.10510776937007904,
0.024131350219249725,
-0.0597175732254982,
-0.07682356238365173,
-0.046038489788770676,
0.013202323578298092,
0.0516340509057045,
-0.0411057285964489,
-0.1810375303030014,
0.00993830244988203,
0.07399576902389526,
0.05769627168774605,
0.18497224152088165,
0.07662437111139297,
-0.02755160443484783,
0.13342918455600739,
0.13983844220638275,
-0.14338929951190948,
-0.04079548642039299,
0.03276639059185982,
0.02425193600356579,
0.03915946185588837,
-0.08136937022209167,
0.03776540234684944,
0.12917540967464447,
-0.04397895187139511,
-0.030233314260840416,
0.05262019857764244,
-0.11773806065320969,
-0.06644590198993683,
0.09177535772323608,
-0.13660100102424622,
-0.08028504252433777,
-0.008363968692719936,
-0.13359753787517548,
-0.03153142333030701,
0.09901826083660126,
0.04135705158114433,
-0.12119130045175552,
-0.013905083760619164,
0.09084025770425797,
0.004040263593196869,
0.008486430160701275,
0.12729068100452423,
0.10902117937803268,
0.016771579161286354,
-0.06890679150819778,
0.06622564047574997,
0.037798769772052765,
0.08605804294347763,
-0.08092379570007324,
0.026288030669093132,
-0.17491094768047333,
-0.05755145102739334,
-0.011479869484901428,
0.02160419337451458,
-0.10828156769275665,
-0.06589119136333466,
-0.14347407221794128,
-0.015383701771497726,
0.11479159444570541,
0.04054547846317291,
0.12208206206560135,
0.015610766597092152,
-0.03967602550983429,
0.043042656034231186,
-0.04121596738696098,
0.07796835899353027,
0.14383642375469208,
0.07181445509195328,
-0.14418703317642212,
-0.09382402151823044,
0.00876176729798317,
0.04444962739944458,
-0.09704304486513138,
0.050558071583509445,
-0.07624762505292892,
0.03659989684820175,
-0.12501583993434906,
-0.03311203047633171,
-0.0788053646683693,
-0.009753105230629444,
-0.05464648827910423,
-0.07966987043619156,
-0.10705167800188065,
0.03802696615457535,
-0.05598529428243637,
0.0028117068577557802,
0.013958973810076714,
-0.11828630417585373,
-0.09164094179868698,
-0.011034158058464527,
0.016181200742721558,
-0.04018769785761833,
0.08076067268848419,
-0.05351439118385315,
-0.09328578412532806,
-0.044502995908260345,
-0.016122447326779366,
-0.156439870595932,
0.11195135861635208,
0.03754133731126785,
0.048932358622550964,
-0.06362783163785934,
0.0653216689825058,
0.07597200572490692,
0.10809309780597687,
-0.042441461235284805,
-0.052138447761535645,
-0.02129237726330757,
0.021324867382645607,
-0.11758694797754288,
-0.04530191048979759,
-0.018284397199749947,
-0.005539644043892622,
0.07529933750629425,
0.12846839427947998,
0.1991482973098755,
0.05441128835082054,
-0.05644865334033966,
-0.04241260141134262,
0.024291733279824257,
0.00459941616281867,
-0.19063512980937958,
0.08152056485414505,
-0.060688264667987823,
-0.029300538823008537,
-0.024538861587643623,
0.25191739201545715,
0.0449938103556633,
0.0007014944567345083,
0.02000206522643566,
0.038910724222660065,
-0.043522145599126816,
-0.004300604574382305,
0.012142118066549301,
0.04483046010136604,
-0.05623543635010719,
0.005671272054314613,
0.022740237414836884,
0.13646630942821503,
-0.027286184951663017,
0.15020881593227386,
0.07242312282323837,
-0.0068171327002346516,
0.139911487698555,
-0.06031785160303116,
-0.05209522321820259,
-0.16958172619342804,
0.0610201470553875,
-0.0561719611287117,
0.045460913330316544,
-0.01902361959218979,
-0.03364712744951248,
0.21101084351539612,
0.03110499680042267,
0.060437221080064774,
0.06204553693532944,
-0.023149987682700157,
-0.10652115941047668,
-0.10107230395078659,
-0.07111213356256485,
-0.054475221782922745,
-0.02311788871884346,
-0.06244368106126785,
-0.07152561843395233,
0.009976240806281567,
0.062098484486341476,
-0.03473902493715286,
0.1090589091181755,
0.06651055812835693,
-0.0967944785952568,
0.06925763934850693,
-0.07781373709440231,
-0.006441062781959772,
0.13602347671985626,
-0.0012481288285925984,
-0.045120853930711746,
0.07201330363750458,
0.01346746739000082,
0.01012282632291317,
0.013515332713723183,
0.06973236799240112,
-0.030349981039762497,
-0.04270952194929123,
-0.10270506888628006,
0.055670544505119324,
0.018733829259872437,
0.025732966139912605,
-0.010480477474629879,
-0.05737805739045143,
0.04789174720644951,
0.19498561322689056,
-0.030624045059084892,
-0.12713293731212616,
-0.09825953096151352,
0.11951983720064163,
0.06062629818916321,
0.042604874819517136,
-0.05326911062002182,
-0.06747359037399292,
-0.03020888939499855,
0.3249301612377167,
0.2071998119354248,
-0.061095964163541794,
-0.014234203845262527,
0.06728456169366837,
-0.015769444406032562,
0.07724044471979141,
-0.028223110362887383,
0.0405096635222435,
0.3028632700443268,
-0.03021424077451229,
-0.10305032879114151,
-0.03814050927758217,
-0.024344688281416893,
-0.07819832116365433,
0.01065039075911045,
0.1562744528055191,
-0.005403525196015835,
-0.12827090919017792,
0.08252792805433273,
-0.11348441243171692,
-0.10290807485580444,
-0.13411635160446167,
0.0009041514131240547,
-0.06446610391139984,
0.014876521192491055,
0.07979436218738556,
-0.04684821143746376,
0.08297524601221085,
-0.05489387735724449,
-0.020925741642713547,
-0.044520579278469086,
0.0256030410528183,
-0.08989500999450684,
0.039321981370449066,
0.0666244700551033,
0.12412890791893005,
0.0023277944419533014,
0.0019094253657385707,
0.03998070955276489,
0.12370967864990234,
-0.07451821863651276,
-0.16805046796798706,
0.09168212115764618,
0.013242466375231743,
-0.06128431856632233,
0.05080050230026245,
-0.020146451890468597,
-0.03550047054886818,
-0.02255522459745407,
0.006792975123971701,
0.005935531575232744,
0.01251424290239811,
0.024920659139752388,
0.049377765506505966,
-0.12454140186309814,
-0.03356648236513138,
-0.0626995861530304,
0.06434586644172668,
0.030096059665083885,
-0.04030024632811546,
0.04060962796211243,
-0.0917993038892746,
-0.010596249252557755,
0.029319239780306816,
0.056214120239019394,
-0.01942962408065796,
-0.0845073014497757,
-0.026134034618735313,
0.05182024836540222,
0.037945836782455444,
-0.058634813874959946,
0.012689182534813881,
-0.022684425115585327,
-0.016759613528847694,
-0.01952272094786167,
0.07225210219621658,
0.17331139743328094,
0.0008709684479981661,
0.007143740076571703,
-0.022495217621326447,
-0.04462408274412155,
-0.017337126657366753,
-0.13569758832454681,
-0.014904281124472618
] |
null | null |
transformers
|
# regnetx_006
Implementation of RegNet proposed in [Designing Network Design
Spaces](https://arxiv.org/abs/2003.13678)
The main idea is to start with a high-dimensional search space and
iteratively shrink it by empirically applying constraints derived from
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!).
For example, to go from $AnyNet_A$ to $AnyNet_B$ the authors fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied when moving from one search space to the next.

The paper is really well written and very interesting; I highly
recommend reading it.
``` python
# all the available X variants
RegNet.regnetx_002()
RegNet.regnetx_004()
RegNet.regnetx_006()
RegNet.regnetx_008()
RegNet.regnetx_016()
RegNet.regnetx_040()
RegNet.regnetx_064()
RegNet.regnetx_080()
RegNet.regnetx_120()
RegNet.regnetx_160()
RegNet.regnetx_320()
# Y variants (with SE)
RegNet.regnety_002()
# ...
RegNet.regnety_320()
```
You can easily customize your model.
Examples:
``` python
# assumes: import torch; from torch import nn; from functools import partial
# and that RegNet plus the referenced blocks/stems are imported from the glasses library
# change the activation
RegNet.regnetx_004(activation=nn.SELU)
# change the number of classes (default is 1000)
RegNet.regnetx_004(n_classes=100)
# pass a different block
RegNet.regnetx_004(block=RegNetYBotteneckBlock)
# change the stem
model = RegNet.regnetx_004(stem=ResNetStemC)
# change the shortcut
model = RegNet.regnetx_004(block=partial(RegNetYBotteneckBlock, shortcut=ResNetShorcutD))
# get the intermediate features
x = torch.rand((1, 3, 224, 224))
model = RegNet.regnetx_004()
# first access .features: this activates the forward hooks and tells the model you'd like to collect the features
model.encoder.features
model(x)
# read the features back from the encoder
features = model.encoder.features
print([f.shape for f in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 32, 56, 56]), torch.Size([1, 64, 28, 28]), torch.Size([1, 160, 14, 14])]
```
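For this checkpoint specifically, a minimal classification sketch is shown below; the `glasses.models` import path and the 1000-class head are assumptions based on the examples above, so adjust them to your installation.
``` python
import torch
from glasses.models import RegNet  # assumed import path

# illustrative end-to-end usage of this variant
model = RegNet.regnetx_006(n_classes=1000).eval()
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # expected: torch.Size([1, 1000])
```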
|
{}
| null |
glasses/regnetx_006
|
[
"transformers",
"pytorch",
"arxiv:2003.13678",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2003.13678"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us
|
# regnetx_006
Implementation of RegNet proposed in Designing Network Design
Spaces
The main idea is to start with a high dimensional search space and
iteratively reduce the search space by empirically apply constrains
based on the best performing models sampled by the current search
space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x times!)
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for all stage $i$. The following table shows
all the restrictions applied from one search space to the next one.
!image
The paper is really well written and very interesting, I highly
recommended read it.
Examples:
|
[
"# regnetx_006\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n",
"# regnetx_006\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
30,
168
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n# regnetx_006\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
0.011824029497802258,
0.02137240581214428,
-0.002515623113140464,
0.01886272430419922,
-0.03342592343688011,
-0.1308571696281433,
-0.011115770787000656,
0.03615366294980049,
-0.14444755017757416,
-0.05712372064590454,
0.11738130450248718,
-0.02454124577343464,
-0.11013580858707428,
0.13272157311439514,
-0.08209332078695297,
-0.2037246823310852,
0.0437607541680336,
0.07413300126791,
-0.05258989334106445,
0.11876516044139862,
0.1558147519826889,
-0.062249183654785156,
0.029316645115613937,
0.10934268683195114,
-0.07238658517599106,
0.022302161902189255,
-0.07009274512529373,
-0.03299795091152191,
0.02202955074608326,
0.007165420334786177,
0.1600281149148941,
0.04758425056934357,
-0.019725682213902473,
-0.09465909749269485,
0.006665156222879887,
0.07181599736213684,
-0.012823830358684063,
0.07752449810504913,
0.07298444211483002,
0.1061847135424614,
0.10972284525632858,
0.01983977109193802,
-0.09369558095932007,
-0.005794026888906956,
-0.06558481603860855,
-0.02896871045231819,
-0.003275932278484106,
0.02464747242629528,
0.049706876277923584,
0.06917989999055862,
-0.051161691546440125,
0.16399028897285461,
-0.17956724762916565,
0.07824699580669403,
0.22355327010154724,
-0.37640494108200073,
-0.04985687509179115,
0.07585009187459946,
0.1623292714357376,
-0.04094758257269859,
0.004826342687010765,
0.05339790880680084,
0.005483499262481928,
-0.032932620495557785,
0.16975437104701996,
-0.06462043523788452,
-0.1287519484758377,
0.006679333280771971,
-0.20869341492652893,
0.010170490480959415,
0.12993764877319336,
0.05798455327749252,
0.024241939187049866,
-0.055481310933828354,
-0.20868544280529022,
0.027425874024629593,
-0.025483667850494385,
-0.10041968524456024,
0.008615979924798012,
0.056864771991968155,
-0.06473502516746521,
-0.1304606944322586,
-0.10920718312263489,
0.041748493909835815,
-0.11216389387845993,
0.185014009475708,
0.041764628142118454,
-0.014441967941820621,
-0.02729562297463417,
0.08525050431489944,
-0.07161373645067215,
-0.016428187489509583,
0.02669130638241768,
-0.13472339510917664,
-0.13502384722232819,
-0.0002673341950867325,
-0.03401079773902893,
-0.08940417319536209,
-0.0023560356348752975,
0.16429467499256134,
0.0655398741364479,
-0.0188168715685606,
0.16206371784210205,
0.09706838428974152,
-0.028757954016327858,
0.0839560404419899,
-0.201686292886734,
-0.06862948089838028,
0.08159588277339935,
-0.011733402498066425,
0.010144738480448723,
0.010284537449479103,
-0.053328800946474075,
-0.08133133500814438,
0.09320660680532455,
0.0009213521261699498,
0.03667667135596275,
0.08717723190784454,
-0.04291915521025658,
-0.03598064184188843,
0.07696250081062317,
-0.0038664608728140593,
0.020824125036597252,
-0.037930767983198166,
-0.061355024576187134,
-0.089020736515522,
0.05026593059301376,
0.03505679965019226,
-0.10033784806728363,
0.058975111693143845,
-0.0802181288599968,
-0.022631196305155754,
-0.041678622364997864,
-0.13853076100349426,
-0.030456535518169403,
-0.0867181047797203,
-0.030228180810809135,
-0.17762896418571472,
-0.11646029353141785,
0.031756896525621414,
0.09022475779056549,
-0.0018284646794199944,
0.0008743337239138782,
0.019577134400606155,
-0.1295004040002823,
-0.1503283679485321,
-0.026400290429592133,
0.07220745086669922,
-0.041913628578186035,
-0.032081007957458496,
0.04138118773698807,
0.06698030978441238,
-0.0678456574678421,
0.03924129530787468,
-0.11400699615478516,
0.024737676605582237,
-0.08533333241939545,
0.08844701200723648,
-0.06800965219736099,
0.049803879112005234,
-0.06742893159389496,
-0.06601417809724808,
-0.13095702230930328,
0.0006163114449009299,
0.09521496295928955,
0.08106082677841187,
-0.0734279528260231,
-0.010436332784593105,
0.030081409960985184,
-0.015055162832140923,
-0.055377308279275894,
0.16525663435459137,
-0.0524136908352375,
0.10020580142736435,
0.06107147037982941,
0.14072395861148834,
0.12754258513450623,
0.005549272522330284,
-0.02142578922212124,
0.03731834143400192,
-0.1868792474269867,
-0.15569652616977692,
-0.010711172595620155,
0.13733001053333282,
0.050769269466400146,
0.003946623299270868,
-0.08747945725917816,
0.06537491828203201,
-0.06101115792989731,
-0.036639995872974396,
-0.03620269522070885,
-0.04637662321329117,
-0.007177856285125017,
0.03889033943414688,
0.08309956640005112,
0.03536108881235123,
-0.13001948595046997,
-0.06881187856197357,
0.0020694835111498833,
-0.030891742557287216,
0.06772495806217194,
-0.0832805410027504,
0.09851188212633133,
-0.0972847044467926,
0.010288482531905174,
-0.21254651248455048,
-0.027223648503422737,
0.022427964955568314,
-0.0015762868570163846,
0.04829102382063866,
0.18483121693134308,
0.04553128778934479,
-0.0766560509800911,
-0.040053460747003555,
0.02570904791355133,
0.1086340993642807,
0.030704598873853683,
-0.09325563907623291,
-0.041553888469934464,
-0.024523543193936348,
-0.05017806962132454,
0.03851890191435814,
-0.0584920309484005,
-0.02275366149842739,
-0.033645931631326675,
0.1342555731534958,
0.014126723632216454,
0.03572070598602295,
0.04537578672170639,
-0.027292374521493912,
-0.05204042047262192,
-0.04836787283420563,
0.034246932715177536,
0.006750862114131451,
-0.06653068959712982,
0.0685654878616333,
0.018139470368623734,
0.17286726832389832,
0.019373098388314247,
-0.19515928626060486,
-0.016293734312057495,
0.1346263736486435,
-0.0848412737250328,
-0.007987617515027523,
-0.04056386277079582,
-0.0047853365540504456,
0.10510949790477753,
-0.008344960398972034,
0.09632434695959091,
-0.11976680159568787,
-0.06605950742959976,
-0.008919432759284973,
-0.03430643305182457,
0.07189769297838211,
0.042285386472940445,
0.09794861078262329,
-0.3021990656852722,
0.04047157242894173,
-0.0227328110486269,
-0.10509755462408066,
-0.006390825379639864,
0.07038147747516632,
-0.027223875746130943,
0.05375600978732109,
0.11882956326007843,
-0.028983980417251587,
0.023212671279907227,
-0.002738590817898512,
-0.03357917442917824,
0.03706467151641846,
0.00022467380040325224,
0.09764733910560608,
-0.10580059140920639,
0.02584909461438656,
0.013043857179582119,
-0.01476842351257801,
-0.02321583963930607,
-0.022108862176537514,
-0.05573930963873863,
0.05353495106101036,
0.03391370177268982,
-0.2390105426311493,
-0.0028088868129998446,
-0.004309598822146654,
-0.02569514513015747,
0.16426387429237366,
0.05158768594264984,
-0.22562900185585022,
-0.029715070500969887,
0.0027466947212815285,
-0.004450700245797634,
0.003157997503876686,
0.04223351553082466,
-0.04539898782968521,
-0.10416002571582794,
-0.06482210010290146,
-0.12586411833763123,
-0.030240515246987343,
0.02466185763478279,
-0.014111320488154888,
0.050907235592603683,
0.05657780542969704,
-0.08886711299419403,
0.033898789435625076,
-0.14299607276916504,
0.17339041829109192,
0.06705931574106216,
-0.09281796962022781,
0.16270741820335388,
0.09029986709356308,
-0.07323341816663742,
-0.023263003677129745,
-0.012692692689597607,
0.22375182807445526,
-0.015196971595287323,
0.07281328737735748,
-0.06764879077672958,
-0.03677620366215706,
0.0617012120783329,
0.11564015597105026,
-0.0008757326868362725,
-0.025456774979829788,
0.007291235961019993,
0.05285001918673515,
-0.03737008571624756,
-0.17584574222564697,
-0.07283974438905716,
-0.12597976624965668,
-0.08910828828811646,
0.006265209522098303,
0.03789965435862541,
-0.012102370150387287,
0.04811498895287514,
-0.007267700508236885,
0.012186110951006413,
0.07093609124422073,
0.09195086359977722,
0.20407240092754364,
0.03231070563197136,
0.13609878718852997,
-0.03806142136454582,
-0.1972007155418396,
0.06354586035013199,
-0.042797189205884933,
0.34603074193000793,
-0.07450396567583084,
0.13493561744689941,
0.041842684149742126,
0.03436052054166794,
0.08263906091451645,
0.1055392399430275,
-0.037350453436374664,
-0.027011767029762268,
-0.04927092418074608,
-0.011020001955330372,
0.09557200223207474,
0.019845496863126755,
-0.0010172240436077118,
-0.18441617488861084,
0.13507263362407684,
0.07831954210996628,
0.0930403470993042,
0.10635437071323395,
0.10768783837556839,
-0.21783073246479034,
0.07747823745012283,
-0.016328861936926842,
-0.11514896899461746,
-0.045467838644981384,
0.01657533086836338,
0.02976895123720169,
0.04839911311864853,
0.03137028217315674,
-0.08554532378911972,
0.1495792120695114,
-0.04697273671627045,
0.04237230122089386,
-0.05620361119508743,
0.10782800614833832,
0.017604459077119827,
0.07903847098350525,
-0.08084363490343094,
0.2356954962015152,
-0.009527244605123997,
0.02807423658668995,
-0.028297647833824158,
-0.030575351789593697,
-0.0125697311013937,
0.029005561023950577,
0.09786509722471237,
0.0033802909310907125,
0.08066520094871521,
0.11112064868211746,
-0.10005723685026169,
0.024771548807621002,
0.07252906262874603,
0.10204000771045685,
0.07572226226329803,
-0.013614177703857422,
-0.0850604847073555,
0.08738891035318375,
-0.013911882415413857,
-0.041286956518888474,
-0.09495474398136139,
0.07099009305238724,
0.12191374599933624,
-0.16420039534568787,
-0.017760159447789192,
-0.012817409820854664,
-0.027804607525467873,
0.3346538841724396,
0.13214658200740814,
-0.08456292748451233,
-0.0758366659283638,
0.03524387627840042,
0.18245930969715118,
-0.017369309440255165,
0.08366285264492035,
-0.06077420338988304,
-0.043983813375234604,
-0.007088000420480967,
-0.06241718679666519,
0.12044752389192581,
-0.09691005945205688,
0.0359349362552166,
-0.03878185153007507,
0.03151988238096237,
0.024321844801306725,
0.015972113236784935,
0.032527681440114975,
-0.008113127201795578,
-0.1835576593875885,
-0.10384305566549301,
-0.13242198526859283,
-0.17869101464748383,
-0.08663715422153473,
0.01284658070653677,
-0.017179884016513824,
0.08302288502454758,
-0.0330919474363327,
0.10584232956171036,
0.06277289241552353,
0.16566191613674164,
-0.09727602452039719,
-0.01822931505739689,
0.15588364005088806,
0.04665446653962135,
-0.14797639846801758,
-0.11288738250732422,
-0.0002763595839496702,
-0.034322962164878845,
-0.039376456290483475,
0.006978659890592098,
0.08352047204971313,
-0.024483315646648407,
-0.02705194056034088,
0.041901905089616776,
-0.05450401082634926,
-0.0528542585670948,
0.1436070203781128,
-0.060791950672864914,
0.35776975750923157,
-0.03371535986661911,
-0.013127314858138561,
0.005534654948860407,
-0.05990248918533325,
0.016427302733063698,
-0.0069448924623429775,
0.10177382081747055,
0.016354849562048912,
-0.08782421797513962,
-0.007712803315371275,
-0.10108831524848938,
0.010266988538205624,
0.03681750223040581,
0.038795795291662216,
-0.010972142219543457,
-0.010350709781050682,
0.12482544779777527,
-0.034340012818574905,
0.1264340579509735,
0.10527440905570984,
0.025130050256848335,
-0.05836499109864235,
-0.07584846019744873,
-0.047898586839437485,
0.012723922729492188,
0.050997260957956314,
-0.04093869775533676,
-0.18096646666526794,
0.01020398922264576,
0.07375306636095047,
0.056553516536951065,
0.18548943102359772,
0.07701269537210464,
-0.024301694706082344,
0.1320459544658661,
0.14083479344844818,
-0.14439372718334198,
-0.04002228006720543,
0.03311850130558014,
0.024531304836273193,
0.039535075426101685,
-0.08170084655284882,
0.03562289476394653,
0.1296588033437729,
-0.0441063791513443,
-0.030517518520355225,
0.05390240252017975,
-0.11760945618152618,
-0.06659901887178421,
0.09217756241559982,
-0.13603277504444122,
-0.08030839264392853,
-0.0073829712346196175,
-0.13389502465724945,
-0.03274790197610855,
0.10137069970369339,
0.03988295793533325,
-0.12057821452617645,
-0.01308327354490757,
0.09081509709358215,
0.004091935697942972,
0.009731348603963852,
0.12695109844207764,
0.10792435705661774,
0.017502915114164352,
-0.06999532878398895,
0.06694622337818146,
0.03800730034708977,
0.08299925923347473,
-0.08070916682481766,
0.027230728417634964,
-0.17594113945960999,
-0.05747634172439575,
-0.010817117989063263,
0.02288009226322174,
-0.10723479092121124,
-0.06416055560112,
-0.14316335320472717,
-0.01670069433748722,
0.11552026122808456,
0.04058467596769333,
0.12163907289505005,
0.014595715329051018,
-0.03959665820002556,
0.043296463787555695,
-0.043632663786411285,
0.07744289934635162,
0.14102965593338013,
0.07318462431430817,
-0.14394749701023102,
-0.09893497824668884,
0.00866604596376419,
0.044296134263277054,
-0.09634029865264893,
0.04989204183220863,
-0.07568558305501938,
0.03551236540079117,
-0.12065054476261139,
-0.03488926962018013,
-0.07794025540351868,
-0.009631195105612278,
-0.0547286793589592,
-0.07869565486907959,
-0.10684382915496826,
0.039285946637392044,
-0.05595364421606064,
0.0021734661422669888,
0.014977432787418365,
-0.11786045134067535,
-0.09077969193458557,
-0.00991770625114441,
0.015055544674396515,
-0.04075276106595993,
0.08063556253910065,
-0.05400573089718819,
-0.09385862201452255,
-0.04447698965668678,
-0.017720527946949005,
-0.15652911365032196,
0.11201205104589462,
0.037336334586143494,
0.04950537532567978,
-0.06373697519302368,
0.06506867706775665,
0.07557855546474457,
0.10892009735107422,
-0.042775947600603104,
-0.05217698961496353,
-0.021604172885417938,
0.023293832316994667,
-0.11856549978256226,
-0.04420318827033043,
-0.01842576079070568,
-0.0044773174449801445,
0.07658415287733078,
0.12756019830703735,
0.1998482197523117,
0.05422455072402954,
-0.05675429105758667,
-0.04199462756514549,
0.024320822209119797,
0.004711237270385027,
-0.1918688863515854,
0.07995136827230453,
-0.06072474643588066,
-0.02866452746093273,
-0.025325966998934746,
0.2490665465593338,
0.04511602967977524,
-0.00038676243275403976,
0.020164351910352707,
0.041584312915802,
-0.04600903391838074,
-0.005714841187000275,
0.010254020802676678,
0.04450967535376549,
-0.057007256895303726,
0.007960748858749866,
0.023745866492390633,
0.13517484068870544,
-0.026878708973526955,
0.15103106200695038,
0.07261048257350922,
-0.008320605382323265,
0.13922636210918427,
-0.059144552797079086,
-0.05310589075088501,
-0.16764822602272034,
0.06317979097366333,
-0.0554521307349205,
0.0465548075735569,
-0.018707938492298126,
-0.03527485951781273,
0.2093847692012787,
0.033395037055015564,
0.06124311685562134,
0.059371065348386765,
-0.02280881442129612,
-0.10438510030508041,
-0.0976887196302414,
-0.07083015143871307,
-0.05481505021452904,
-0.022051308304071426,
-0.06275434046983719,
-0.07082129269838333,
0.011705315671861172,
0.06324117630720139,
-0.03502452373504639,
0.1129188984632492,
0.06730010360479355,
-0.09659957885742188,
0.07045188546180725,
-0.07843709737062454,
-0.008127358742058277,
0.13543258607387543,
0.00047452555736526847,
-0.044425323605537415,
0.07139244675636292,
0.01359890028834343,
0.010343334637582302,
0.012379986234009266,
0.06943150609731674,
-0.030339619144797325,
-0.04287112504243851,
-0.10261312872171402,
0.05652749538421631,
0.018246140331029892,
0.02907855436205864,
-0.011960315518081188,
-0.05690992996096611,
0.047314487397670746,
0.1957968771457672,
-0.03113294579088688,
-0.1263016015291214,
-0.09723655134439468,
0.1220276728272438,
0.060192544013261795,
0.04085111990571022,
-0.05360202118754387,
-0.06780929118394852,
-0.030065298080444336,
0.32475292682647705,
0.20722843706607819,
-0.0604507140815258,
-0.014899259433150291,
0.06764183938503265,
-0.01558429654687643,
0.07676489651203156,
-0.027272338047623634,
0.04211724177002907,
0.3024270534515381,
-0.030735120177268982,
-0.10213000327348709,
-0.039195310324430466,
-0.023570645600557327,
-0.07708588987588882,
0.009418625384569168,
0.15867756307125092,
-0.00581941707059741,
-0.12824614346027374,
0.08216150104999542,
-0.11301299184560776,
-0.103510782122612,
-0.1346026510000229,
0.000006834469331806758,
-0.06447800248861313,
0.013423609547317028,
0.07925567775964737,
-0.0466717928647995,
0.08368417620658875,
-0.054530248045921326,
-0.02066013775765896,
-0.046842027455568314,
0.02511647716164589,
-0.08947332948446274,
0.03798272833228111,
0.06609705090522766,
0.12076615542173386,
0.002601603278890252,
0.00212892540730536,
0.0385211780667305,
0.1240086629986763,
-0.07513134181499481,
-0.16894857585430145,
0.0929313451051712,
0.01270921528339386,
-0.06129422411322594,
0.05159701034426689,
-0.02024025470018387,
-0.03477576747536659,
-0.021846260875463486,
0.005332848057150841,
0.004877636209130287,
0.013454259373247623,
0.026599479839205742,
0.0497441329061985,
-0.12404299527406693,
-0.03253444656729698,
-0.06258312612771988,
0.06468550860881805,
0.03224373981356621,
-0.03927462548017502,
0.03991864621639252,
-0.09281457960605621,
-0.010958179831504822,
0.02931233122944832,
0.055189020931720734,
-0.020097453147172928,
-0.08495566248893738,
-0.025767654180526733,
0.051576029509305954,
0.0369909442961216,
-0.06325767189264297,
0.01379245426505804,
-0.022082503885030746,
-0.016930609941482544,
-0.020087534561753273,
0.07120387256145477,
0.1733584702014923,
0.0014034982305020094,
0.006981231737881899,
-0.01950785703957081,
-0.044509585946798325,
-0.016532059758901596,
-0.13640369474887848,
-0.014539728872478008
] |
null | null |
transformers
|
# regnetx_016
Implementation of RegNet proposed in [Designing Network Design
Spaces](https://arxiv.org/abs/2003.13678)
The main idea is to start with a high-dimensional search space and
iteratively shrink it by empirically applying constraints derived from
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!).
For example, to go from $AnyNet_A$ to $AnyNet_B$ the authors fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied when moving from one search space to the next.

The paper is really well written and very interesting; I highly
recommend reading it.
``` python
# all the available X variants
RegNet.regnetx_002()
RegNet.regnetx_004()
RegNet.regnetx_006()
RegNet.regnetx_008()
RegNet.regnetx_016()
RegNet.regnetx_040()
RegNet.regnetx_064()
RegNet.regnetx_080()
RegNet.regnetx_120()
RegNet.regnetx_160()
RegNet.regnetx_320()
# Y variants (with SE)
RegNet.regnety_002()
# ...
RegNet.regnety_320()
```
You can easily customize your model.
Examples:
``` python
# assumes: import torch; from torch import nn; from functools import partial
# and that RegNet plus the referenced blocks/stems are imported from the glasses library
# change the activation
RegNet.regnetx_004(activation=nn.SELU)
# change the number of classes (default is 1000)
RegNet.regnetx_004(n_classes=100)
# pass a different block
RegNet.regnetx_004(block=RegNetYBotteneckBlock)
# change the stem
model = RegNet.regnetx_004(stem=ResNetStemC)
# change the shortcut
model = RegNet.regnetx_004(block=partial(RegNetYBotteneckBlock, shortcut=ResNetShorcutD))
# get the intermediate features
x = torch.rand((1, 3, 224, 224))
model = RegNet.regnetx_004()
# first access .features: this activates the forward hooks and tells the model you'd like to collect the features
model.encoder.features
model(x)
# read the features back from the encoder
features = model.encoder.features
print([f.shape for f in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 32, 56, 56]), torch.Size([1, 64, 28, 28]), torch.Size([1, 160, 14, 14])]
```
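As a concrete illustration for this variant, the sketch below pulls the intermediate feature maps from regnetx_016; the `glasses.models` import path is an assumption, and the printed shapes will differ from the regnetx_004 example above because this model is wider.
``` python
import torch
from glasses.models import RegNet  # assumed import path

model = RegNet.regnetx_016().eval()
model.encoder.features              # register the forward hooks
with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))
features = model.encoder.features   # list of intermediate feature maps
print([f.shape for f in features])  # shapes depend on this variant's widths
```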
|
{}
| null |
glasses/regnetx_016
|
[
"transformers",
"pytorch",
"arxiv:2003.13678",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2003.13678"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us
|
# regnetx_016
Implementation of RegNet proposed in Designing Network Design
Spaces
The main idea is to start with a high dimensional search space and
iteratively reduce the search space by empirically apply constrains
based on the best performing models sampled by the current search
space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x times!)
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for all stage $i$. The following table shows
all the restrictions applied from one search space to the next one.
!image
The paper is really well written and very interesting, I highly
recommended read it.
Examples:
|
[
"# regnetx_016\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n",
"# regnetx_016\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
30,
168
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n# regnetx_016\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
0.011376204900443554,
0.020329561084508896,
-0.0025353587698191404,
0.017878424376249313,
-0.03341115266084671,
-0.13093939423561096,
-0.011152401566505432,
0.0362129881978035,
-0.1436641365289688,
-0.05714232847094536,
0.11730335652828217,
-0.024493427947163582,
-0.11062566190958023,
0.13353267312049866,
-0.08217353373765945,
-0.20389088988304138,
0.043317824602127075,
0.07439592480659485,
-0.05381183326244354,
0.11872711777687073,
0.15630479156970978,
-0.061978939920663834,
0.028926964849233627,
0.11089913547039032,
-0.07123952358961105,
0.02210722491145134,
-0.07028233259916306,
-0.03356755152344704,
0.0217757411301136,
0.007847491651773453,
0.16054266691207886,
0.047769684344530106,
-0.020304793491959572,
-0.0963599681854248,
0.006663289852440357,
0.07210776954889297,
-0.013025270774960518,
0.07745381444692612,
0.07325946539640427,
0.1038549542427063,
0.10786077380180359,
0.019165704026818275,
-0.09365848451852798,
-0.005296662915498018,
-0.06560228019952774,
-0.028364811092615128,
-0.002320588566362858,
0.0254819318652153,
0.04895938187837601,
0.06831996887922287,
-0.05081348493695259,
0.16263140738010406,
-0.17952924966812134,
0.07814405113458633,
0.22170181572437286,
-0.37554553151130676,
-0.049601417034864426,
0.07442715018987656,
0.16103848814964294,
-0.041444480419158936,
0.005219115410000086,
0.05370037630200386,
0.005842548329383135,
-0.03265431895852089,
0.17030611634254456,
-0.06433787941932678,
-0.12753257155418396,
0.006630184594541788,
-0.2084827870130539,
0.009519149549305439,
0.1299794763326645,
0.05727051571011543,
0.02433752827346325,
-0.05610540136694908,
-0.20917703211307526,
0.02684335969388485,
-0.025709591805934906,
-0.10022781044244766,
0.007961432449519634,
0.05742952600121498,
-0.06505999714136124,
-0.13073182106018066,
-0.10967215895652771,
0.04261697083711624,
-0.11203568428754807,
0.18505597114562988,
0.041544388979673386,
-0.014291100203990936,
-0.026978526264429092,
0.0854228213429451,
-0.0719335749745369,
-0.016316138207912445,
0.027365202084183693,
-0.1345152109861374,
-0.13459524512290955,
-0.0008516763336956501,
-0.03427406772971153,
-0.08899343758821487,
-0.002657651901245117,
0.16594557464122772,
0.06577029079198837,
-0.019751332700252533,
0.16223888099193573,
0.09786316007375717,
-0.027525415644049644,
0.08371064066886902,
-0.2024424523115158,
-0.06819376349449158,
0.08194397389888763,
-0.012036813423037529,
0.010247603058815002,
0.010504390113055706,
-0.05301517993211746,
-0.08187758922576904,
0.09338168054819107,
0.0010059085907414556,
0.037248220294713974,
0.08649837225675583,
-0.043751060962677,
-0.03655259683728218,
0.07711710035800934,
-0.003791946917772293,
0.020601147785782814,
-0.03786742314696312,
-0.06111522763967514,
-0.08914672583341599,
0.050990235060453415,
0.03543882071971893,
-0.10065522789955139,
0.05855223536491394,
-0.08038181811571121,
-0.02208494208753109,
-0.04104417935013771,
-0.13814550638198853,
-0.03001554124057293,
-0.08637858927249908,
-0.029774248600006104,
-0.17759796977043152,
-0.11817307025194168,
0.03178471699357033,
0.09032076597213745,
-0.0017355270683765411,
0.0003954653220716864,
0.020120877772569656,
-0.12896783649921417,
-0.15073665976524353,
-0.02659037709236145,
0.07043630629777908,
-0.042294979095458984,
-0.032016146928071976,
0.04024730250239372,
0.06676769256591797,
-0.06684444099664688,
0.039644915610551834,
-0.11377302557229996,
0.02471374347805977,
-0.0862419605255127,
0.08965747803449631,
-0.067510224878788,
0.050242409110069275,
-0.06723813712596893,
-0.0662045031785965,
-0.1313694715499878,
0.0007295876857824624,
0.09501870721578598,
0.08097054064273834,
-0.073038250207901,
-0.011255105026066303,
0.02858791872859001,
-0.015280261635780334,
-0.05459967255592346,
0.16609658300876617,
-0.05240797623991966,
0.10125965625047684,
0.061175838112831116,
0.14087770879268646,
0.12914100289344788,
0.005621321499347687,
-0.0211396012455225,
0.03797818720340729,
-0.18738307058811188,
-0.15588565170764923,
-0.010563169606029987,
0.13771086931228638,
0.05005205422639847,
0.003475600155070424,
-0.08715478330850601,
0.06565544754266739,
-0.06152084469795227,
-0.03719717264175415,
-0.036696288734674454,
-0.04678814113140106,
-0.007478286512196064,
0.038813795894384384,
0.08328060060739517,
0.03612237796187401,
-0.1306261271238327,
-0.06737526506185532,
0.0024652162101119757,
-0.0308742243796587,
0.06798966974020004,
-0.08314507454633713,
0.09924691915512085,
-0.09777633845806122,
0.00993363931775093,
-0.21199968457221985,
-0.02544933743774891,
0.021686695516109467,
-0.00014741881750524044,
0.048083558678627014,
0.18548673391342163,
0.04534991830587387,
-0.0767643079161644,
-0.0398196317255497,
0.02579512447118759,
0.10918159037828445,
0.030639614909887314,
-0.09310723096132278,
-0.04131331667304039,
-0.025165952742099762,
-0.04964868724346161,
0.039983347058296204,
-0.05826704204082489,
-0.02248060330748558,
-0.034515075385570526,
0.13508129119873047,
0.013837862759828568,
0.03624187037348747,
0.045459114015102386,
-0.027615105733275414,
-0.05201714113354683,
-0.04785562679171562,
0.034156013280153275,
0.006996168289333582,
-0.06656038761138916,
0.06902817636728287,
0.019402023404836655,
0.1750503033399582,
0.01904432103037834,
-0.19743049144744873,
-0.016058316454291344,
0.1348070651292801,
-0.08516732603311539,
-0.008016952313482761,
-0.04223904386162758,
-0.005080122966319323,
0.1054481640458107,
-0.008316944353282452,
0.09682527184486389,
-0.11958983540534973,
-0.06524442136287689,
-0.009282581508159637,
-0.03480222821235657,
0.07219249755144119,
0.042210862040519714,
0.09631181508302689,
-0.30006295442581177,
0.04047596454620361,
-0.023970410227775574,
-0.10480692237615585,
-0.006189238745719194,
0.07054031640291214,
-0.02770073339343071,
0.05355583876371384,
0.11905543506145477,
-0.029115498065948486,
0.02285701222717762,
-0.0026092238258570433,
-0.03369588777422905,
0.03722133859992027,
-0.0002681068435776979,
0.09777143597602844,
-0.10643962770700455,
0.025704437866806984,
0.0137728126719594,
-0.014993046410381794,
-0.024260828271508217,
-0.022721214219927788,
-0.05534572899341583,
0.05323464423418045,
0.03315186873078346,
-0.23811176419258118,
-0.0029964435379952192,
-0.004209545906633139,
-0.025968771427869797,
0.1637752205133438,
0.05172071233391762,
-0.22678057849407196,
-0.030185269191861153,
0.004257482942193747,
-0.0035295498091727495,
0.003240160411223769,
0.0418514646589756,
-0.04497397318482399,
-0.10434169322252274,
-0.06384105235338211,
-0.12387645989656448,
-0.029651951044797897,
0.024161214008927345,
-0.0137124452739954,
0.05087580531835556,
0.0568796768784523,
-0.08864638954401016,
0.03402407839894295,
-0.14231061935424805,
0.17355486750602722,
0.06714155524969101,
-0.09307630360126495,
0.1621747463941574,
0.09082508087158203,
-0.07315658032894135,
-0.02332763373851776,
-0.01313385646790266,
0.22458840906620026,
-0.015398474410176277,
0.07276342064142227,
-0.06795091927051544,
-0.036916933953762054,
0.06155592203140259,
0.11542641371488571,
-0.0003932490071747452,
-0.024718020111322403,
0.006502365227788687,
0.05285298824310303,
-0.037179067730903625,
-0.17462828755378723,
-0.07373004406690598,
-0.12552249431610107,
-0.08853687345981598,
0.007002068217843771,
0.03750937432050705,
-0.011040317825973034,
0.04849691689014435,
-0.008386853151023388,
0.013622717000544071,
0.0711233988404274,
0.09170938283205032,
0.2049834132194519,
0.03188268467783928,
0.1363217979669571,
-0.0382237546145916,
-0.19790560007095337,
0.06399524211883545,
-0.0420939065515995,
0.34533050656318665,
-0.07472842931747437,
0.13678330183029175,
0.041797373443841934,
0.03422581031918526,
0.0833343118429184,
0.10504768043756485,
-0.03756576031446457,
-0.026781708002090454,
-0.048918694257736206,
-0.010831059888005257,
0.09496103227138519,
0.020057113841176033,
-0.0010439206380397081,
-0.18433775007724762,
0.1352674961090088,
0.07787703722715378,
0.09348035603761673,
0.10702987015247345,
0.1082359254360199,
-0.2175389975309372,
0.0773017406463623,
-0.015977133065462112,
-0.11574508994817734,
-0.04549954831600189,
0.01676231063902378,
0.02869529277086258,
0.04848857969045639,
0.031574077904224396,
-0.08592619746923447,
0.15023331344127655,
-0.047435957938432693,
0.04226263612508774,
-0.05652327463030815,
0.10724277049303055,
0.01716979406774044,
0.07907257229089737,
-0.08009259402751923,
0.23457691073417664,
-0.008973180316388607,
0.028271226212382317,
-0.028928903862833977,
-0.03137924522161484,
-0.012237899005413055,
0.029803551733493805,
0.09725728631019592,
0.0031418476719409227,
0.08319077640771866,
0.11051022261381149,
-0.10043564438819885,
0.024141117930412292,
0.07106123864650726,
0.10199812054634094,
0.07532139867544174,
-0.013791610486805439,
-0.0850372388958931,
0.08752159774303436,
-0.014623104594647884,
-0.040937528014183044,
-0.09444180876016617,
0.07108619809150696,
0.12222681939601898,
-0.164194256067276,
-0.01775982417166233,
-0.013169106096029282,
-0.027149798348546028,
0.333575040102005,
0.13236498832702637,
-0.0853438675403595,
-0.07603207975625992,
0.03561662137508392,
0.18267926573753357,
-0.01665271818637848,
0.08443200588226318,
-0.060393039137125015,
-0.04360181838274002,
-0.007714010775089264,
-0.06219012290239334,
0.11907447129487991,
-0.09747693687677383,
0.03641905635595322,
-0.03853517025709152,
0.03151790797710419,
0.024039437994360924,
0.016439126804471016,
0.032192163169384,
-0.00847556721419096,
-0.18429869413375854,
-0.10342233628034592,
-0.1333765983581543,
-0.1796065866947174,
-0.08720064908266068,
0.01390769425779581,
-0.016595162451267242,
0.0832216888666153,
-0.03329918161034584,
0.10652763396501541,
0.06226802244782448,
0.1647651344537735,
-0.09741316735744476,
-0.018178889527916908,
0.15496951341629028,
0.04684806987643242,
-0.14815618097782135,
-0.11351798474788666,
-0.0002402725222054869,
-0.03378457948565483,
-0.03953520953655243,
0.0070411269553005695,
0.08424684405326843,
-0.02442803792655468,
-0.027179738506674767,
0.04380766302347183,
-0.05518651381134987,
-0.053567249327898026,
0.14490750432014465,
-0.062153562903404236,
0.35901036858558655,
-0.03365813568234444,
-0.013242563232779503,
0.005264592822641134,
-0.058837875723838806,
0.016386380419135094,
-0.0072081726975739,
0.10131946206092834,
0.01648271456360817,
-0.08737532049417496,
-0.007247653789818287,
-0.10066070407629013,
0.010443730279803276,
0.037299927324056625,
0.03855688497424126,
-0.010444635525345802,
-0.009152643382549286,
0.1246492862701416,
-0.034274064004421234,
0.12647245824337006,
0.1060241088271141,
0.024926848709583282,
-0.05853809043765068,
-0.07622124254703522,
-0.04738154634833336,
0.013012588955461979,
0.0513538122177124,
-0.04090701788663864,
-0.181285560131073,
0.009947971440851688,
0.07359547168016434,
0.05674247443675995,
0.18641991913318634,
0.0775294229388237,
-0.024532774463295937,
0.13266576826572418,
0.14051616191864014,
-0.14501379430294037,
-0.04105961322784424,
0.0332278311252594,
0.0241161547601223,
0.03958071395754814,
-0.08190787583589554,
0.03611376881599426,
0.1291905641555786,
-0.04355607554316521,
-0.030363496392965317,
0.05350058153271675,
-0.11694376170635223,
-0.0671137273311615,
0.0921599492430687,
-0.13564801216125488,
-0.08095657825469971,
-0.008199336007237434,
-0.1343308836221695,
-0.0321308895945549,
0.10085578262805939,
0.04013071954250336,
-0.12135165929794312,
-0.013286443427205086,
0.09110282361507416,
0.00401950953528285,
0.009765052236616611,
0.12685175240039825,
0.10879410058259964,
0.017295178025960922,
-0.07034226506948471,
0.06721676141023636,
0.03817828744649887,
0.08348134160041809,
-0.08088821917772293,
0.02520189806818962,
-0.17601387202739716,
-0.05781202018260956,
-0.0119786923751235,
0.022971516475081444,
-0.10847160965204239,
-0.0646699070930481,
-0.14325225353240967,
-0.016477417200803757,
0.11597628891468048,
0.041537344455718994,
0.12196469306945801,
0.014525802806019783,
-0.03971382975578308,
0.043154630810022354,
-0.04347682744264603,
0.07819031924009323,
0.14201536774635315,
0.07222041487693787,
-0.1438683271408081,
-0.09832026809453964,
0.009184601716697216,
0.04504244029521942,
-0.09666801989078522,
0.049065012484788895,
-0.07495974749326706,
0.036027781665325165,
-0.12247519940137863,
-0.03416711464524269,
-0.07878372818231583,
-0.009536286816000938,
-0.0543154738843441,
-0.07937205582857132,
-0.10742031037807465,
0.03901456668972969,
-0.05539179965853691,
0.0027149461675435305,
0.015151331201195717,
-0.11779185384511948,
-0.0908527597784996,
-0.010157328099012375,
0.015240770764648914,
-0.04057422652840614,
0.08016853779554367,
-0.05405810847878456,
-0.09433843940496445,
-0.044975776225328445,
-0.016501495614647865,
-0.15619026124477386,
0.11187504231929779,
0.03657205402851105,
0.049561284482479095,
-0.06437590718269348,
0.06540694832801819,
0.0753745585680008,
0.10865867137908936,
-0.04280361160635948,
-0.051994238048791885,
-0.021111030131578445,
0.023518510162830353,
-0.11818550527095795,
-0.04393693059682846,
-0.01888294704258442,
-0.004776942078024149,
0.07545539736747742,
0.12721550464630127,
0.2002592235803604,
0.05427752062678337,
-0.05667642131447792,
-0.04238786920905113,
0.024417702108621597,
0.004355710931122303,
-0.1914229691028595,
0.0804213136434555,
-0.06040232628583908,
-0.029138606041669846,
-0.025304822251200676,
0.2510194182395935,
0.04510747268795967,
0.0000631351867923513,
0.0202658548951149,
0.042800042778253555,
-0.04668281599879265,
-0.005844170693308115,
0.010746821761131287,
0.0448087602853775,
-0.056683775037527084,
0.006534208077937365,
0.023933570832014084,
0.13565242290496826,
-0.027384785935282707,
0.14954601228237152,
0.07265272736549377,
-0.009204406291246414,
0.13924795389175415,
-0.06026076897978783,
-0.053647398948669434,
-0.16736219823360443,
0.06273762881755829,
-0.055798497051000595,
0.0458071306347847,
-0.01816137321293354,
-0.03600960597395897,
0.21007171273231506,
0.03323175758123398,
0.06161061301827431,
0.06017046049237251,
-0.022508539259433746,
-0.10488009452819824,
-0.09891404956579208,
-0.07024963200092316,
-0.05585446208715439,
-0.022209426388144493,
-0.06284981220960617,
-0.07093256711959839,
0.012394213117659092,
0.0632876604795456,
-0.035033900290727615,
0.11234503239393234,
0.06630833446979523,
-0.09705698490142822,
0.07018329948186874,
-0.07840335369110107,
-0.006635178346186876,
0.13377727568149567,
0.00034113036235794425,
-0.04459068179130554,
0.07153362035751343,
0.013541149906814098,
0.010003306902945042,
0.013564431108534336,
0.06959782540798187,
-0.03064357489347458,
-0.04317083954811096,
-0.10290566086769104,
0.05631555989384651,
0.017726313322782516,
0.028711503371596336,
-0.012382387183606625,
-0.05642314627766609,
0.047301679849624634,
0.19531261920928955,
-0.030206087976694107,
-0.12688423693180084,
-0.09816417098045349,
0.12089651823043823,
0.0604291632771492,
0.04042848199605942,
-0.05311817303299904,
-0.06835977733135223,
-0.02993207797408104,
0.32548660039901733,
0.2078307867050171,
-0.06031384319067001,
-0.01525877695530653,
0.06745445728302002,
-0.015648366883397102,
0.07786168158054352,
-0.028816450387239456,
0.04156043753027916,
0.30319738388061523,
-0.03162117302417755,
-0.10307258367538452,
-0.03937821090221405,
-0.023898664861917496,
-0.07723786681890488,
0.009314767085015774,
0.1578640639781952,
-0.005829808302223682,
-0.12818294763565063,
0.08242563903331757,
-0.1140207052230835,
-0.10319678485393524,
-0.13370618224143982,
0.0003118762979283929,
-0.06416071206331253,
0.013126179575920105,
0.07889190316200256,
-0.04683292657136917,
0.08364557474851608,
-0.054499946534633636,
-0.02091837115585804,
-0.04567023366689682,
0.024936223402619362,
-0.08950101584196091,
0.03852565586566925,
0.06581886112689972,
0.12249957025051117,
0.0018860507989302278,
0.0017554936930537224,
0.038497813045978546,
0.12372932583093643,
-0.07463334500789642,
-0.16855083405971527,
0.09208987653255463,
0.012704634107649326,
-0.06027046591043472,
0.05132194608449936,
-0.02039763517677784,
-0.034677132964134216,
-0.021900704130530357,
0.005736273247748613,
0.0038214644882827997,
0.013158532790839672,
0.027316182851791382,
0.05000525340437889,
-0.12399722635746002,
-0.033752478659152985,
-0.06202055513858795,
0.0641142725944519,
0.031709976494312286,
-0.039126310497522354,
0.040145568549633026,
-0.09298896044492722,
-0.011117568239569664,
0.029885558411478996,
0.05555500462651253,
-0.019930368289351463,
-0.08543556183576584,
-0.025661103427410126,
0.052198659628629684,
0.0366937629878521,
-0.06280457973480225,
0.013489903882145882,
-0.02187824249267578,
-0.017595011740922928,
-0.019586749374866486,
0.07151956111192703,
0.17340737581253052,
0.0022942651994526386,
0.007524291984736919,
-0.01744915172457695,
-0.044319119304418564,
-0.01703096181154251,
-0.1357831060886383,
-0.014417571946978569
] |
null | null |
transformers
|
# regnety_002
Implementation of RegNet proposed in [Designing Network Design
Spaces](https://arxiv.org/abs/2003.13678)
The main idea is to start with a high-dimensional search space and
iteratively shrink it by empirically applying constraints derived from
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!).
For example, to go from $AnyNet_A$ to $AnyNet_B$ the authors fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied when moving from one search space to the next.

The paper is really well written and very interesting; I highly
recommend reading it.
``` python
# all the available X variants
RegNet.regnetx_002()
RegNet.regnetx_004()
RegNet.regnetx_006()
RegNet.regnetx_008()
RegNet.regnetx_016()
RegNet.regnetx_040()
RegNet.regnetx_064()
RegNet.regnetx_080()
RegNet.regnetx_120()
RegNet.regnetx_160()
RegNet.regnetx_320()
# Y variants (with SE)
RegNet.regnety_002()
# ...
RegNet.regnety_320()
```
You can easily customize your model.
Examples:
``` python
import torch
from torch import nn
from functools import partial

from glasses.models import RegNet  # assumed import path
# RegNetYBotteneckBlock, ResNetStemC and ResNetShorcutD also ship with glasses (imports omitted here)

# change activation
RegNet.regnetx_004(activation=nn.SELU)
# change number of classes (default is 1000)
RegNet.regnetx_004(n_classes=100)
# pass a different block
RegNet.regnetx_004(block=RegNetYBotteneckBlock)
# change the stem
model = RegNet.regnetx_004(stem=ResNetStemC)
# change the shortcut
model = RegNet.regnetx_004(block=partial(RegNetYBotteneckBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = RegNet.regnetx_004()
# first call .features; this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 32, 56, 56]), torch.Size([1, 64, 28, 28]), torch.Size([1, 160, 14, 14])]
```
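If you just want to run a quick forward pass with this exact variant, here is a minimal sketch. It assumes `RegNet` is importable from `glasses.models` (the import path is not confirmed by this card), that the constructor defaults to 1000 ImageNet classes, and it feeds a random tensor instead of a properly normalized image:
``` python
import torch
from glasses.models import RegNet  # assumed import path

model = RegNet.regnety_002()  # this card's variant; assumed to default to 1000 classes
model.eval()

# dummy 224x224 RGB batch; replace with a real, ImageNet-normalized image tensor
x = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    logits = model(x)               # expected shape: (1, 1000)
    probs = logits.softmax(dim=-1)

print(probs.topk(5).indices)        # indices of the 5 most likely classes
```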
|
{}
| null |
glasses/regnety_002
|
[
"transformers",
"pytorch",
"arxiv:2003.13678",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2003.13678"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us
|
# regnety_002
Implementation of RegNet proposed in Designing Network Design
Spaces
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints based on
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!)
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.
!image
The paper is really well written and very interesting; I highly
recommend reading it.
Examples:
|
[
"# regnety_002\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n",
"# regnety_002\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
30,
168
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n# regnety_002\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
0.014749549329280853,
0.026628712192177773,
-0.0026284712366759777,
0.017091089859604836,
-0.03499603644013405,
-0.13327935338020325,
-0.010177196003496647,
0.03754694014787674,
-0.14391572773456573,
-0.05763405188918114,
0.11726874113082886,
-0.02673686295747757,
-0.11292830854654312,
0.1342785358428955,
-0.08281828463077545,
-0.20773543417453766,
0.04582802578806877,
0.07467897981405258,
-0.0540713407099247,
0.11975109577178955,
0.15643735229969025,
-0.06145348772406578,
0.028121143579483032,
0.11137842386960983,
-0.06924310326576233,
0.021076681092381477,
-0.0715031623840332,
-0.033736471086740494,
0.019962245598435402,
0.006434908136725426,
0.16038815677165985,
0.0475776307284832,
-0.02057962864637375,
-0.09824495017528534,
0.0062717655673623085,
0.07175564020872116,
-0.012116524390876293,
0.07696937024593353,
0.07204046100378036,
0.10580462217330933,
0.11029968410730362,
0.01840497925877571,
-0.0913110226392746,
-0.008079038932919502,
-0.06363227218389511,
-0.024027708917856216,
-0.002018829109147191,
0.026975922286510468,
0.05363548919558525,
0.06807924062013626,
-0.05133069306612015,
0.16296163201332092,
-0.18206855654716492,
0.07586579769849777,
0.22307531535625458,
-0.3750515282154083,
-0.04830913618206978,
0.07630082219839096,
0.1600673794746399,
-0.04754059389233589,
0.008695508353412151,
0.054620224982500076,
0.003824701299890876,
-0.03270747512578964,
0.16926079988479614,
-0.0644475668668747,
-0.12379872053861618,
0.008220929652452469,
-0.20734894275665283,
0.005643145181238651,
0.12893830239772797,
0.05668942257761955,
0.026232270523905754,
-0.05781994387507439,
-0.21135582029819489,
0.03373447433114052,
-0.026806877925992012,
-0.10054761916399002,
0.00793396681547165,
0.06072409823536873,
-0.06628954410552979,
-0.13209405541419983,
-0.1103421300649643,
0.04385186359286308,
-0.11353058367967606,
0.18300795555114746,
0.04242585971951485,
-0.01366126537322998,
-0.02732817642390728,
0.08769986033439636,
-0.07053562998771667,
-0.01528283953666687,
0.02882133051753044,
-0.13374118506908417,
-0.13400553166866302,
-0.0008711095433682203,
-0.03593774512410164,
-0.08462397754192352,
-0.0013836555881425738,
0.1693989634513855,
0.0671212375164032,
-0.021453114226460457,
0.16530998051166534,
0.09889542311429977,
-0.027029134333133698,
0.08530041575431824,
-0.20536455512046814,
-0.06918957084417343,
0.08458179235458374,
-0.011089930310845375,
0.011631276458501816,
0.00971196684986353,
-0.05497140809893608,
-0.08359219133853912,
0.0950808972120285,
-0.0005729995900765061,
0.04174460470676422,
0.08637498319149017,
-0.04405970126390457,
-0.03806168958544731,
0.07735021412372589,
-0.0036911282222718,
0.019672777503728867,
-0.03784262016415596,
-0.06349758058786392,
-0.09258521348237991,
0.0510394424200058,
0.03816306218504906,
-0.10116563737392426,
0.06018419191241264,
-0.07918976992368698,
-0.021799063310027122,
-0.0401758998632431,
-0.13510437309741974,
-0.029641037806868553,
-0.08204374462366104,
-0.030038513243198395,
-0.17806407809257507,
-0.12241146713495255,
0.03272504732012749,
0.08927994966506958,
-0.001418513129465282,
-0.0014942217385396361,
0.019680967554450035,
-0.13051995635032654,
-0.15220226347446442,
-0.027041561901569366,
0.06598495692014694,
-0.04326914623379707,
-0.03158242627978325,
0.036349911242723465,
0.06456057727336884,
-0.06658830493688583,
0.03890496492385864,
-0.11702736467123032,
0.02634376287460327,
-0.08626068383455276,
0.09096399694681168,
-0.06311117857694626,
0.05097685381770134,
-0.06432860344648361,
-0.06668390333652496,
-0.13002845644950867,
0.001166664413176477,
0.09264200925827026,
0.08365676552057266,
-0.07346952706575394,
-0.011728133074939251,
0.027770210057497025,
-0.0142567940056324,
-0.05236341059207916,
0.16724057495594025,
-0.05313678830862045,
0.09873795509338379,
0.06277532130479813,
0.14089561998844147,
0.129648819565773,
0.00693359924480319,
-0.017955999821424484,
0.03663928806781769,
-0.186164990067482,
-0.1530018299818039,
-0.009677466005086899,
0.13645662367343903,
0.05095386132597923,
0.004606286529451609,
-0.08916014432907104,
0.06594192236661911,
-0.0617133192718029,
-0.038331594318151474,
-0.03716760873794556,
-0.04691710323095322,
-0.007081606425344944,
0.04023781418800354,
0.08356378227472305,
0.03600841388106346,
-0.1303105503320694,
-0.06014459580183029,
0.0042911069467663765,
-0.0320228673517704,
0.069316066801548,
-0.08545617759227753,
0.10281305760145187,
-0.09292157739400864,
0.008717182092368603,
-0.21217411756515503,
-0.02372443489730358,
0.020403187721967697,
-0.0025614923797547817,
0.04839625582098961,
0.18425267934799194,
0.0457565039396286,
-0.07555491477251053,
-0.04075862839818001,
0.024661963805556297,
0.10906119644641876,
0.030873393639922142,
-0.09434150159358978,
-0.03989962488412857,
-0.02529698982834816,
-0.04894740879535675,
0.04456963390111923,
-0.057435497641563416,
-0.021255766972899437,
-0.03309156745672226,
0.14022201299667358,
0.012756970711052418,
0.03614116832613945,
0.046482596546411514,
-0.026114191859960556,
-0.05178702622652054,
-0.048470210283994675,
0.03495996445417404,
0.007725479081273079,
-0.06686172634363174,
0.0661378875374794,
0.01547669805586338,
0.1758395880460739,
0.020120419561862946,
-0.1983484923839569,
-0.01182787586003542,
0.13574093580245972,
-0.08533651381731033,
-0.008832033723592758,
-0.04308130219578743,
-0.005408295430243015,
0.10148349404335022,
-0.007472989149391651,
0.09766153991222382,
-0.11840708553791046,
-0.06491957604885101,
-0.012437624856829643,
-0.03786268085241318,
0.07117768377065659,
0.0409640371799469,
0.09116887301206589,
-0.29753759503364563,
0.04203777387738228,
-0.0245040412992239,
-0.10611674189567566,
-0.005676762200891972,
0.07101813703775406,
-0.02876855805516243,
0.053863152861595154,
0.11612610518932343,
-0.03183933347463608,
0.020761525258421898,
-0.00005852792673977092,
-0.03324451670050621,
0.037243928760290146,
-0.0012423167936503887,
0.0983223170042038,
-0.10666550695896149,
0.024874931201338768,
0.013785433024168015,
-0.015679486095905304,
-0.020919449627399445,
-0.025600329041481018,
-0.05315141752362251,
0.052686747163534164,
0.036368805915117264,
-0.2371356338262558,
-0.0028924611397087574,
-0.005232538096606731,
-0.02512277476489544,
0.16440872848033905,
0.05358618125319481,
-0.2295752465724945,
-0.03103199414908886,
0.00629393057897687,
-0.0024943866301327944,
0.0052668810822069645,
0.0428408719599247,
-0.044252410531044006,
-0.10348248481750488,
-0.06609442085027695,
-0.12494324892759323,
-0.02738994173705578,
0.026202041655778885,
-0.015179735608398914,
0.05285497382283211,
0.05713484063744545,
-0.08790765702724457,
0.03297245502471924,
-0.14130361378192902,
0.1752299666404724,
0.06751831620931625,
-0.09519936144351959,
0.16044364869594574,
0.09290865808725357,
-0.07296746969223022,
-0.022953087463974953,
-0.012904707342386246,
0.22519634664058685,
-0.012517676688730717,
0.07248782366514206,
-0.06760545819997787,
-0.037935320287942886,
0.06137803569436073,
0.11782412230968475,
-0.0016355756670236588,
-0.0250075776129961,
0.007432622369378805,
0.0541134737432003,
-0.03658771514892578,
-0.1717730015516281,
-0.07851550728082657,
-0.12269537150859833,
-0.08751895278692245,
0.007875068113207817,
0.03678343445062637,
-0.009463649243116379,
0.04687666520476341,
-0.011584214866161346,
0.013819600455462933,
0.07315380126237869,
0.09222257137298584,
0.2054165154695511,
0.03187379240989685,
0.13728536665439606,
-0.036318838596343994,
-0.19972831010818481,
0.06460157781839371,
-0.04557591676712036,
0.3405452370643616,
-0.07516656816005707,
0.13266366720199585,
0.04016118496656418,
0.037357643246650696,
0.08408187329769135,
0.1013806015253067,
-0.034406013786792755,
-0.024153882637619972,
-0.04781634733080864,
-0.011806064285337925,
0.09444808214902878,
0.023406613618135452,
0.002947345143184066,
-0.18683092296123505,
0.1360563188791275,
0.07598051428794861,
0.09523945301771164,
0.10806199908256531,
0.11063557863235474,
-0.21708083152770996,
0.07755684107542038,
-0.011830710805952549,
-0.11651749908924103,
-0.04385058581829071,
0.018487535417079926,
0.02898016758263111,
0.047949131578207016,
0.02970280312001705,
-0.08454074710607529,
0.1518000215291977,
-0.04980005696415901,
0.042482584714889526,
-0.057474277913570404,
0.10542367398738861,
0.019755708053708076,
0.07938607037067413,
-0.08012783527374268,
0.23402883112430573,
-0.010279166512191296,
0.027983834967017174,
-0.02836752124130726,
-0.03186126425862312,
-0.01137574203312397,
0.030665535479784012,
0.09457534551620483,
0.0043121883645653725,
0.08054882287979126,
0.10764294117689133,
-0.1040685847401619,
0.022516464814543724,
0.0686318501830101,
0.09849141538143158,
0.07721303403377533,
-0.015575258992612362,
-0.08514348417520523,
0.08710366487503052,
-0.015104921534657478,
-0.038157690316438675,
-0.09277252852916718,
0.07177035510540009,
0.12036065757274628,
-0.15980206429958344,
-0.020041722804307938,
-0.013873337768018246,
-0.02173949033021927,
0.3330283463001251,
0.13485810160636902,
-0.08723986148834229,
-0.07386726140975952,
0.03388960286974907,
0.1846686601638794,
-0.01748599298298359,
0.08418779075145721,
-0.057954322546720505,
-0.043862540274858475,
-0.005741976201534271,
-0.06388074904680252,
0.11779344826936722,
-0.09398870170116425,
0.03521501272916794,
-0.036969028413295746,
0.034967053681612015,
0.026109715923666954,
0.016850637272000313,
0.03377268463373184,
-0.0070596919395029545,
-0.1844356805086136,
-0.10318467020988464,
-0.13638417422771454,
-0.1712958812713623,
-0.08267021924257278,
0.014688319526612759,
-0.013942381367087364,
0.08620918542146683,
-0.03579001501202583,
0.10781548172235489,
0.06104154512286186,
0.16756758093833923,
-0.09758814424276352,
-0.016763176769018173,
0.15414172410964966,
0.04822775349020958,
-0.14768663048744202,
-0.11556953191757202,
-0.0037982433568686247,
-0.034317586570978165,
-0.041822753846645355,
0.00892091728746891,
0.08367784321308136,
-0.023139744997024536,
-0.029183607548475266,
0.03596312552690506,
-0.05675671994686127,
-0.05562782287597656,
0.14843177795410156,
-0.06491117179393768,
0.36110618710517883,
-0.035153716802597046,
-0.014370022341609001,
0.0051835523918271065,
-0.06103992462158203,
0.018741784617304802,
-0.008908497169613838,
0.09955327957868576,
0.01677027717232704,
-0.08562009036540985,
-0.005073921289294958,
-0.10055024921894073,
0.014164915308356285,
0.034517865628004074,
0.03656797856092453,
-0.011517254635691643,
-0.009497157298028469,
0.12528489530086517,
-0.03568115830421448,
0.12535525858402252,
0.11010757088661194,
0.025558363646268845,
-0.059648267924785614,
-0.07736240327358246,
-0.04817614704370499,
0.013177255168557167,
0.05209643393754959,
-0.04333425685763359,
-0.18138039112091064,
0.009168153628706932,
0.07362141460180283,
0.05700509250164032,
0.18681618571281433,
0.07809240370988846,
-0.027889417484402657,
0.13086621463298798,
0.14060211181640625,
-0.14578260481357574,
-0.03836912661790848,
0.03366697579622269,
0.022886818274855614,
0.038910821080207825,
-0.08351664990186691,
0.0361802764236927,
0.1290920525789261,
-0.04451672360301018,
-0.029988043010234833,
0.05199015885591507,
-0.1157383844256401,
-0.06683266162872314,
0.09260601550340652,
-0.13994821906089783,
-0.08077123016119003,
-0.008944499306380749,
-0.13494864106178284,
-0.030784886330366135,
0.09571581333875656,
0.03981475159525871,
-0.12206453084945679,
-0.015254728496074677,
0.09235741943120956,
0.0033413313794881105,
0.007993821986019611,
0.12718993425369263,
0.10869372636079788,
0.017056114971637726,
-0.06940693408250809,
0.06785110384225845,
0.03717350587248802,
0.08112829923629761,
-0.08123371750116348,
0.026655109599232674,
-0.17600186169147491,
-0.057174813002347946,
-0.01485341228544712,
0.020230215042829514,
-0.10733092576265335,
-0.06538756191730499,
-0.14462479948997498,
-0.014597443863749504,
0.11461328715085983,
0.04062458127737045,
0.12494852393865585,
0.019381247460842133,
-0.040734995156526566,
0.04295855760574341,
-0.04254986345767975,
0.07627087086439133,
0.14777140319347382,
0.06882182508707047,
-0.14657078683376312,
-0.09201253950595856,
0.008901463821530342,
0.044715896248817444,
-0.0979815274477005,
0.04876513034105301,
-0.07558796554803848,
0.03799400478601456,
-0.1285657286643982,
-0.03464201092720032,
-0.08292727172374725,
-0.009258167818188667,
-0.05407729372382164,
-0.08138620853424072,
-0.10577517002820969,
0.037833940237760544,
-0.05475962162017822,
0.0022240455728024244,
0.012417139485478401,
-0.1193958967924118,
-0.09170860052108765,
-0.010989891365170479,
0.01785111427307129,
-0.04121108725667,
0.07994705438613892,
-0.05156933516263962,
-0.09623216837644577,
-0.045598581433296204,
-0.012462465092539787,
-0.15815065801143646,
0.11365829408168793,
0.0362873412668705,
0.049180205911397934,
-0.06427057832479477,
0.06342228502035141,
0.07672584056854248,
0.10956671088933945,
-0.04170345142483711,
-0.04647894203662872,
-0.020221499726176262,
0.020915985107421875,
-0.116651751101017,
-0.04517306759953499,
-0.02074728161096573,
-0.008279159665107727,
0.07799716293811798,
0.1289614588022232,
0.2012457400560379,
0.053934674710035324,
-0.05620487034320831,
-0.04188268631696701,
0.022711079567670822,
0.003834279952570796,
-0.19054445624351501,
0.08149775117635727,
-0.06070924177765846,
-0.02878737263381481,
-0.02536720037460327,
0.2522051930427551,
0.04202989488840103,
-0.0008002888644114137,
0.021282415837049484,
0.043884195387363434,
-0.03715701028704643,
-0.003363383933901787,
0.013982892967760563,
0.04431944712996483,
-0.05887417867779732,
0.0059201824478805065,
0.02274780161678791,
0.13662244379520416,
-0.02712577022612095,
0.14495904743671417,
0.07320567220449448,
-0.004892710130661726,
0.13817083835601807,
-0.061061982065439224,
-0.04839989170432091,
-0.17395944893360138,
0.06255700439214706,
-0.05344715341925621,
0.04382956773042679,
-0.018789909780025482,
-0.03335772827267647,
0.2123793363571167,
0.02989266626536846,
0.060951583087444305,
0.060350608080625534,
-0.022051258012652397,
-0.10567977279424667,
-0.10129439830780029,
-0.07215217500925064,
-0.05570110306143761,
-0.02155601605772972,
-0.06314361095428467,
-0.07317902892827988,
0.017728347331285477,
0.06009748578071594,
-0.03547242656350136,
0.10995049774646759,
0.06676764041185379,
-0.09702487289905548,
0.06802113354206085,
-0.07835233956575394,
-0.006883611902594566,
0.1358335167169571,
-0.0017678852891549468,
-0.04570680484175682,
0.06987367570400238,
0.012492740526795387,
0.010597982443869114,
0.013973328284919262,
0.06980808824300766,
-0.03353742137551308,
-0.04357549548149109,
-0.10287945717573166,
0.055229853838682175,
0.01943396031856537,
0.025081243366003036,
-0.011494354344904423,
-0.058789752423763275,
0.04867449402809143,
0.20083987712860107,
-0.03279688209295273,
-0.1316072642803192,
-0.09814755618572235,
0.12269315868616104,
0.05848907306790352,
0.04042321443557739,
-0.05099937692284584,
-0.0674552470445633,
-0.030398553237318993,
0.3248636722564697,
0.2073773443698883,
-0.05830894783139229,
-0.014894777908921242,
0.06637739390134811,
-0.01609799452126026,
0.07563360780477524,
-0.02967010624706745,
0.042917873710393906,
0.30421119928359985,
-0.032542966306209564,
-0.10782576352357864,
-0.03773955628275871,
-0.025585131719708443,
-0.08150696754455566,
0.00997842289507389,
0.1543944627046585,
-0.0069344379007816315,
-0.12926402688026428,
0.08259481191635132,
-0.11746954172849655,
-0.09988265484571457,
-0.1328197419643402,
-0.001865290803834796,
-0.06347253173589706,
0.017096279188990593,
0.08034369349479675,
-0.047560352832078934,
0.08386074751615524,
-0.05564609542489052,
-0.017829863354563713,
-0.049277205020189285,
0.025631587952375412,
-0.0929768830537796,
0.04231100156903267,
0.0670943558216095,
0.12725763022899628,
0.0017699551535770297,
0.0036565857008099556,
0.03968476876616478,
0.1234951987862587,
-0.07275764644145966,
-0.16784241795539856,
0.0910509005188942,
0.014792480506002903,
-0.060669172555208206,
0.04899998754262924,
-0.021186457946896553,
-0.035200007259845734,
-0.021604305133223534,
0.008906199596822262,
0.0031697219237685204,
0.011728017590939999,
0.024289045482873917,
0.05090978741645813,
-0.12696626782417297,
-0.03557369485497475,
-0.06300537288188934,
0.0624786838889122,
0.02859751135110855,
-0.040356919169425964,
0.04079199209809303,
-0.09190019965171814,
-0.012214843183755875,
0.029434124007821083,
0.051009226590394974,
-0.020750150084495544,
-0.08359696716070175,
-0.026435382664203644,
0.052097391337156296,
0.03593093529343605,
-0.05745232105255127,
0.011526044458150864,
-0.02121553011238575,
-0.01709897629916668,
-0.01929357647895813,
0.0755009576678276,
0.17252430319786072,
0.0001194943324662745,
0.00788762979209423,
-0.01961638033390045,
-0.04470796510577202,
-0.015769099816679955,
-0.13795515894889832,
-0.01685832068324089
] |
null | null |
transformers
|
# regnety_004
Implementation of RegNet proposed in [Designing Network Design
Spaces](https://arxiv.org/abs/2003.13678)
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints based on
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!)
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.

The paper is really well written and very interesting; I highly
recommend reading it.
``` python
RegNet.regnetx_002()
RegNet.regnetx_004()
RegNet.regnetx_006()
RegNet.regnetx_008()
RegNet.regnetx_016()
RegNet.regnetx_040()
RegNet.regnetx_064()
RegNet.regnetx_080()
RegNet.regnetx_120()
RegNet.regnetx_160()
RegNet.regnetx_320()
# Y variants (with SE)
RegNet.regnety_002()
# ...
RegNet.regnety_320()
```
You can easily customize your model.
Examples:
``` python
import torch
from torch import nn
from functools import partial

from glasses.models import RegNet  # assumed import path
# RegNetYBotteneckBlock, ResNetStemC and ResNetShorcutD also ship with glasses (imports omitted here)

# change activation
RegNet.regnetx_004(activation=nn.SELU)
# change number of classes (default is 1000)
RegNet.regnetx_004(n_classes=100)
# pass a different block
RegNet.regnetx_004(block=RegNetYBotteneckBlock)
# change the stem
model = RegNet.regnetx_004(stem=ResNetStemC)
# change the shortcut
model = RegNet.regnetx_004(block=partial(RegNetYBotteneckBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = RegNet.regnetx_004()
# first call .features; this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 32, 56, 56]), torch.Size([1, 64, 28, 28]), torch.Size([1, 160, 14, 14])]
```
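If you just want to run a quick forward pass with this exact variant, here is a minimal sketch. It assumes `RegNet` is importable from `glasses.models` (the import path is not confirmed by this card), that the constructor defaults to 1000 ImageNet classes, and it feeds a random tensor instead of a properly normalized image:
``` python
import torch
from glasses.models import RegNet  # assumed import path

model = RegNet.regnety_004()  # this card's variant; assumed to default to 1000 classes
model.eval()

# dummy 224x224 RGB batch; replace with a real, ImageNet-normalized image tensor
x = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    logits = model(x)               # expected shape: (1, 1000)
    probs = logits.softmax(dim=-1)

print(probs.topk(5).indices)        # indices of the 5 most likely classes
```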
|
{}
| null |
glasses/regnety_004
|
[
"transformers",
"pytorch",
"arxiv:2003.13678",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2003.13678"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us
|
# regnety_004
Implementation of RegNet proposed in Designing Network Design
Spaces
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints based on
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!)
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.
!image
The paper is really well written and very interesting; I highly
recommend reading it.
Examples:
|
[
"# regnety_004\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n",
"# regnety_004\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
30,
168
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n# regnety_004\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
0.01351251546293497,
0.024365639314055443,
-0.0025744701270014048,
0.017712295055389404,
-0.03334663808345795,
-0.1326739639043808,
-0.011234686709940434,
0.036966584622859955,
-0.146514430642128,
-0.05724862962961197,
0.1176166906952858,
-0.02518608048558235,
-0.11321226507425308,
0.13214339315891266,
-0.08249181509017944,
-0.20696282386779785,
0.045084916055202484,
0.07595542818307877,
-0.05235728994011879,
0.11974959075450897,
0.15649062395095825,
-0.06204254925251007,
0.029487350955605507,
0.1112031564116478,
-0.06798672676086426,
0.020828962326049805,
-0.07255768775939941,
-0.03466999903321266,
0.019978811964392662,
0.006391346920281649,
0.15987379848957062,
0.04635568708181381,
-0.02100915089249611,
-0.09578601270914078,
0.006403952371329069,
0.07239822298288345,
-0.01239679753780365,
0.07720736414194107,
0.07142503559589386,
0.1055295392870903,
0.10922247171401978,
0.01697177067399025,
-0.09029175341129303,
-0.007678466383367777,
-0.06465678662061691,
-0.025013290345668793,
-0.0021465725731104612,
0.024168331176042557,
0.05258398875594139,
0.07005748897790909,
-0.051813844591379166,
0.16337715089321136,
-0.180980384349823,
0.07727014273405075,
0.22254829108715057,
-0.37718722224235535,
-0.04901448264718056,
0.0779632031917572,
0.16044089198112488,
-0.04676419496536255,
0.00747880432754755,
0.05499894917011261,
0.0038676420226693153,
-0.031811803579330444,
0.1706119179725647,
-0.06444648653268814,
-0.12185527384281158,
0.008259931579232216,
-0.20715513825416565,
0.007547444198280573,
0.13202303647994995,
0.05674521252512932,
0.026323745027184486,
-0.05905016511678696,
-0.21010638773441315,
0.03060411661863327,
-0.027834607288241386,
-0.09888016432523727,
0.00792712438851595,
0.059328436851501465,
-0.06824573129415512,
-0.12942713499069214,
-0.1089353933930397,
0.04563141241669655,
-0.11283358186483383,
0.18089696764945984,
0.04318613186478615,
-0.015009078197181225,
-0.025605764240026474,
0.0868600606918335,
-0.07027237117290497,
-0.01491759717464447,
0.029009031131863594,
-0.13344180583953857,
-0.13287466764450073,
-0.0014971193158999085,
-0.037059202790260315,
-0.08398433029651642,
-0.0013636213261634111,
0.16917625069618225,
0.06618135422468185,
-0.020285077393054962,
0.16546931862831116,
0.09780583530664444,
-0.02744586579501629,
0.08665578812360764,
-0.20617760717868805,
-0.07088647782802582,
0.08490345627069473,
-0.011056440882384777,
0.011447064578533173,
0.009627273306250572,
-0.05413518100976944,
-0.08269300311803818,
0.09577560424804688,
-0.00040135334711521864,
0.039930131286382675,
0.08885549008846283,
-0.04250752925872803,
-0.03783636540174484,
0.07794031500816345,
-0.0040579973720014095,
0.018758755177259445,
-0.03810764104127884,
-0.06286413967609406,
-0.092120461165905,
0.05045418068766594,
0.03672226890921593,
-0.10153307765722275,
0.05997083708643913,
-0.07977960258722305,
-0.023252902552485466,
-0.04098931699991226,
-0.13614162802696228,
-0.03151262551546097,
-0.082528255879879,
-0.02977648749947548,
-0.17740370333194733,
-0.1218760684132576,
0.03215480223298073,
0.08955036103725433,
-0.0016851888503879309,
-0.0010780594311654568,
0.019571347162127495,
-0.13054150342941284,
-0.15265235304832458,
-0.025863848626613617,
0.06679030507802963,
-0.043456416577100754,
-0.03167590871453285,
0.037254806607961655,
0.06565196067094803,
-0.06564591079950333,
0.039619430899620056,
-0.11630259454250336,
0.025526443496346474,
-0.09004148095846176,
0.09042946994304657,
-0.06316150724887848,
0.05272899195551872,
-0.06370846927165985,
-0.06848587840795517,
-0.12841586768627167,
0.0010235130321234465,
0.09230829030275345,
0.081975057721138,
-0.07110262662172318,
-0.011146347969770432,
0.023515911772847176,
-0.012778683565557003,
-0.052862633019685745,
0.16686256229877472,
-0.05316264182329178,
0.09885251522064209,
0.0629129558801651,
0.14077693223953247,
0.12943004071712494,
0.007173089776188135,
-0.019328314810991287,
0.036489613354206085,
-0.18559695780277252,
-0.1518452763557434,
-0.011174468323588371,
0.1368040293455124,
0.04995303973555565,
0.00543223088607192,
-0.0888356938958168,
0.06491243839263916,
-0.06201888620853424,
-0.03790951147675514,
-0.036765649914741516,
-0.04715364798903465,
-0.008613169193267822,
0.040090031921863556,
0.08389119803905487,
0.035631950944662094,
-0.13063377141952515,
-0.0649452805519104,
0.00399225065484643,
-0.03224857151508331,
0.06889455765485764,
-0.08504509925842285,
0.10347211360931396,
-0.09325367212295532,
0.008788754232227802,
-0.21254560351371765,
-0.024577433243393898,
0.019813943654298782,
0.00014706944057252258,
0.04874042794108391,
0.1826152205467224,
0.04563116654753685,
-0.07555004954338074,
-0.04102207347750664,
0.02528403140604496,
0.11079748719930649,
0.03062305971980095,
-0.09345965832471848,
-0.04103266820311546,
-0.02498294785618782,
-0.04812535271048546,
0.043198056519031525,
-0.0573340579867363,
-0.021807508543133736,
-0.03325766324996948,
0.13818776607513428,
0.01215639803558588,
0.03636260703206062,
0.04679662361741066,
-0.025278450921177864,
-0.05166802182793617,
-0.04852011054754257,
0.03549771010875702,
0.0074325622990727425,
-0.06649511307477951,
0.0638565719127655,
0.014034289866685867,
0.1766996681690216,
0.020884141325950623,
-0.20034155249595642,
-0.014201764017343521,
0.1363697350025177,
-0.0847366526722908,
-0.008371485397219658,
-0.040983766317367554,
-0.005190794821828604,
0.10340781509876251,
-0.008393564261496067,
0.09857680648565292,
-0.1174183264374733,
-0.06540493667125702,
-0.01176695991307497,
-0.03516373783349991,
0.07224522531032562,
0.041690632700920105,
0.09018117189407349,
-0.29411834478378296,
0.04115555062890053,
-0.02561614289879799,
-0.10487775504589081,
-0.0062460582703351974,
0.0704907476902008,
-0.02817337028682232,
0.05473896861076355,
0.11605380475521088,
-0.031266894191503525,
0.021582957357168198,
-0.002272943500429392,
-0.03406975418329239,
0.03727978095412254,
-0.001459773862734437,
0.09741541743278503,
-0.10493871569633484,
0.025307657197117805,
0.013601329177618027,
-0.014974416233599186,
-0.02126637101173401,
-0.025046957656741142,
-0.053315628319978714,
0.05186333879828453,
0.03692799434065819,
-0.23811857402324677,
-0.0029779337346553802,
-0.0047074914909899235,
-0.023735325783491135,
0.16575761139392853,
0.053153496235609055,
-0.22965338826179504,
-0.030707087367773056,
0.0018826484447345138,
-0.003883597208186984,
0.004649795591831207,
0.042950477451086044,
-0.04614678770303726,
-0.10448548942804337,
-0.06534047424793243,
-0.12347768247127533,
-0.028158031404018402,
0.025727123022079468,
-0.014946775510907173,
0.051852885633707047,
0.057472653687000275,
-0.0891045406460762,
0.033380068838596344,
-0.14155834913253784,
0.17412254214286804,
0.06775210052728653,
-0.09383473545312881,
0.1613466739654541,
0.09116873145103455,
-0.07177158445119858,
-0.022667178884148598,
-0.013580510392785072,
0.22809596359729767,
-0.013574883341789246,
0.07246897369623184,
-0.06920135021209717,
-0.03765895590186119,
0.060284651815891266,
0.11691532284021378,
-0.00042025139555335045,
-0.025682423263788223,
0.00688744755461812,
0.05476018041372299,
-0.03629356250166893,
-0.17090721428394318,
-0.07852567732334137,
-0.1231505423784256,
-0.08732426166534424,
0.008447502739727497,
0.03642510250210762,
-0.008477416820824146,
0.047130800783634186,
-0.010747622698545456,
0.015505736693739891,
0.07264973223209381,
0.09277404844760895,
0.20998558402061462,
0.030852582305669785,
0.137603297829628,
-0.03689904510974884,
-0.20038358867168427,
0.0636100023984909,
-0.045905906707048416,
0.34280693531036377,
-0.07555820792913437,
0.13419575989246368,
0.04064953699707985,
0.037404611706733704,
0.08305744081735611,
0.10239051282405853,
-0.03616403415799141,
-0.02468068338930607,
-0.04794212803244591,
-0.012028277851641178,
0.09668314456939697,
0.02125624381005764,
-0.00023109104949980974,
-0.1867377907037735,
0.13559691607952118,
0.07427212595939636,
0.09503845870494843,
0.10833387821912766,
0.10916142165660858,
-0.21960267424583435,
0.07733495533466339,
-0.01411699689924717,
-0.11605898290872574,
-0.04421232268214226,
0.018768971785902977,
0.025032177567481995,
0.048191916197538376,
0.030288884416222572,
-0.08450911194086075,
0.1516079604625702,
-0.05093120038509369,
0.04348301514983177,
-0.05615973100066185,
0.10725449025630951,
0.01856551691889763,
0.07882434874773026,
-0.0802319198846817,
0.23302112519741058,
-0.009951145388185978,
0.028978200629353523,
-0.028522782027721405,
-0.03269766643643379,
-0.010970844887197018,
0.03275386244058609,
0.0942276269197464,
0.003481533844023943,
0.0831679180264473,
0.10633612424135208,
-0.10276073217391968,
0.023380298167467117,
0.06856955587863922,
0.09903625398874283,
0.0778341293334961,
-0.01519565749913454,
-0.08552804589271545,
0.08701184391975403,
-0.016697414219379425,
-0.03741840645670891,
-0.09372686594724655,
0.0713203102350235,
0.12179053574800491,
-0.15847797691822052,
-0.01887180469930172,
-0.014074213802814484,
-0.025843171402812004,
0.3331182897090912,
0.13676239550113678,
-0.08722644299268723,
-0.07363954186439514,
0.0322258323431015,
0.18503209948539734,
-0.016608156263828278,
0.0852002277970314,
-0.05845455080270767,
-0.04543696716427803,
-0.004487236496061087,
-0.06350786983966827,
0.11765618622303009,
-0.09458813816308975,
0.036565836519002914,
-0.036333005875349045,
0.03384749963879585,
0.024812275543808937,
0.01595333218574524,
0.033509548753499985,
-0.005969140212982893,
-0.18650512397289276,
-0.10293170064687729,
-0.13723896443843842,
-0.17493867874145508,
-0.08164286613464355,
0.01439441554248333,
-0.015922505408525467,
0.08724743127822876,
-0.03608483448624611,
0.10850702226161957,
0.06047645956277847,
0.16442346572875977,
-0.09709928184747696,
-0.018210163339972496,
0.15407414734363556,
0.04877043142914772,
-0.1463082879781723,
-0.1143803820014,
-0.004640643484890461,
-0.03522396460175514,
-0.042788662016391754,
0.006634848657995462,
0.08586367219686508,
-0.024126609787344933,
-0.027487915009260178,
0.03999067470431328,
-0.0541289858520031,
-0.05518259480595589,
0.14925631880760193,
-0.06461963802576065,
0.3621748983860016,
-0.03573114797472954,
-0.013409058563411236,
0.006176817696541548,
-0.05951038748025894,
0.015868820250034332,
-0.009414419531822205,
0.10118799656629562,
0.016655389219522476,
-0.08481843769550323,
-0.00589995039626956,
-0.10020913183689117,
0.012516038492321968,
0.03579996898770332,
0.03665066882967949,
-0.011602948419749737,
-0.008543282747268677,
0.12563201785087585,
-0.035033125430345535,
0.12595191597938538,
0.10917390882968903,
0.02529185451567173,
-0.060536276549100876,
-0.0773022323846817,
-0.04857577383518219,
0.012535272166132927,
0.0514460988342762,
-0.043051473796367645,
-0.18210281431674957,
0.009032455272972584,
0.0732128694653511,
0.05651094391942024,
0.18613852560520172,
0.07814642786979675,
-0.02599312923848629,
0.12798534333705902,
0.14025869965553284,
-0.14706295728683472,
-0.041878700256347656,
0.03497576341032982,
0.022417156025767326,
0.039291150867938995,
-0.08185911923646927,
0.03457491099834442,
0.12885096669197083,
-0.04548380523920059,
-0.030038535594940186,
0.05247246101498604,
-0.11491608619689941,
-0.06705005466938019,
0.09251922369003296,
-0.13974227011203766,
-0.07876154035329819,
-0.00833185762166977,
-0.1388145536184311,
-0.030682316049933434,
0.09716171771287918,
0.03924047201871872,
-0.12137985229492188,
-0.014632400125265121,
0.09236712753772736,
0.002864937763661146,
0.009118168614804745,
0.12685567140579224,
0.10827305912971497,
0.017270242795348167,
-0.07039152085781097,
0.06667662411928177,
0.03801223635673523,
0.08011840283870697,
-0.08110792189836502,
0.02686334401369095,
-0.17756244540214539,
-0.05765136703848839,
-0.016427306458353996,
0.022924918681383133,
-0.10732600092887878,
-0.0647188201546669,
-0.14599700272083282,
-0.01557191926985979,
0.11583928018808365,
0.040633607655763626,
0.12473924458026886,
0.018206577748060226,
-0.0413394495844841,
0.04291630908846855,
-0.04379136860370636,
0.07646282762289047,
0.1457045078277588,
0.06978921592235565,
-0.1457216739654541,
-0.09470812231302261,
0.009773178026080132,
0.04385024681687355,
-0.09723588824272156,
0.04797698184847832,
-0.07533030211925507,
0.03760015219449997,
-0.12648530304431915,
-0.035911448299884796,
-0.08190286159515381,
-0.008391490206122398,
-0.05430760607123375,
-0.07959319651126862,
-0.10593869537115097,
0.038357678800821304,
-0.05359766259789467,
0.0018792301416397095,
0.014536804519593716,
-0.11912744492292404,
-0.09099795669317245,
-0.01018831878900528,
0.01622551493346691,
-0.041913118213415146,
0.08019163459539413,
-0.0531584732234478,
-0.09691242128610611,
-0.04678545892238617,
-0.010792658664286137,
-0.15776991844177246,
0.11224992573261261,
0.035451848059892654,
0.04923311620950699,
-0.0651315450668335,
0.0638142079114914,
0.0760510042309761,
0.11014574766159058,
-0.0409947969019413,
-0.046467725187540054,
-0.020759832113981247,
0.022528517991304398,
-0.11549986153841019,
-0.04339196905493736,
-0.02039043977856636,
-0.008108697831630707,
0.07693970203399658,
0.12803776562213898,
0.20130036771297455,
0.05353222414851189,
-0.0569261759519577,
-0.0417870432138443,
0.022614335641264915,
0.003105886746197939,
-0.192461296916008,
0.08118697255849838,
-0.06094204634428024,
-0.0278973039239645,
-0.02608749456703663,
0.25470906496047974,
0.042465690523386,
-0.001038839342072606,
0.021334156394004822,
0.04631752148270607,
-0.039500851184129715,
-0.0044904532842338085,
0.013979027979075909,
0.04454859718680382,
-0.059143632650375366,
0.005595607217401266,
0.024266038089990616,
0.13656826317310333,
-0.02751660905778408,
0.1426595002412796,
0.07260245084762573,
-0.004990208428353071,
0.13626644015312195,
-0.060793839395046234,
-0.04984277859330177,
-0.17211325466632843,
0.06355437636375427,
-0.05316878482699394,
0.04604261368513107,
-0.018089534714818,
-0.03548799827694893,
0.21177710592746735,
0.031340427696704865,
0.061834439635276794,
0.05915328115224838,
-0.022133931517601013,
-0.10352787375450134,
-0.09785003215074539,
-0.07188889384269714,
-0.05625394359230995,
-0.021218908950686455,
-0.06376203894615173,
-0.07366473972797394,
0.018734203651547432,
0.06135670095682144,
-0.03517477959394455,
0.11368865519762039,
0.06715545803308487,
-0.09610719233751297,
0.06921122223138809,
-0.07809661328792572,
-0.00778895104303956,
0.13565093278884888,
-0.0010309407953172922,
-0.04647982120513916,
0.06839270889759064,
0.012118563987314701,
0.010014473460614681,
0.014005793258547783,
0.0702422484755516,
-0.03292175754904747,
-0.043494079262018204,
-0.10259042680263519,
0.05571437254548073,
0.019316576421260834,
0.027642443776130676,
-0.012788278982043266,
-0.05914211645722389,
0.04754314944148064,
0.20039203763008118,
-0.031891923397779465,
-0.130451500415802,
-0.09843137115240097,
0.12229640781879425,
0.05884125828742981,
0.038078755140304565,
-0.0522221215069294,
-0.06685307621955872,
-0.028663381934165955,
0.3283718526363373,
0.2096903771162033,
-0.05738618224859238,
-0.015819383785128593,
0.06681510806083679,
-0.01619795337319374,
0.0764300599694252,
-0.02923964522778988,
0.04455016553401947,
0.3033990263938904,
-0.032746538519859314,
-0.10775096714496613,
-0.03938613831996918,
-0.024748116731643677,
-0.08267029374837875,
0.008546791039407253,
0.15668435394763947,
-0.00697901239618659,
-0.12849996984004974,
0.08263581991195679,
-0.11642618477344513,
-0.1002560630440712,
-0.13197743892669678,
-0.0017037406796589494,
-0.06376451253890991,
0.014804989099502563,
0.0797463059425354,
-0.04766983911395073,
0.08334759622812271,
-0.054430607706308365,
-0.01817007176578045,
-0.04915674030780792,
0.02516317181289196,
-0.09146104753017426,
0.04033558443188667,
0.0675559788942337,
0.12731069326400757,
0.0014567371690645814,
0.003076738677918911,
0.038612257689237595,
0.12392064929008484,
-0.0731949433684349,
-0.1676032692193985,
0.09180701524019241,
0.014373992569744587,
-0.060307957231998444,
0.05009613558650017,
-0.022204330191016197,
-0.0345686674118042,
-0.020980188623070717,
0.006992836017161608,
0.004372835624963045,
0.012597248889505863,
0.025798799470067024,
0.050551556050777435,
-0.12585283815860748,
-0.035918474197387695,
-0.06283608824014664,
0.06284072250127792,
0.030591044574975967,
-0.039676081389188766,
0.040587425231933594,
-0.09262526780366898,
-0.012750828638672829,
0.029586780816316605,
0.04792582988739014,
-0.021694594994187355,
-0.08491552621126175,
-0.026166323572397232,
0.05243375152349472,
0.035113830119371414,
-0.0616050660610199,
0.01151981484144926,
-0.019986389204859734,
-0.016831951215863228,
-0.019816884770989418,
0.07429223507642746,
0.17277991771697998,
0.0005339612835086882,
0.007901989854872227,
-0.013508138246834278,
-0.04510049149394035,
-0.016155395656824112,
-0.13840284943580627,
-0.016720077022910118
] |
null | null |
transformers
|
# regnety_006
Implementation of RegNet proposed in [Designing Network Design
Spaces](https://arxiv.org/abs/2003.13678)
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints based on
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!)
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.

The paper is really well written and very interesting; I highly
recommend reading it.
``` python
RegNet.regnetx_002()
RegNet.regnetx_004()
RegNet.regnetx_006()
RegNet.regnetx_008()
RegNet.regnetx_016()
RegNet.regnetx_040()
RegNet.regnetx_064()
RegNet.regnetx_080()
RegNet.regnetx_120()
RegNet.regnetx_160()
RegNet.regnetx_320()
# Y variants (with SE)
RegNet.regnety_002()
# ...
RegNet.regnety_320()
```
You can easily customize your model.
Examples:
``` python
import torch
from torch import nn
from functools import partial

from glasses.models import RegNet  # assumed import path
# RegNetYBotteneckBlock, ResNetStemC and ResNetShorcutD also ship with glasses (imports omitted here)

# change activation
RegNet.regnetx_004(activation=nn.SELU)
# change number of classes (default is 1000)
RegNet.regnetx_004(n_classes=100)
# pass a different block
RegNet.regnetx_004(block=RegNetYBotteneckBlock)
# change the stem
model = RegNet.regnetx_004(stem=ResNetStemC)
# change the shortcut
model = RegNet.regnetx_004(block=partial(RegNetYBotteneckBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = RegNet.regnetx_004()
# first call .features; this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 32, 112, 112]), torch.Size([1, 32, 56, 56]), torch.Size([1, 64, 28, 28]), torch.Size([1, 160, 14, 14])]
```
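If you just want to run a quick forward pass with this exact variant, here is a minimal sketch. It assumes `RegNet` is importable from `glasses.models` (the import path is not confirmed by this card), that the constructor defaults to 1000 ImageNet classes, and it feeds a random tensor instead of a properly normalized image:
``` python
import torch
from glasses.models import RegNet  # assumed import path

model = RegNet.regnety_006()  # this card's variant; assumed to default to 1000 classes
model.eval()

# dummy 224x224 RGB batch; replace with a real, ImageNet-normalized image tensor
x = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    logits = model(x)               # expected shape: (1, 1000)
    probs = logits.softmax(dim=-1)

print(probs.topk(5).indices)        # indices of the 5 most likely classes
```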
|
{}
| null |
glasses/regnety_006
|
[
"transformers",
"pytorch",
"arxiv:2003.13678",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2003.13678"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us
|
# regnety_006
Implementation of RegNet proposed in Designing Network Design
Spaces
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints based on
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!)
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.
!image
The paper is really well written and very interesting; I highly
recommend reading it.
Examples:
|
[
"# regnety_006\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n",
"# regnety_006\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
30,
168
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n# regnety_006\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
0.014487121254205704,
0.02572806552052498,
-0.002556465333327651,
0.017966587096452713,
-0.03285202011466026,
-0.1327628195285797,
-0.010050980374217033,
0.03709428757429123,
-0.1439252346754074,
-0.057301610708236694,
0.11745420098304749,
-0.0250018909573555,
-0.11204412579536438,
0.13101275265216827,
-0.08290412276983261,
-0.20583476126194,
0.0459219254553318,
0.07479992508888245,
-0.05318852514028549,
0.12020492553710938,
0.15610231459140778,
-0.06152290850877762,
0.028991224244236946,
0.11021243035793304,
-0.07023125141859055,
0.021691348403692245,
-0.07251103967428207,
-0.03351534530520439,
0.01931239292025566,
0.005136662162840366,
0.1603676676750183,
0.04788503423333168,
-0.02020864188671112,
-0.09486914426088333,
0.0064256698824465275,
0.0724458247423172,
-0.011987443082034588,
0.07700920850038528,
0.07180190831422806,
0.10725993663072586,
0.11155380308628082,
0.018466074019670486,
-0.0913868397474289,
-0.007549724541604519,
-0.0643966794013977,
-0.02263643778860569,
-0.0025771302171051502,
0.02368796244263649,
0.05401120334863663,
0.07099771499633789,
-0.05190863460302353,
0.16384264826774597,
-0.18179525434970856,
0.0765443667769432,
0.22279945015907288,
-0.3742867708206177,
-0.04858900606632233,
0.07918433099985123,
0.1590244621038437,
-0.046745870262384415,
0.006898105144500732,
0.052861228585243225,
0.0040057054720819,
-0.03210548684000969,
0.1672828048467636,
-0.06388533860445023,
-0.12555629014968872,
0.0074215419590473175,
-0.20722395181655884,
0.007072348613291979,
0.1314484030008316,
0.05766885727643967,
0.026458648964762688,
-0.057191330939531326,
-0.2102120965719223,
0.032313358038663864,
-0.026968052610754967,
-0.09961521625518799,
0.007995791733264923,
0.059277404099702835,
-0.0674857348203659,
-0.12952390313148499,
-0.10996560752391815,
0.04399760439991951,
-0.11381007730960846,
0.18108190596103668,
0.042650435119867325,
-0.014341759495437145,
-0.027439134195446968,
0.08675029128789902,
-0.06928718090057373,
-0.015388992615044117,
0.028168993070721626,
-0.13365089893341064,
-0.13278453052043915,
-0.0010423591593280435,
-0.03580809012055397,
-0.0856221616268158,
-0.0013385788770392537,
0.16407950222492218,
0.06570395827293396,
-0.019326819106936455,
0.16510836780071259,
0.09769798815250397,
-0.02764013037085533,
0.08548867702484131,
-0.2041185349225998,
-0.06695441901683807,
0.08322346210479736,
-0.010074925608932972,
0.010285777971148491,
0.009682434611022472,
-0.05439413711428642,
-0.08212060481309891,
0.09558572620153427,
-0.0010491457069292665,
0.040109552443027496,
0.0885552316904068,
-0.04342687129974365,
-0.03774898499250412,
0.0785021036863327,
-0.0038139442913234234,
0.018552085384726524,
-0.03793862834572792,
-0.06349378824234009,
-0.09331350773572922,
0.05041080713272095,
0.03621623292565346,
-0.1014539822936058,
0.06069405376911163,
-0.08054249733686447,
-0.0231829471886158,
-0.04142513498663902,
-0.13574829697608948,
-0.03074100986123085,
-0.08419319987297058,
-0.029920659959316254,
-0.17679926753044128,
-0.12020990997552872,
0.03176494315266609,
0.08966068178415298,
-0.001526877167634666,
-0.00031187760760076344,
0.01792537420988083,
-0.13220661878585815,
-0.15231777727603912,
-0.025829335674643517,
0.0693734809756279,
-0.04295442998409271,
-0.030954234302043915,
0.0379115529358387,
0.0658194050192833,
-0.06658323109149933,
0.0396992564201355,
-0.11783741414546967,
0.024848388507962227,
-0.08706934750080109,
0.09023673832416534,
-0.06470192223787308,
0.05108713358640671,
-0.06467436999082565,
-0.06735637784004211,
-0.1298747956752777,
0.00035751634277403355,
0.09293951839208603,
0.08420483767986298,
-0.07082825154066086,
-0.010578314773738384,
0.024181483313441277,
-0.014117816463112831,
-0.05290555581450462,
0.16515026986598969,
-0.05283049866557121,
0.09939306229352951,
0.06102917343378067,
0.14119629561901093,
0.12988467514514923,
0.007293728180229664,
-0.020806875079870224,
0.03681119158864021,
-0.18566901981830597,
-0.15316778421401978,
-0.011457223445177078,
0.13653340935707092,
0.05194314569234848,
0.005231440532952547,
-0.08898304402828217,
0.06459053605794907,
-0.06206861510872841,
-0.037135619670152664,
-0.036904335021972656,
-0.04637797549366951,
-0.008899248205125332,
0.04127758368849754,
0.08362086117267609,
0.035648684948682785,
-0.12946677207946777,
-0.06651948392391205,
0.0034814628306776285,
-0.030850762501358986,
0.06799211353063583,
-0.08530750125646591,
0.10146390646696091,
-0.09429752826690674,
0.009020033292472363,
-0.21352733671665192,
-0.026216374710202217,
0.02051183208823204,
-0.0007460396154783666,
0.0496317520737648,
0.18376225233078003,
0.045787252485752106,
-0.07699327915906906,
-0.04079125076532364,
0.024620847776532173,
0.1098732203245163,
0.030339179560542107,
-0.0932406559586525,
-0.041350822895765305,
-0.024699006229639053,
-0.049538176506757736,
0.04197797179222107,
-0.05523284524679184,
-0.021834779530763626,
-0.03232529014348984,
0.13784804940223694,
0.01252822671085596,
0.03566157445311546,
0.046940822154283524,
-0.026223119348287582,
-0.051819588989019394,
-0.04897444695234299,
0.035641975700855255,
0.007165545132011175,
-0.06539151817560196,
0.06431476771831512,
0.014723190106451511,
0.17337076365947723,
0.020808057859539986,
-0.1963147073984146,
-0.014049464836716652,
0.13720452785491943,
-0.08489906787872314,
-0.008242187090218067,
-0.04137484356760979,
-0.0036247295793145895,
0.10332715511322021,
-0.00829920545220375,
0.0964708998799324,
-0.11776919662952423,
-0.06516053527593613,
-0.010750820860266685,
-0.03582146018743515,
0.07032165676355362,
0.042373646050691605,
0.09316980838775635,
-0.29930999875068665,
0.04227812960743904,
-0.02311673015356064,
-0.10407266765832901,
-0.006551901809871197,
0.06997407972812653,
-0.028680821880698204,
0.055657919496297836,
0.11730246245861053,
-0.03179359808564186,
0.021696612238883972,
-0.0024297316558659077,
-0.03304600343108177,
0.03692764788866043,
-0.00009314848284702748,
0.09781579673290253,
-0.10623687505722046,
0.02545679174363613,
0.012660471722483635,
-0.0143642732873559,
-0.022411441430449486,
-0.025075437501072884,
-0.053248293697834015,
0.05268510431051254,
0.036907684057950974,
-0.23884396255016327,
-0.0029960775282233953,
-0.005627746693789959,
-0.02409668080508709,
0.16656626760959625,
0.053725529462099075,
-0.22788985073566437,
-0.03016112558543682,
0.0020949605386704206,
-0.0039527881890535355,
0.0046295467764139175,
0.04281192272901535,
-0.04798092693090439,
-0.10431767255067825,
-0.06573355942964554,
-0.12379920482635498,
-0.028852680698037148,
0.024676969274878502,
-0.014171849004924297,
0.052325669676065445,
0.05645955726504326,
-0.089040607213974,
0.03269287943840027,
-0.1423209011554718,
0.1737082600593567,
0.06832653284072876,
-0.09454479068517685,
0.16048561036586761,
0.09133903682231903,
-0.07326973974704742,
-0.02244090475142002,
-0.012771488167345524,
0.2263079136610031,
-0.013183262199163437,
0.07197058200836182,
-0.06884275376796722,
-0.03640693426132202,
0.060850076377391815,
0.11769445240497589,
-0.0015923846513032913,
-0.025504207238554955,
0.00809377059340477,
0.0545642115175724,
-0.036486536264419556,
-0.17270931601524353,
-0.07787952572107315,
-0.12337962538003922,
-0.08813158422708511,
0.007998928427696228,
0.03680257871747017,
-0.010917918756604195,
0.04762915521860123,
-0.01004867348819971,
0.012251744978129864,
0.07206965237855911,
0.09277215600013733,
0.20711550116539001,
0.03192269057035446,
0.1366383284330368,
-0.03749164938926697,
-0.19902779161930084,
0.06283789873123169,
-0.04554031044244766,
0.3438211679458618,
-0.07533691078424454,
0.13172779977321625,
0.04065639153122902,
0.037593774497509,
0.08276714384555817,
0.10356242209672928,
-0.0351463183760643,
-0.025156037881970406,
-0.04844612628221512,
-0.01129491999745369,
0.09554389119148254,
0.021193325519561768,
-0.0004270566569175571,
-0.18677137792110443,
0.13486294448375702,
0.07665605843067169,
0.09300310909748077,
0.10834499448537827,
0.11067213118076324,
-0.21886824071407318,
0.07741580903530121,
-0.014048523269593716,
-0.11543842405080795,
-0.04469893127679825,
0.0186501182615757,
0.02615879848599434,
0.048657625913619995,
0.029596861451864243,
-0.08426370471715927,
0.15125823020935059,
-0.04911394789814949,
0.042697202414274216,
-0.05708210542798042,
0.10679558664560318,
0.018923800438642502,
0.07924189418554306,
-0.0811316967010498,
0.23563936352729797,
-0.010271347127854824,
0.02823844738304615,
-0.028786201030015945,
-0.03220503777265549,
-0.011455424129962921,
0.029682103544473648,
0.09559225291013718,
0.003917745314538479,
0.08071725815534592,
0.1083231270313263,
-0.10221841931343079,
0.024261146783828735,
0.07038635015487671,
0.10008279979228973,
0.07746314257383347,
-0.014929329976439476,
-0.0847284123301506,
0.08752437680959702,
-0.01591184176504612,
-0.03732446953654289,
-0.09477932006120682,
0.07103516906499863,
0.12051635980606079,
-0.15977708995342255,
-0.018619868904352188,
-0.013614150695502758,
-0.023165537044405937,
0.33349186182022095,
0.1368996500968933,
-0.08665988594293594,
-0.0729341059923172,
0.03341267630457878,
0.184651717543602,
-0.017363568767905235,
0.08286212384700775,
-0.06005392596125603,
-0.046242855489254,
-0.004440890625119209,
-0.06412798166275024,
0.11871790885925293,
-0.09440233558416367,
0.0363212451338768,
-0.03661935776472092,
0.03364809975028038,
0.025550110265612602,
0.016145898029208183,
0.03294578567147255,
-0.00668503250926733,
-0.1854725033044815,
-0.10275034606456757,
-0.1346399486064911,
-0.17544928193092346,
-0.08183831721544266,
0.015014773234724998,
-0.015414689667522907,
0.08467673510313034,
-0.03684299439191818,
0.10814222693443298,
0.0599467009305954,
0.16586601734161377,
-0.09658011794090271,
-0.018280768766999245,
0.15528464317321777,
0.048642244189977646,
-0.14740030467510223,
-0.11328788101673126,
-0.0036746272817254066,
-0.035203397274017334,
-0.04319861903786659,
0.007335766684263945,
0.08480213582515717,
-0.024723175913095474,
-0.02845717966556549,
0.03854641318321228,
-0.05412259325385094,
-0.05391255021095276,
0.14826489984989166,
-0.06330724060535431,
0.361095666885376,
-0.03479403629899025,
-0.01305034477263689,
0.007363345939666033,
-0.062088627368211746,
0.017883602529764175,
-0.005849780514836311,
0.1012921929359436,
0.016209207475185394,
-0.08393989503383636,
-0.00571850873529911,
-0.1010625883936882,
0.011651629582047462,
0.03554508090019226,
0.036978188902139664,
-0.01154535636305809,
-0.010397437028586864,
0.12564483284950256,
-0.034804780036211014,
0.12618175148963928,
0.11001010984182358,
0.026538070291280746,
-0.05830386281013489,
-0.07633154094219208,
-0.04996804520487785,
0.012607714161276817,
0.05143197998404503,
-0.04301045835018158,
-0.18133805692195892,
0.009454118087887764,
0.07333125174045563,
0.05584261566400528,
0.18722526729106903,
0.07838800549507141,
-0.024566305801272392,
0.12949639558792114,
0.14156322181224823,
-0.14678119122982025,
-0.03782596066594124,
0.03407684713602066,
0.023180140182375908,
0.03929401561617851,
-0.08383508026599884,
0.034041471779346466,
0.12963852286338806,
-0.04468943178653717,
-0.030176639556884766,
0.053370241075754166,
-0.1155681386590004,
-0.0669003501534462,
0.09288734942674637,
-0.13922759890556335,
-0.0810634046792984,
-0.007936372421681881,
-0.1352211833000183,
-0.03207862377166748,
0.09812357276678085,
0.03829773887991905,
-0.12128127366304398,
-0.014444814994931221,
0.09221009910106659,
0.003497152589261532,
0.009351743385195732,
0.12692254781723022,
0.10757463425397873,
0.017822420224547386,
-0.07053089886903763,
0.06829524040222168,
0.03750559687614441,
0.07816989719867706,
-0.0809788927435875,
0.027686364948749542,
-0.17690932750701904,
-0.05717102438211441,
-0.014029277488589287,
0.02171603962779045,
-0.10629425197839737,
-0.06356147676706314,
-0.14429928362369537,
-0.015999017283320427,
0.11541486531496048,
0.040799565613269806,
0.12432398647069931,
0.018192235380411148,
-0.04058477655053139,
0.043029073625802994,
-0.04505658522248268,
0.07579676061868668,
0.14455722272396088,
0.07027693092823029,
-0.1461995244026184,
-0.09733141213655472,
0.008845222182571888,
0.04454215615987778,
-0.09722459316253662,
0.04821062460541725,
-0.07511265575885773,
0.03676103428006172,
-0.12375447899103165,
-0.036278728395700455,
-0.08168637007474899,
-0.009174011647701263,
-0.054198067635297775,
-0.08033394813537598,
-0.10553759336471558,
0.039151549339294434,
-0.054792389273643494,
0.0015870695933699608,
0.013596246019005775,
-0.11889331042766571,
-0.09073377400636673,
-0.010015425272285938,
0.0166460070759058,
-0.04166746139526367,
0.07981101423501968,
-0.05216951295733452,
-0.09670098125934601,
-0.045575372874736786,
-0.01431460864841938,
-0.158184215426445,
0.11357778310775757,
0.0361919105052948,
0.049800265580415726,
-0.06456498056650162,
0.06334622949361801,
0.07630547136068344,
0.11034908145666122,
-0.04204683005809784,
-0.046862877905368805,
-0.02061266265809536,
0.023067886009812355,
-0.11772601306438446,
-0.044119883328676224,
-0.02076740376651287,
-0.007087526377290487,
0.07921311259269714,
0.12820902466773987,
0.2018732875585556,
0.053724221885204315,
-0.05627042055130005,
-0.04144754633307457,
0.022857818752527237,
0.0038752066902816296,
-0.19190765917301178,
0.07973507791757584,
-0.060775309801101685,
-0.02812862955033779,
-0.026082515716552734,
0.24937672913074493,
0.042424991726875305,
-0.0017220528097823262,
0.021394163370132446,
0.046380020678043365,
-0.04000860080122948,
-0.004857645835727453,
0.012026489712297916,
0.04395366460084915,
-0.05956564471125603,
0.008299327455461025,
0.023904815316200256,
0.13534748554229736,
-0.026782365515828133,
0.1460043042898178,
0.0732835903763771,
-0.006415197160094976,
0.1374017745256424,
-0.05992865934967995,
-0.049649499356746674,
-0.17174451053142548,
0.06474562734365463,
-0.052827928215265274,
0.044871389865875244,
-0.018545998260378838,
-0.034927528351545334,
0.21065615117549896,
0.03231906518340111,
0.06187421455979347,
0.05769313499331474,
-0.02171342633664608,
-0.10356062650680542,
-0.09754499793052673,
-0.07194340229034424,
-0.05593203008174896,
-0.020529931411147118,
-0.06350211054086685,
-0.07244398444890976,
0.019062718376517296,
0.06126509979367256,
-0.0357343927025795,
0.11353228241205215,
0.06775184720754623,
-0.0967385545372963,
0.06929923593997955,
-0.07893551141023636,
-0.00855894386768341,
0.1353600174188614,
0.00009868729102890939,
-0.04503566399216652,
0.06915553659200668,
0.012668746523559093,
0.01087400782853365,
0.012765642255544662,
0.06953849643468857,
-0.03338140249252319,
-0.0437593050301075,
-0.10276912152767181,
0.05617193505167961,
0.018933983519673347,
0.02867693267762661,
-0.012989208102226257,
-0.05818936228752136,
0.04804350063204765,
0.20148681104183197,
-0.03316113352775574,
-0.13044358789920807,
-0.09702950716018677,
0.1252167969942093,
0.058223657310009,
0.03877401724457741,
-0.051402606070041656,
-0.06770707666873932,
-0.0302360150963068,
0.3247493803501129,
0.20745617151260376,
-0.057727888226509094,
-0.015522939153015614,
0.06672047078609467,
-0.015893252566456795,
0.07535110414028168,
-0.028581652790308,
0.044478803873062134,
0.30364659428596497,
-0.03306381776928902,
-0.10676554590463638,
-0.038888465613126755,
-0.0245667677372694,
-0.08036614209413528,
0.008637904189527035,
0.1569235622882843,
-0.00732534471899271,
-0.12914927303791046,
0.08216334879398346,
-0.1166616827249527,
-0.10058149695396423,
-0.1333170086145401,
-0.0028691051993519068,
-0.06343763321638107,
0.015500121749937534,
0.0796925276517868,
-0.047334395349025726,
0.08465214818716049,
-0.05519258975982666,
-0.017619427293539047,
-0.051456745713949203,
0.025135492905974388,
-0.09258835017681122,
0.040923476219177246,
0.0665762647986412,
0.12363453954458237,
0.0019374858820810914,
0.0037511521950364113,
0.038287919014692307,
0.12387290596961975,
-0.07340548932552338,
-0.1687074601650238,
0.09238342195749283,
0.01411646418273449,
-0.06056395173072815,
0.050015855580568314,
-0.0213589109480381,
-0.03438377007842064,
-0.020962810143828392,
0.007254738826304674,
0.00213463231921196,
0.012830480933189392,
0.02595275640487671,
0.05113113671541214,
-0.12629637122154236,
-0.03431161493062973,
-0.06297557801008224,
0.0628972128033638,
0.030837494879961014,
-0.039290573447942734,
0.03998203203082085,
-0.09288646280765533,
-0.01260877214372158,
0.02938826195895672,
0.0501580610871315,
-0.021534213796257973,
-0.08410512655973434,
-0.0260353721678257,
0.051856644451618195,
0.035093214362859726,
-0.062258508056402206,
0.012845905497670174,
-0.020732585340738297,
-0.017250970005989075,
-0.01987767405807972,
0.07426411658525467,
0.17248888313770294,
0.0006583144422620535,
0.007699982728809118,
-0.016739580780267715,
-0.044595517218112946,
-0.015027862973511219,
-0.1384694129228592,
-0.01628931052982807
] |
null | null |
transformers
|
# regnety_008
Implementation of RegNet proposed in [Designing Network Design
Spaces](https://arxiv.org/abs/2003.13678)
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints based on
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!).
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.

The paper is really well written and very interesting; I highly
recommend reading it.
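As a rough, self-contained illustration of the idea (not the paper's actual procedure), the toy loop below samples configurations from a design space, scores them, and then narrows the space based on what the best samples share. The sampling ranges and the `train_and_score` stub are made up for the sketch:
``` python
import random

def train_and_score(config):
    # Placeholder: in the paper each sampled model is actually trained
    # and evaluated; here we just return a dummy score.
    return random.random()

# Toy design space: bottleneck ratio b and group width g per model
design_space = {"b": [0.25, 0.5, 1, 2, 4], "g": [1, 2, 4, 8, 16, 32]}

# Sample a population of models from the current space and rank them
population = [
    {"b": random.choice(design_space["b"]), "g": random.choice(design_space["g"])}
    for _ in range(500)
]
scored = sorted(population, key=train_and_score, reverse=True)

# Apply an empirical constraint observed on the best models,
# e.g. "the best models all share the same bottleneck ratio"
best_b = scored[0]["b"]
design_space["b"] = [best_b]  # the space shrinks; repeat with the next parameter
```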
``` python
RegNet.regnetx_002()
RegNet.regnetx_004()
RegNet.regnetx_006()
RegNet.regnetx_008()
RegNet.regnetx_016()
RegNet.regnetx_040()
RegNet.regnetx_064()
RegNet.regnetx_080()
RegNet.regnetx_120()
RegNet.regnetx_160()
RegNet.regnetx_320()
# Y variants (with SE)
RegNet.regnety_002()
# ...
RegNet.regnety_320()
```
You can easily customize your model
Examples:
``` python
# change activation
RegNet.regnetx_004(activation = nn.SELU)
# change number of classes (default is 1000)
RegNet.regnetx_004(n_classes=100)
# pass a different block
RegNet.regnetx_004(block=RegNetYBotteneckBlock)
# change the stem
model = RegNet.regnetx_004(stem=ResNetStemC)
# change the shortcut
model = RegNet.regnetx_004(block=partial(RegNetYBotteneckBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = RegNet.regnetx_004()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 32, 112, 112]), torch.Size([1, 32, 56, 56]), torch.Size([1, 64, 28, 28]), torch.Size([1, 160, 14, 14])]
```
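A minimal end-to-end sketch for this checkpoint's architecture follows. It assumes a `regnety_008` constructor exists alongside the variants listed above, that the import path is `glasses.models`, and that weights are randomly initialized unless loaded separately:
``` python
import torch
from glasses.models import RegNet  # assumed import path

model = RegNet.regnety_008(n_classes=1000).eval()  # assumed constructor name

with torch.no_grad():
    # Stand-in for a preprocessed 224x224 RGB image
    logits = model(torch.randn(1, 3, 224, 224))
    probs = logits.softmax(dim=-1)

print(probs.shape)  # torch.Size([1, 1000])
```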
|
{}
| null |
glasses/regnety_008
|
[
"transformers",
"pytorch",
"arxiv:2003.13678",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2003.13678"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us
|
# regnety_008
Implementation of RegNet proposed in Designing Network Design
Spaces
The main idea is to start with a high-dimensional search space and
iteratively reduce it by empirically applying constraints based on
the best-performing models sampled from the current search space.
The resulting models are light, accurate, and faster than
EfficientNets (up to 5x faster!).
For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the
bottleneck ratio $b_i$ for every stage $i$. The following table shows
all the restrictions applied from one search space to the next one.
!image
The paper is really well written and very interesting; I highly
recommend reading it.
Examples:
|
[
"# regnety_008\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n",
"# regnety_008\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
30,
168
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2003.13678 #endpoints_compatible #region-us \n# regnety_008\nImplementation of RegNet proposed in Designing Network Design\nSpaces\n\n The main idea is to start with a high dimensional search space and\n iteratively reduce the search space by empirically apply constrains\n based on the best performing models sampled by the current search\n space.\n\n The resulting models are light, accurate, and faster than\n EfficientNets (up to 5x times!)\n\n For example, to go from $AnyNet_A$ to $AnyNet_B$ they fixed the\n bottleneck ratio $b_i$ for all stage $i$. The following table shows\n all the restrictions applied from one search space to the next one.\n\n !image\n\n The paper is really well written and very interesting, I highly\n recommended read it.\n\n \n\n Examples:"
] |
[
0.013164542615413666,
0.026898430660367012,
-0.002556412247940898,
0.01796565391123295,
-0.032915957272052765,
-0.13260714709758759,
-0.009736540727317333,
0.03720356523990631,
-0.1442309319972992,
-0.05763379484415054,
0.11754785478115082,
-0.025805748999118805,
-0.1120627149939537,
0.12957783043384552,
-0.08266472816467285,
-0.20613083243370056,
0.04649335891008377,
0.07578179985284805,
-0.04888502135872841,
0.1195661723613739,
0.15647485852241516,
-0.062233537435531616,
0.029144015163183212,
0.11043868958950043,
-0.06785251200199127,
0.02147364802658558,
-0.07252320647239685,
-0.03434203937649727,
0.01975448615849018,
0.006573630031198263,
0.1603347510099411,
0.047458913177251816,
-0.02051622048020363,
-0.0971161276102066,
0.005894701462239027,
0.07258398085832596,
-0.012369124218821526,
0.07701205462217331,
0.07229156047105789,
0.10774350166320801,
0.11081741005182266,
0.01908656768500805,
-0.09073618054389954,
-0.007734519429504871,
-0.06437905132770538,
-0.022943692281842232,
-0.002910125767812133,
0.02674034796655178,
0.05413567274808884,
0.07023269683122635,
-0.05134942755103111,
0.16406764090061188,
-0.180669367313385,
0.07757442444562912,
0.22393333911895752,
-0.37517672777175903,
-0.04789477959275246,
0.07766691595315933,
0.1604604572057724,
-0.04545333608984947,
0.007826614193618298,
0.05280275270342827,
0.003817246062681079,
-0.032052867114543915,
0.1676071137189865,
-0.0641040951013565,
-0.12477507442235947,
0.007688846904784441,
-0.206816166639328,
0.006395699456334114,
0.13216765224933624,
0.058539554476737976,
0.026454804465174675,
-0.05832085385918617,
-0.21041671931743622,
0.027420157566666603,
-0.027126150205731392,
-0.099823497235775,
0.007703098934143782,
0.059450436383485794,
-0.06548530608415604,
-0.1292845606803894,
-0.10999312996864319,
0.04484294727444649,
-0.11333790421485901,
0.1806028187274933,
0.04292242228984833,
-0.014682788401842117,
-0.026606209576129913,
0.08667390048503876,
-0.07196871936321259,
-0.015657756477594376,
0.028225641697645187,
-0.13223610818386078,
-0.13259267807006836,
-0.0018951408565044403,
-0.03667580708861351,
-0.08604638278484344,
-0.0009570040856488049,
0.16937971115112305,
0.06571737676858902,
-0.01874157413840294,
0.16488902270793915,
0.09803277254104614,
-0.026939241215586662,
0.088042713701725,
-0.2059987187385559,
-0.06854887306690216,
0.08415082097053528,
-0.009304962120950222,
0.011029529385268688,
0.008844858966767788,
-0.05483592674136162,
-0.08308713138103485,
0.0980541855096817,
-0.0013867876259610057,
0.04033345356583595,
0.08738842606544495,
-0.04309098422527313,
-0.037623729556798935,
0.08084169030189514,
-0.004215532913804054,
0.018871551379561424,
-0.038072582334280014,
-0.0630619153380394,
-0.09199357777833939,
0.05110940337181091,
0.037957388907670975,
-0.10170777142047882,
0.06060583144426346,
-0.07999415695667267,
-0.02289748750627041,
-0.04137541353702545,
-0.13560132682323456,
-0.030419809743762016,
-0.08460923284292221,
-0.029583614319562912,
-0.1774814873933792,
-0.12141631543636322,
0.03090144321322441,
0.08893243968486786,
-0.0005671670660376549,
0.00045504665467888117,
0.01874067261815071,
-0.13087661564350128,
-0.15305237472057343,
-0.02685508131980896,
0.06514859199523926,
-0.04368727281689644,
-0.03107050433754921,
0.038055434823036194,
0.0652470514178276,
-0.06897872686386108,
0.03962668403983116,
-0.11719819158315659,
0.025158394128084183,
-0.08649565279483795,
0.08937348425388336,
-0.06357872486114502,
0.05336567759513855,
-0.06267866492271423,
-0.0673576146364212,
-0.12833663821220398,
0.0012511956738308072,
0.09342370182275772,
0.08286183327436447,
-0.07187981903553009,
-0.009890553541481495,
0.023460131138563156,
-0.013695579022169113,
-0.05409581959247589,
0.16687998175621033,
-0.05218231678009033,
0.09655271470546722,
0.06284474581480026,
0.140996515750885,
0.13060952723026276,
0.007943887263536453,
-0.020278021693229675,
0.03639737516641617,
-0.18489094078540802,
-0.15444877743721008,
-0.011427897959947586,
0.1354673057794571,
0.052188146859407425,
0.005508407950401306,
-0.08809517323970795,
0.0647682175040245,
-0.062061987817287445,
-0.03767351433634758,
-0.03719362989068031,
-0.046828512102365494,
-0.010255994275212288,
0.041381753981113434,
0.08314129710197449,
0.035022247582674026,
-0.12988996505737305,
-0.0641017034649849,
0.004042783286422491,
-0.031214937567710876,
0.06860455125570297,
-0.08560999482870102,
0.10330624133348465,
-0.09317479282617569,
0.008079387247562408,
-0.21267586946487427,
-0.024125223979353905,
0.020024890080094337,
0.0007827988592907786,
0.047970402985811234,
0.18358498811721802,
0.04575493931770325,
-0.07645754516124725,
-0.040241580456495285,
0.024497823789715767,
0.11048291623592377,
0.03101504035294056,
-0.09483765065670013,
-0.039443012326955795,
-0.025605926290154457,
-0.048536721616983414,
0.03996347635984421,
-0.05601778253912926,
-0.02098403312265873,
-0.032924093306064606,
0.13837532699108124,
0.012372820638120174,
0.035467714071273804,
0.04645159840583801,
-0.025002120062708855,
-0.05231700465083122,
-0.04867890849709511,
0.035709839314222336,
0.006768562365323305,
-0.06647792458534241,
0.06208081170916557,
0.013861807063221931,
0.17647318542003632,
0.020565813407301903,
-0.1991814374923706,
-0.012888194061815739,
0.13684126734733582,
-0.08429763466119766,
-0.00826850812882185,
-0.04080468416213989,
-0.004846363328397274,
0.10366545617580414,
-0.007852202281355858,
0.09815569221973419,
-0.11769548058509827,
-0.06464210897684097,
-0.011785528622567654,
-0.03603995218873024,
0.07046531140804291,
0.04141150414943695,
0.09163223952054977,
-0.29313719272613525,
0.041372161358594894,
-0.02429702877998352,
-0.10561587661504745,
-0.007069853134453297,
0.0698876604437828,
-0.028723448514938354,
0.055809371173381805,
0.1167031079530716,
-0.0317959189414978,
0.02082941122353077,
-0.005915501154959202,
-0.03342986851930618,
0.03723587468266487,
-0.0006209397688508034,
0.09868159145116806,
-0.1062438040971756,
0.02541457861661911,
0.01355045661330223,
-0.014550589956343174,
-0.021230822429060936,
-0.026304183527827263,
-0.05362952873110771,
0.05228004232048988,
0.03627096861600876,
-0.2389000803232193,
-0.0018829877953976393,
-0.005526042077690363,
-0.02428528666496277,
0.16552741825580597,
0.053280048072338104,
-0.22986704111099243,
-0.03233083710074425,
0.0029449197463691235,
-0.004950165748596191,
0.00523735024034977,
0.0430351160466671,
-0.04583397135138512,
-0.10457025468349457,
-0.06652774661779404,
-0.12538152933120728,
-0.028437897562980652,
0.02556433342397213,
-0.013731024228036404,
0.051616620272397995,
0.057702671736478806,
-0.08918941020965576,
0.03251006454229355,
-0.14110729098320007,
0.17452965676784515,
0.06791523844003677,
-0.09421684592962265,
0.1612124890089035,
0.09233644604682922,
-0.07334364205598831,
-0.02228841371834278,
-0.012327549047768116,
0.22620148956775665,
-0.013034360483288765,
0.07242485880851746,
-0.06831426173448563,
-0.03810899704694748,
0.06098746880888939,
0.11691746860742569,
-0.0014520243275910616,
-0.025963809341192245,
0.007360450923442841,
0.05417686328291893,
-0.03627266734838486,
-0.17133013904094696,
-0.07768178731203079,
-0.12280896306037903,
-0.08740799874067307,
0.0077474163845181465,
0.03724591061472893,
-0.007780879270285368,
0.04749758914113045,
-0.010619929060339928,
0.014351469464600086,
0.07130814343690872,
0.09220341593027115,
0.20782950520515442,
0.031089330092072487,
0.13674111664295197,
-0.03728145360946655,
-0.19868429005146027,
0.06321866810321808,
-0.045856691896915436,
0.34022095799446106,
-0.07398378849029541,
0.1335420459508896,
0.0405140221118927,
0.03793727979063988,
0.0830698236823082,
0.10348182171583176,
-0.03556853160262108,
-0.02556619793176651,
-0.04756598919630051,
-0.011285990476608276,
0.09418650716543198,
0.021612441167235374,
-0.00021990160166751593,
-0.18642066419124603,
0.13496704399585724,
0.07537313550710678,
0.09362299740314484,
0.1080654039978981,
0.10974200069904327,
-0.21996858716011047,
0.07842779904603958,
-0.01380324736237526,
-0.11633431911468506,
-0.045735400170087814,
0.019728489220142365,
0.025732384994626045,
0.04938671737909317,
0.029833519831299782,
-0.08314958214759827,
0.15177451074123383,
-0.052177850157022476,
0.042683906853199005,
-0.057093337178230286,
0.10579357296228409,
0.01853492669761181,
0.07927467674016953,
-0.08352267742156982,
0.232350692152977,
-0.01057368703186512,
0.0287458598613739,
-0.02890545316040516,
-0.03228195011615753,
-0.012304076924920082,
0.03251340985298157,
0.09406659752130508,
0.0036629876121878624,
0.08343623578548431,
0.10669732838869095,
-0.10283208638429642,
0.023327860981225967,
0.0681244432926178,
0.1011609211564064,
0.07707337290048599,
-0.014365977607667446,
-0.08460469543933868,
0.08734185993671417,
-0.016665298491716385,
-0.03784995898604393,
-0.09419279545545578,
0.07123734802007675,
0.12318053841590881,
-0.15788811445236206,
-0.019680587574839592,
-0.013532723300158978,
-0.02639324776828289,
0.3319265842437744,
0.13620401918888092,
-0.0876350849866867,
-0.07325220853090286,
0.0322464257478714,
0.1849382221698761,
-0.016832133755087852,
0.08376628905534744,
-0.05930197238922119,
-0.04655316844582558,
-0.0046549830585718155,
-0.0629035085439682,
0.11821414530277252,
-0.09437579661607742,
0.035623013973236084,
-0.03641093522310257,
0.03456656262278557,
0.025518132373690605,
0.015697205439209938,
0.03278480842709541,
-0.006773519795387983,
-0.18537917733192444,
-0.10280050337314606,
-0.1366482824087143,
-0.17265957593917847,
-0.08329536765813828,
0.01497708447277546,
-0.014266232028603554,
0.08465337008237839,
-0.03647182881832123,
0.10811527073383331,
0.06024999916553497,
0.16558857262134552,
-0.09601209312677383,
-0.01891322247684002,
0.15595600008964539,
0.04842379689216614,
-0.145868718624115,
-0.11491890996694565,
-0.004716337192803621,
-0.03494619205594063,
-0.04267989099025726,
0.008122548460960388,
0.08525989949703217,
-0.02290552668273449,
-0.02768971212208271,
0.04013640806078911,
-0.055410273373126984,
-0.05479150637984276,
0.15053211152553558,
-0.06421833485364914,
0.36133873462677,
-0.0363447330892086,
-0.01401294581592083,
0.005588148720562458,
-0.06287040561437607,
0.015979120507836342,
-0.005499275401234627,
0.10086500644683838,
0.015771551057696342,
-0.08550627529621124,
-0.005239678081125021,
-0.1013801321387291,
0.01252138614654541,
0.03774610906839371,
0.03745940327644348,
-0.012430723756551743,
-0.010926494374871254,
0.12498302012681961,
-0.03512723371386528,
0.12688741087913513,
0.10966034978628159,
0.025752834975719452,
-0.06193620711565018,
-0.07702699303627014,
-0.04892346262931824,
0.012593146413564682,
0.05124116316437721,
-0.04257524758577347,
-0.18128584325313568,
0.00802907720208168,
0.0725875049829483,
0.056615691632032394,
0.18898029625415802,
0.07771402597427368,
-0.02617953158915043,
0.12948140501976013,
0.14066407084465027,
-0.15028242766857147,
-0.038525208830833435,
0.03320404887199402,
0.022790011018514633,
0.03955744206905365,
-0.0817665085196495,
0.0349222794175148,
0.1300087422132492,
-0.0451686792075634,
-0.029147977009415627,
0.052244964987039566,
-0.11519742757081985,
-0.06699008494615555,
0.09337609261274338,
-0.13970713317394257,
-0.07807211577892303,
-0.008762486279010773,
-0.13649381697177887,
-0.031793177127838135,
0.09851756691932678,
0.038371190428733826,
-0.12125388532876968,
-0.01402952428907156,
0.09187914431095123,
0.0029461176600307226,
0.008870881982147694,
0.1269216686487198,
0.109056755900383,
0.017665361985564232,
-0.070406973361969,
0.0683087632060051,
0.0370139516890049,
0.07963148504495621,
-0.08110231161117554,
0.027782388031482697,
-0.17688216269016266,
-0.0572955459356308,
-0.01563822105526924,
0.022911859676241875,
-0.10736840963363647,
-0.0658508688211441,
-0.14593447744846344,
-0.014397228136658669,
0.11558503657579422,
0.040660612285137177,
0.12464205920696259,
0.01842587999999523,
-0.04081796109676361,
0.04191748797893524,
-0.04462084919214249,
0.07548981159925461,
0.14512112736701965,
0.07026010006666183,
-0.14782297611236572,
-0.09463327378034592,
0.009355121292173862,
0.043396469205617905,
-0.09650767594575882,
0.048366792500019073,
-0.07500213384628296,
0.03781505674123764,
-0.12656566500663757,
-0.03653522580862045,
-0.08266672492027283,
-0.008528671227395535,
-0.05325398966670036,
-0.07998046278953552,
-0.10520216822624207,
0.038585115224123,
-0.05380672961473465,
0.0013267642352730036,
0.01390023436397314,
-0.11945564299821854,
-0.0907793864607811,
-0.010313179343938828,
0.015763062983751297,
-0.041675008833408356,
0.08008074015378952,
-0.05359598621726036,
-0.09670765697956085,
-0.04637105390429497,
-0.013750014826655388,
-0.15718546509742737,
0.11291053146123886,
0.03598238527774811,
0.04925793409347534,
-0.06226735934615135,
0.06357043236494064,
0.07698540389537811,
0.1103447675704956,
-0.04164860025048256,
-0.04792282357811928,
-0.02182801440358162,
0.02141786739230156,
-0.11592276394367218,
-0.04389025270938873,
-0.021003028377890587,
-0.008189278654754162,
0.07945268601179123,
0.12765920162200928,
0.2017996609210968,
0.053625885397195816,
-0.05621781200170517,
-0.041313592344522476,
0.023334523662924767,
0.004065985791385174,
-0.19113178551197052,
0.08049886673688889,
-0.061077017337083817,
-0.028062526136636734,
-0.02553226426243782,
0.25291892886161804,
0.041751690208911896,
-0.001125828712247312,
0.02104249969124794,
0.04556291550397873,
-0.03903058171272278,
-0.004689711611717939,
0.015169138088822365,
0.04395957663655281,
-0.05977196246385574,
0.006291558034718037,
0.024033010005950928,
0.1359071135520935,
-0.02860148809850216,
0.143717959523201,
0.07288670539855957,
-0.008380791172385216,
0.13811640441417694,
-0.060474179685115814,
-0.04942827299237251,
-0.17095720767974854,
0.06282511353492737,
-0.05344480648636818,
0.0453430600464344,
-0.018315039575099945,
-0.03348980471491814,
0.21069957315921783,
0.03203217312693596,
0.061016835272312164,
0.05934659391641617,
-0.021911924704909325,
-0.10300985723733902,
-0.09722782671451569,
-0.07219137251377106,
-0.05620832368731499,
-0.020809149369597435,
-0.06356451660394669,
-0.07260213792324066,
0.020891260355710983,
0.06115490198135376,
-0.036546021699905396,
0.11345065385103226,
0.06711823493242264,
-0.09578444063663483,
0.06957041472196579,
-0.07877232134342194,
-0.0074140457436442375,
0.13472528755664825,
-0.001468569040298462,
-0.04491717368364334,
0.06837675720453262,
0.012326187454164028,
0.010773371905088425,
0.014082781039178371,
0.06948388367891312,
-0.03247324377298355,
-0.04450361803174019,
-0.10250627994537354,
0.054913390427827835,
0.019767077639698982,
0.027842789888381958,
-0.012843738310039043,
-0.05827082321047783,
0.04774181917309761,
0.19935861229896545,
-0.032414600253105164,
-0.13103502988815308,
-0.09738422930240631,
0.1214028149843216,
0.06014220044016838,
0.038723014295101166,
-0.051800694316625595,
-0.06852097809314728,
-0.02836228348314762,
0.3259834945201874,
0.20511727035045624,
-0.056921038776636124,
-0.015753021463751793,
0.06656324863433838,
-0.01600617729127407,
0.07524293661117554,
-0.02918655052781105,
0.04413502290844917,
0.3019333779811859,
-0.031852252781391144,
-0.10782606899738312,
-0.039023976773023605,
-0.024762509390711784,
-0.08115679025650024,
0.007863438688218594,
0.15649732947349548,
-0.007196896709501743,
-0.12928663194179535,
0.08261436969041824,
-0.11562781780958176,
-0.10028532147407532,
-0.1315745711326599,
-0.0021657270845025778,
-0.06347734481096268,
0.015982776880264282,
0.08155198395252228,
-0.046936508268117905,
0.08337957412004471,
-0.05431624501943588,
-0.017532866448163986,
-0.04879486560821533,
0.02464168891310692,
-0.09343624860048294,
0.04128609970211983,
0.065590500831604,
0.12619413435459137,
0.002555854618549347,
0.003748243907466531,
0.03962276130914688,
0.12391489744186401,
-0.07267938554286957,
-0.16757723689079285,
0.09349701553583145,
0.013224910013377666,
-0.059568487107753754,
0.049781836569309235,
-0.022884035483002663,
-0.033201612532138824,
-0.02135188691318035,
0.007117062341421843,
0.002454001922160387,
0.012315955944359303,
0.02501041628420353,
0.050588276237249374,
-0.12665195763111115,
-0.03542329743504524,
-0.062232859432697296,
0.0632576197385788,
0.029318975284695625,
-0.03967124596238136,
0.03961552679538727,
-0.09224165230989456,
-0.012648234143853188,
0.029359349980950356,
0.04913642629981041,
-0.021702593192458153,
-0.08482346683740616,
-0.025461526587605476,
0.05129721760749817,
0.034422315657138824,
-0.061086300760507584,
0.013212737627327442,
-0.02059774100780487,
-0.01808343455195427,
-0.020097766071558,
0.07380291819572449,
0.17260447144508362,
0.0009658150956965983,
0.008139032870531082,
-0.013686338439583778,
-0.044616322964429855,
-0.01453987043350935,
-0.13927757740020752,
-0.016823792830109596
] |
null | null |
transformers
|
# resnet152
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
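For completeness, here is a small hedged sketch of running the default 1000-class head on an input tensor. The import path is an assumption, the model is randomly initialized unless weights are loaded separately, and the input is random rather than a real preprocessed image:
``` python
import torch
from glasses.models import ResNet  # assumed import path

model = ResNet.resnet152().eval()

# Stand-in for a preprocessed 224x224 RGB image (normalize real images first)
x = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    logits = model(x)                      # shape: (1, 1000) with the default head
    top5 = logits.topk(5, dim=-1).indices  # indices of the 5 highest-scoring classes

print(top5)
```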
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet152
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet152
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet152\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet152\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
25
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet152\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.10818974673748016,
0.06613535434007645,
-0.0024012227077037096,
0.08432082831859589,
0.09452369064092636,
0.007899167947471142,
-0.01924489066004753,
0.08886951953172684,
-0.0934191420674324,
-0.01676483452320099,
0.10410603880882263,
0.10288263857364655,
0.0018767660949379206,
-0.05798373371362686,
-0.05322994664311409,
-0.23582100868225098,
0.08791632950305939,
0.06915178894996643,
-0.010044611990451813,
0.05710233002901077,
0.07016177475452423,
-0.08305878192186356,
0.09625174105167389,
0.002720733406022191,
-0.1727951467037201,
0.0046561299823224545,
-0.014967839233577251,
-0.06821102648973465,
0.07885288447141647,
0.02480626292526722,
0.09655872732400894,
0.009024116210639477,
0.09739656746387482,
-0.10998407006263733,
0.03232359141111374,
0.04782625287771225,
-0.076600082218647,
0.05910027027130127,
0.10402549058198929,
-0.030962692573666573,
0.02968779020011425,
0.04967635124921799,
-0.07085110992193222,
-0.04556448012590408,
0.0002646059147082269,
-0.06924348324537277,
0.004084331449121237,
0.17049695551395416,
0.0832083597779274,
0.09114151448011398,
0.033138155937194824,
0.08168957382440567,
-0.11110606044530869,
0.0935872495174408,
0.1698964536190033,
-0.2841587960720062,
-0.050824712961912155,
0.11339621990919113,
-0.05733805522322655,
-0.08100832253694534,
-0.06992969661951065,
0.02873172238469124,
0.022912899032235146,
0.03527866676449776,
-0.12728169560432434,
0.0016196734504774213,
-0.08136111497879028,
-0.007716609630733728,
-0.08697943389415741,
-0.029094597324728966,
0.05198640376329422,
0.09180288016796112,
0.061459776014089584,
-0.0006724712438881397,
-0.15120454132556915,
0.014595634303987026,
-0.10414362698793411,
0.04383706673979759,
0.020500898361206055,
0.06109912320971489,
-0.0853133574128151,
-0.009397456422448158,
-0.059814561158418655,
-0.03859240189194679,
-0.14552322030067444,
-0.04052390158176422,
0.054574742913246155,
0.13129831850528717,
-0.11410915851593018,
0.02524024061858654,
0.03334755077958107,
-0.04146923869848251,
-0.003025235142558813,
-0.08030007034540176,
0.021225690841674805,
0.04696355387568474,
-0.07503014802932739,
-0.02937173657119274,
0.09247478097677231,
0.19497807323932648,
-0.06321350485086441,
-0.0027170295361429453,
-0.024664489552378654,
0.14059801399707794,
-0.0921318456530571,
0.009634926915168762,
-0.24355673789978027,
0.1166038066148758,
0.05069781839847565,
-0.09115725755691528,
0.019247399643063545,
-0.014707835391163826,
-0.08429466187953949,
-0.02777137979865074,
0.1166374683380127,
-0.03858016058802605,
-0.015596740879118443,
0.06856352835893631,
-0.011368952691555023,
-0.046101465821266174,
0.12007394433021545,
-0.03475494310259819,
-0.03462931886315346,
-0.003913551103323698,
-0.10051364451646805,
0.07618312537670135,
0.10784577578306198,
0.001157456892542541,
-0.0974477156996727,
0.005295210052281618,
-0.009849916212260723,
-0.00010044479131465778,
0.0039356062188744545,
-0.03499307110905647,
0.04266725480556488,
0.009517185389995575,
0.05849288031458855,
-0.215628519654274,
-0.15194834768772125,
0.005491227377206087,
0.11430788785219193,
-0.04044397920370102,
0.010132684372365475,
-0.012793872505426407,
-0.12587425112724304,
-0.0032227779738605022,
0.014827076345682144,
0.02986285090446472,
-0.071710966527462,
0.018216809257864952,
-0.13453340530395508,
0.12598088383674622,
-0.11149843037128448,
0.035101987421512604,
-0.08733487874269485,
-0.04117245599627495,
-0.08304166793823242,
0.09841156005859375,
-0.07604867964982986,
0.09721612185239792,
-0.13378110527992249,
-0.07959335297346115,
-0.06696052849292755,
-0.030269348993897438,
0.007139707449823618,
0.06924396008253098,
-0.09067720919847488,
-0.007013928145170212,
0.14957186579704285,
-0.10826050490140915,
-0.08683475852012634,
0.08240298926830292,
-0.03370048478245735,
-0.09868154674768448,
0.00013580590893980116,
0.15822412073612213,
0.021990181878209114,
0.0025826310738921165,
-0.04933902621269226,
0.02423732914030552,
-0.1329658180475235,
-0.12740769982337952,
-0.035159043967723846,
0.14381761848926544,
0.030383698642253876,
-0.004868263378739357,
-0.08881841599941254,
0.11255131661891937,
-0.08245785534381866,
-0.1206231415271759,
-0.04123382642865181,
-0.09995295852422714,
-0.014404512010514736,
0.04617143049836159,
0.1081310585141182,
0.06704960763454437,
-0.040828295052051544,
-0.14951471984386444,
0.05740690231323242,
-0.05289696156978607,
0.10893260687589645,
-0.09421804547309875,
0.1601564586162567,
-0.033528704196214676,
0.03380389139056206,
-0.23330508172512054,
0.01621529646217823,
0.0031641556415706873,
0.018680457025766373,
-0.03663589805364609,
-0.0492018461227417,
0.040244076400995255,
-0.05212138220667839,
-0.036860886961221695,
-0.07080142945051193,
0.07117331773042679,
-0.01630302704870701,
-0.006772866006940603,
-0.123683862388134,
-0.0027231285348534584,
-0.041787270456552505,
0.016199029982089996,
-0.12580405175685883,
-0.055377282202243805,
0.023546254262328148,
0.12719590961933136,
-0.02884654700756073,
0.08473373204469681,
-0.028521737083792686,
-0.058845046907663345,
-0.019571680575609207,
-0.05643630772829056,
0.09263110905885696,
0.01434566080570221,
-0.016980033367872238,
0.0678691491484642,
0.0951327532529831,
0.15411317348480225,
0.15047958493232727,
-0.28420814871788025,
-0.03741157054901123,
0.0821373388171196,
-0.030390610918402672,
-0.005559566896408796,
-0.007665461394935846,
0.15738451480865479,
0.10806018859148026,
-0.027662841603159904,
0.07192050665616989,
-0.05841565504670143,
0.09562568366527557,
0.012992111034691334,
-0.027537096291780472,
0.0048354556784033775,
0.04627570882439613,
0.08041856437921524,
-0.2708304822444916,
0.08712854981422424,
0.10492429882287979,
-0.164970263838768,
-0.04343029111623764,
-0.04452931880950928,
-0.06251415610313416,
0.10015841573476791,
0.07042593508958817,
-0.009785419330000877,
0.06660879403352737,
0.0354318730533123,
-0.023112017661333084,
0.06024568900465965,
-0.09546910226345062,
0.023622022941708565,
-0.1257723718881607,
0.02281082421541214,
-0.05036706477403641,
0.0033125565387308598,
0.01939280703663826,
-0.002633716445416212,
-0.0011915462091565132,
0.10859545320272446,
-0.011329010128974915,
-0.2343520075082779,
0.05162671580910683,
-0.06547699123620987,
-0.0277421772480011,
0.20912975072860718,
-0.025749923661351204,
-0.2372901737689972,
-0.007034441456198692,
-0.016703490167856216,
-0.07665075361728668,
-0.014531492255628109,
0.026334458962082863,
-0.11955146491527557,
-0.06504231691360474,
-0.02055683173239231,
-0.10805032402276993,
0.11234232038259506,
0.030934670940041542,
0.017544982954859734,
0.028675194829702377,
0.038074836134910583,
-0.05027708411216736,
0.014959021471440792,
-0.14106500148773193,
-0.08963130414485931,
0.13354797661304474,
-0.08637920767068863,
0.08019542694091797,
0.0834202691912651,
-0.016877876594662666,
0.019033264368772507,
0.02313399314880371,
0.1413944512605667,
-0.051363710314035416,
0.022213265299797058,
0.09558479487895966,
-0.011429932899773121,
0.05158742144703865,
0.08796259760856628,
0.008596833795309067,
-0.07150301337242126,
-0.009820508770644665,
0.027996599674224854,
-0.042378995567560196,
-0.09424053877592087,
-0.1342313438653946,
-0.03754584491252899,
-0.011499670334160328,
0.15764322876930237,
0.10332594066858292,
0.053407054394483566,
0.10027205944061279,
0.012897802516818047,
-0.0286250002682209,
0.06166284903883934,
0.06298922747373581,
0.10560159385204315,
0.037258341908454895,
0.1124640628695488,
-0.07867543399333954,
-0.06370984762907028,
0.03626532480120659,
0.10530783236026764,
0.25515568256378174,
-0.033889200538396835,
-0.12758585810661316,
0.045196857303380966,
0.2485973984003067,
0.10745706409215927,
0.1058669239282608,
-0.06723836064338684,
-0.014650007709860802,
0.002639430109411478,
0.00046689293230883777,
0.011366425082087517,
0.046682603657245636,
-0.013278908096253872,
-0.06882987171411514,
0.03682047873735428,
0.03606371954083443,
0.026798686012625694,
0.2447752207517624,
0.09385930001735687,
-0.3182317614555359,
0.010721088387072086,
-0.04229787364602089,
-0.00241122511215508,
-0.015864670276641846,
0.13384006917476654,
-0.00546931941062212,
-0.036874692887067795,
0.079063780605793,
-0.1307005137205124,
0.10628242790699005,
-0.057251859456300735,
0.009496713988482952,
0.045170966535806656,
-0.11850789934396744,
0.08485100418329239,
0.04925025627017021,
-0.1398787945508957,
0.1730283796787262,
-0.04620257019996643,
-0.06426499038934708,
-0.06841189414262772,
-0.04763828590512276,
0.10791481286287308,
0.1370832324028015,
0.24510100483894348,
0.016015078872442245,
-0.0072020129300653934,
0.08321953564882278,
-0.02675764076411724,
0.034245721995830536,
0.049749478697776794,
0.07697491347789764,
-0.060930948704481125,
0.03552933409810066,
-0.05349824205040932,
0.0392349548637867,
0.12831072509288788,
-0.02178393490612507,
-0.09988068044185638,
-0.05873989686369896,
0.06737210601568222,
-0.04521084949374199,
-0.03481964021921158,
-0.06525304168462753,
-0.07587043941020966,
0.12755712866783142,
0.028327416628599167,
-0.02749832719564438,
-0.08541560173034668,
0.09470177441835403,
0.06516164541244507,
-0.06197734922170639,
0.10710356384515762,
-0.049017637968063354,
0.02611648291349411,
-0.019811047241091728,
-0.2453279197216034,
0.17172232270240784,
-0.1161225289106369,
-0.007411592174321413,
0.029308203607797623,
-0.0642542913556099,
-0.010075208730995655,
-0.025685230270028114,
0.05770816653966904,
0.04079622030258179,
-0.1229582130908966,
-0.02413293346762657,
0.015345178544521332,
-0.04105302691459656,
0.020856326445937157,
0.08882645517587662,
-0.05565613880753517,
-0.08317764848470688,
-0.037159889936447144,
0.14998982846736908,
0.158731609582901,
-0.03966809809207916,
-0.10384392738342285,
0.008742857724428177,
0.11235277354717255,
0.031002579256892204,
-0.3863310217857361,
-0.08948057889938354,
-0.01996181719005108,
-0.0687824934720993,
-0.09561841189861298,
-0.12914659082889557,
0.23676630854606628,
-0.0045560626313090324,
-0.05980438366532326,
0.0021133110858500004,
-0.13933031260967255,
-0.04149642586708069,
0.25489744544029236,
0.07489411532878876,
0.28931447863578796,
-0.08863574266433716,
0.025557780638337135,
-0.058470796793699265,
0.018218612298369408,
0.09040924906730652,
-0.0009777230443432927,
0.07157332450151443,
-0.002067079534754157,
0.04593370482325554,
-0.040685512125492096,
-0.029225150123238564,
-0.0241328626871109,
0.11328720301389694,
0.09607915580272675,
-0.09926813840866089,
0.07219762355089188,
0.06050710752606392,
-0.017970694229006767,
0.051080573350191116,
0.08728888630867004,
0.11057693511247635,
-0.02037101238965988,
0.022898215800523758,
-0.092969611287117,
0.0512564592063427,
0.10535518825054169,
-0.0007568892324343324,
-0.10641773045063019,
0.06242360919713974,
0.10803931951522827,
0.08525972068309784,
0.28091666102409363,
0.1269017606973648,
0.031226063147187233,
0.06829903274774551,
0.00836858432739973,
-0.07850103080272675,
-0.0838339701294899,
-0.10201044380664825,
-0.0368652381002903,
0.14440001547336578,
-0.15058332681655884,
0.05898451432585716,
0.10098864883184433,
-0.0021526131313294172,
-0.0010125534608960152,
0.09998670220375061,
0.01962711103260517,
-0.03192717581987381,
0.0993560254573822,
-0.06458333134651184,
-0.023302489891648293,
0.011656776070594788,
0.0746946856379509,
0.10584282130002975,
0.14105188846588135,
0.10602646321058273,
-0.10345996916294098,
-0.0020022732205688953,
0.0238160640001297,
0.0770404264330864,
-0.0528142973780632,
0.06092532351613045,
0.14524300396442413,
-0.023580165579915047,
-0.14537359774112701,
0.1998865157365799,
0.007805156521499157,
-0.15158411860466003,
-0.024313589558005333,
-0.05755370110273361,
-0.1417725831270218,
-0.12860316038131714,
-0.012536868453025818,
-0.0004498759808484465,
-0.24187521636486053,
-0.08528973907232285,
-0.002678398508578539,
-0.07146645337343216,
0.11632100492715836,
0.029513750225305557,
0.13984788954257965,
-0.0206841379404068,
-0.03466323763132095,
-0.08050648123025894,
-0.03757909685373306,
0.019611096009612083,
0.014926774427294731,
0.06595045328140259,
-0.16212019324302673,
-0.20859374105930328,
0.015697384253144264,
0.14990943670272827,
-0.08144320547580719,
0.007223732303828001,
-0.08340299874544144,
0.09018861502408981,
-0.13027237355709076,
0.07061126828193665,
-0.05964643508195877,
0.028524182736873627,
-0.0596381276845932,
-0.03165270760655403,
-0.06237015128135681,
0.055145263671875,
-0.0975603237748146,
-0.024472668766975403,
0.0000973991263890639,
-0.025078486651182175,
-0.11482187360525131,
-0.03204797953367233,
-0.02030736766755581,
0.007104266434907913,
0.13337749242782593,
0.08333717286586761,
-0.10519818961620331,
0.0930681899189949,
-0.1417512595653534,
-0.25758084654808044,
0.1559239774942398,
0.04006470367312431,
-0.01916566677391529,
0.05874568596482277,
0.05538036674261093,
0.0746905580163002,
-0.04538731649518013,
-0.015559464693069458,
0.013894429430365562,
-0.046416446566581726,
-0.0177853275090456,
-0.23662036657333374,
0.020456556230783463,
-0.02946486882865429,
0.0033908223267644644,
0.08145192265510559,
0.10410173237323761,
0.11846307665109634,
-0.05730803310871124,
-0.0061925784684717655,
-0.03935740888118744,
-0.0037275943905115128,
-0.009310013614594936,
-0.11400236189365387,
-0.012151382863521576,
-0.036993879824876785,
0.022258765995502472,
-0.07756419479846954,
0.1637030839920044,
-0.018893970176577568,
-0.0977492406964302,
0.025302095338702202,
0.21901284158229828,
-0.1140279471874237,
0.07985396683216095,
0.18748795986175537,
0.0845615342259407,
-0.005268813110888004,
0.0484919399023056,
0.053434114903211594,
0.09858933836221695,
0.1878422200679779,
0.0036670949775725603,
0.24711838364601135,
0.03357064723968506,
0.11431510001420975,
0.07642687857151031,
-0.016985246911644936,
0.05105653405189514,
0.09098725020885468,
-0.09445679187774658,
0.10408880561590195,
0.004391776397824287,
0.003539885627105832,
0.13518153131008148,
0.030458495020866394,
0.00707822572439909,
0.01631692424416542,
-0.007168662268668413,
-0.08502525836229324,
-0.20598286390304565,
-0.10031009465456009,
-0.14422641694545746,
0.04845598340034485,
-0.029854750260710716,
-0.082390196621418,
0.21844008564949036,
0.08050581812858582,
-0.04448872432112694,
0.04902830719947815,
-0.08316866308450699,
-0.05179456248879433,
0.16156016290187836,
-0.008504977449774742,
-0.1061779335141182,
0.1378166526556015,
-0.049470867961645126,
0.06577543914318085,
0.0607542060315609,
-0.016564615070819855,
0.0018816267838701606,
0.06584744155406952,
0.15710429847240448,
-0.03821737319231033,
-0.042191315442323685,
-0.03174568712711334,
0.024217041209340096,
-0.04374988004565239,
0.0006446139304898679,
-0.01613815501332283,
0.030842142179608345,
0.046834949404001236,
0.23451964557170868,
-0.10412292182445526,
-0.07961664348840714,
-0.11460436880588531,
-0.04081594944000244,
-0.00048394553596153855,
0.06064639613032341,
-0.04795769602060318,
-0.03841521218419075,
-0.020583676174283028,
0.21316032111644745,
0.22996734082698822,
-0.18117570877075195,
0.003915432374924421,
0.08396941423416138,
0.003316595684736967,
-0.03594119846820831,
0.028861675411462784,
0.1695103645324707,
0.11042801290750504,
-0.06575145572423935,
-0.13252194225788116,
-0.058973878622055054,
-0.056658562272787094,
-0.11974234879016876,
0.014788368716835976,
0.05539034307003021,
-0.07537635415792465,
-0.1320449709892273,
0.12162170559167862,
-0.17802771925926208,
-0.017103519290685654,
0.0797295793890953,
-0.08653170615434647,
-0.12181220948696136,
-0.051482945680618286,
0.07502575218677521,
0.09923916310071945,
0.007804583758115768,
-0.078059621155262,
0.0448879711329937,
0.017283247783780098,
0.028535842895507812,
-0.15713687241077423,
-0.04322453960776329,
-0.06411543488502502,
-0.02003789134323597,
0.12531369924545288,
0.037958189845085144,
-0.005051938816905022,
0.0440249890089035,
0.02727152220904827,
-0.06888920813798904,
0.009605171158909798,
-0.050183750689029694,
-0.12649187445640564,
0.025794275104999542,
0.06452889740467072,
-0.030392931774258614,
-0.006479048170149326,
0.05473869666457176,
-0.16815094649791718,
-0.0245952270925045,
0.04908130317926407,
0.011374694295227528,
-0.11536586284637451,
0.01313723623752594,
-0.06827271729707718,
0.05272507295012474,
0.04018601402640343,
0.004083958920091391,
-0.002431678818538785,
-0.05345117673277855,
-0.029847800731658936,
0.007408834062516689,
-0.12472425401210785,
0.008928775787353516,
-0.02966778539121151,
-0.03536927327513695,
-0.24379460513591766,
0.05840245634317398,
0.0165408868342638,
-0.01757870241999626,
-0.03479206562042236,
-0.04284033179283142,
-0.08489808440208435,
0.07748420536518097,
0.11419587582349777,
-0.0423598438501358,
0.011391589418053627,
0.01735433004796505,
0.07002576440572739,
-0.0009129256359301507,
-0.02345772460103035,
-0.10941298305988312
] |
null | null |
transformers
|
# resnet18
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
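Since `n_classes` can be changed at construction time (see above), a minimal fine-tuning-style sketch might look like the following. The import path is an assumption and the batch is a placeholder for a real DataLoader:
``` python
import torch
import torch.nn as nn
from glasses.models import ResNet  # assumed import path

# 10-class head instead of the default 1000 ImageNet classes
model = ResNet.resnet18(n_classes=10)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
criterion = nn.CrossEntropyLoss()

# Placeholder batch; swap in a real DataLoader over your dataset
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 10, (8,))

model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)  # one training step on the fake batch
loss.backward()
optimizer.step()
print(loss.item())
```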
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet18
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet18
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet18\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet18\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
25
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet18\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.10833189636468887,
0.0590878501534462,
-0.002427423605695367,
0.08445870131254196,
0.09392944723367691,
0.007413317449390888,
-0.0171232707798481,
0.08950914442539215,
-0.08780159801244736,
-0.016066092997789383,
0.1032320111989975,
0.10085496306419373,
0.0009229255374521017,
-0.059029292315244675,
-0.05191517993807793,
-0.23135946691036224,
0.08916395902633667,
0.06856659054756165,
-0.005469571799039841,
0.0578107051551342,
0.07054551690816879,
-0.08247371762990952,
0.09645633399486542,
0.0033164804335683584,
-0.17098739743232727,
0.005296309012919664,
-0.014952093362808228,
-0.06864529848098755,
0.07969840615987778,
0.02389148436486721,
0.09683578461408615,
0.00769916083663702,
0.09658785164356232,
-0.10618451237678528,
0.03260910138487816,
0.04935264587402344,
-0.07694289833307266,
0.05746491625905037,
0.10131663829088211,
-0.032483190298080444,
0.03564507141709328,
0.046368807554244995,
-0.06957929581403732,
-0.043972257524728775,
-0.0005392665043473244,
-0.06970439106225967,
0.00547011336311698,
0.1643662005662918,
0.08559306710958481,
0.0916973277926445,
0.030726874247193336,
0.08371113985776901,
-0.11053753644227982,
0.09312813729047775,
0.16597796976566315,
-0.28661108016967773,
-0.052473682910203934,
0.1139921173453331,
-0.05343339219689369,
-0.07496042549610138,
-0.06772594153881073,
0.02949168160557747,
0.023758063092827797,
0.03659816458821297,
-0.12621085345745087,
0.001916407491080463,
-0.07232581079006195,
-0.008899198845028877,
-0.08730331808328629,
-0.02968299575150013,
0.05235586315393448,
0.09180663526058197,
0.06209952384233475,
-0.005625143181532621,
-0.15068045258522034,
0.014975675381720066,
-0.10559726506471634,
0.043930307030677795,
0.021255962550640106,
0.0628235936164856,
-0.08474858850240707,
-0.010296249762177467,
-0.059898413717746735,
-0.03692224994301796,
-0.14267943799495697,
-0.04313983768224716,
0.05566835775971413,
0.1312689632177353,
-0.11390481889247894,
0.02183084562420845,
0.03965736925601959,
-0.040767353028059006,
-0.0015747619327157736,
-0.07946714013814926,
0.01996421255171299,
0.04809962585568428,
-0.07551735639572144,
-0.031132644042372704,
0.09288199990987778,
0.19344750046730042,
-0.06570172309875488,
-0.0028441851027309895,
-0.028993619605898857,
0.1409275084733963,
-0.09193103015422821,
0.00942207220941782,
-0.24305357038974762,
0.11875134706497192,
0.049467433243989944,
-0.09228796511888504,
0.020369917154312134,
-0.015669453889131546,
-0.08344244956970215,
-0.024971358478069305,
0.11477012187242508,
-0.03625126928091049,
-0.015189433470368385,
0.06674495339393616,
-0.010770091786980629,
-0.044749073684215546,
0.12389436364173889,
-0.033489253371953964,
-0.035896047949790955,
-0.00657764682546258,
-0.09854351729154587,
0.07929868251085281,
0.10526572912931442,
0.0012755361385643482,
-0.09800642728805542,
0.0034193561878055334,
-0.010560546070337296,
-0.0036228878889232874,
0.005274265538901091,
-0.03249549865722656,
0.04283265024423599,
0.012202405370771885,
0.0595448799431324,
-0.21419432759284973,
-0.15069803595542908,
0.00431803846731782,
0.11501163989305496,
-0.04075184836983681,
0.00973635446280241,
-0.011270757764577866,
-0.12403278052806854,
-0.004796783439815044,
0.016150090843439102,
0.027198731899261475,
-0.07216468453407288,
0.01644599251449108,
-0.13762551546096802,
0.1250375211238861,
-0.10867364704608917,
0.0360933281481266,
-0.08979547768831253,
-0.04112773761153221,
-0.08124137669801712,
0.09899753332138062,
-0.07651510834693909,
0.09696662425994873,
-0.13317149877548218,
-0.08054891973733902,
-0.0646531954407692,
-0.027736946940422058,
0.0069056604988873005,
0.06735993921756744,
-0.09112463146448135,
-0.009093542583286762,
0.1493605375289917,
-0.10743250697851181,
-0.08770588040351868,
0.08553639054298401,
-0.032262999564409256,
-0.1035945862531662,
0.0009690170409157872,
0.1585521697998047,
0.017314238473773003,
0.005002265330404043,
-0.05080058053135872,
0.020609816536307335,
-0.1314362734556198,
-0.12891988456249237,
-0.03663627803325653,
0.1461564004421234,
0.032823871821165085,
-0.004933373536914587,
-0.08650149405002594,
0.11223704367876053,
-0.08244996517896652,
-0.1204480230808258,
-0.041097406297922134,
-0.09912043809890747,
-0.015759548172354698,
0.04567047581076622,
0.10486544668674469,
0.06364783644676208,
-0.04372663423418999,
-0.1582537740468979,
0.05668729916214943,
-0.05147487670183182,
0.11091714352369308,
-0.09483896195888519,
0.16122770309448242,
-0.03104846179485321,
0.03536086902022362,
-0.2314506471157074,
0.012313601560890675,
0.0047907112166285515,
0.01999795064330101,
-0.03531622886657715,
-0.05037637799978256,
0.03804197907447815,
-0.05428389832377434,
-0.03669145330786705,
-0.07108452916145325,
0.07098864763975143,
-0.018790896981954575,
-0.006253357045352459,
-0.12541241943836212,
-0.0012494404800236225,
-0.04111621528863907,
0.016542386263608932,
-0.12517958879470825,
-0.055252041667699814,
0.020757760852575302,
0.12409012764692307,
-0.031119374558329582,
0.084125816822052,
-0.03026021458208561,
-0.05992048606276512,
-0.019589349627494812,
-0.05639069154858589,
0.09083510935306549,
0.013912148773670197,
-0.018624868243932724,
0.06993715465068817,
0.09622195363044739,
0.15061788260936737,
0.15132898092269897,
-0.28297311067581177,
-0.03668709844350815,
0.08669000118970871,
-0.03176882490515709,
-0.004887137562036514,
-0.005040653515607119,
0.1610235720872879,
0.11008157581090927,
-0.025472084060311317,
0.07035542279481888,
-0.05764060094952583,
0.09608543664216995,
0.014851157553493977,
-0.025932660326361656,
0.003763519460335374,
0.044834837317466736,
0.07828744500875473,
-0.26647183299064636,
0.085648313164711,
0.10978491604328156,
-0.16185201704502106,
-0.04569552466273308,
-0.04433711618185043,
-0.061537835747003555,
0.10207632184028625,
0.06804031133651733,
-0.01086703035980463,
0.06372611969709396,
0.03649399057030678,
-0.02516072429716587,
0.05940423533320427,
-0.09650266915559769,
0.02410086803138256,
-0.1223168671131134,
0.022248174995183945,
-0.0508209653198719,
0.003459157655015588,
0.02529950812458992,
-0.0004393239214550704,
-0.0020973756909370422,
0.1092408150434494,
-0.011615782044827938,
-0.23616641759872437,
0.05089627206325531,
-0.06614464521408081,
-0.027042534202337265,
0.20877715945243835,
-0.025178736075758934,
-0.23632358014583588,
-0.004667267668992281,
-0.01592095196247101,
-0.07873092591762543,
-0.01529051922261715,
0.026378732174634933,
-0.12199126929044724,
-0.06417261064052582,
-0.019817080348730087,
-0.11190495640039444,
0.11099645495414734,
0.029630107805132866,
0.017574453726410866,
0.026799295097589493,
0.035299088805913925,
-0.04965181276202202,
0.013094378635287285,
-0.1421935111284256,
-0.09351509809494019,
0.13315925002098083,
-0.0839957445859909,
0.08017124980688095,
0.08365364372730255,
-0.015476829372346401,
0.017895517870783806,
0.023032087832689285,
0.13760490715503693,
-0.053963255137205124,
0.023264460265636444,
0.09633347392082214,
-0.009464415721595287,
0.0496780127286911,
0.08546072244644165,
0.00848378799855709,
-0.07320556789636612,
-0.009381989948451519,
0.029002707451581955,
-0.04497799277305603,
-0.09821715205907822,
-0.1386813372373581,
-0.03768300637602806,
-0.010416911914944649,
0.15562263131141663,
0.10418270528316498,
0.05133793130517006,
0.10034725815057755,
0.010731613263487816,
-0.030047863721847534,
0.059260498732328415,
0.06293913722038269,
0.10753077268600464,
0.03656348958611488,
0.11301884800195694,
-0.07726376503705978,
-0.06335398554801941,
0.0378958024084568,
0.10415059328079224,
0.2513435482978821,
-0.03183405473828316,
-0.12555764615535736,
0.047476351261138916,
0.2454768717288971,
0.10551436990499496,
0.1027003675699234,
-0.06409139931201935,
-0.015165395103394985,
0.0015622162027284503,
0.0014131086645647883,
0.010319190099835396,
0.04585370793938637,
-0.01470712386071682,
-0.06956598907709122,
0.036445438861846924,
0.038348276168107986,
0.029329201206564903,
0.24631258845329285,
0.09106580913066864,
-0.31850701570510864,
0.011334402486681938,
-0.04185492545366287,
-0.002004818757995963,
-0.015409623272716999,
0.13675841689109802,
-0.009044293314218521,
-0.03672991693019867,
0.08003798127174377,
-0.13059648871421814,
0.10560715198516846,
-0.057317085564136505,
0.01160863321274519,
0.043867237865924835,
-0.11890251189470291,
0.08424197882413864,
0.04854778200387955,
-0.14104929566383362,
0.17326122522354126,
-0.04735117405653,
-0.0667208880186081,
-0.068894162774086,
-0.04777953773736954,
0.10700077563524246,
0.1382003277540207,
0.24900028109550476,
0.016232389956712723,
-0.0118406368419528,
0.08786526322364807,
-0.025836626067757607,
0.038212358951568604,
0.0462002269923687,
0.07736870646476746,
-0.062021270394325256,
0.036441706120967865,
-0.052437372505664825,
0.038745298981666565,
0.1318478137254715,
-0.020500997081398964,
-0.0989229828119278,
-0.05979592353105545,
0.07146967202425003,
-0.04932108148932457,
-0.035673823207616806,
-0.06532499194145203,
-0.07584834843873978,
0.13405057787895203,
0.03344975784420967,
-0.02903294749557972,
-0.08568013459444046,
0.09125253558158875,
0.06544310599565506,
-0.06245578080415726,
0.10484876483678818,
-0.047806963324546814,
0.028552019968628883,
-0.023515967652201653,
-0.24604815244674683,
0.17254814505577087,
-0.1172332763671875,
-0.005401873029768467,
0.030582016333937645,
-0.06329060345888138,
-0.007151931989938021,
-0.024588551372289658,
0.05641651153564453,
0.04117543250322342,
-0.12492061406373978,
-0.023003671318292618,
0.017656488344073296,
-0.04419570788741112,
0.015150308609008789,
0.08753520250320435,
-0.05172092467546463,
-0.08218792825937271,
-0.03647290915250778,
0.14887303113937378,
0.156918466091156,
-0.04086429253220558,
-0.10232173651456833,
0.007266228087246418,
0.11563953012228012,
0.032465800642967224,
-0.3830968141555786,
-0.09161095321178436,
-0.024718239903450012,
-0.07008667290210724,
-0.09477666020393372,
-0.1304376721382141,
0.23529493808746338,
-0.004883686080574989,
-0.060345638543367386,
0.0012904125032946467,
-0.13738557696342468,
-0.04018969461321831,
0.2581824064254761,
0.07182613015174866,
0.28992658853530884,
-0.08808527141809464,
0.026932310312986374,
-0.05820132791996002,
0.02780277468264103,
0.08846331387758255,
-0.0007674670196138322,
0.07008310407400131,
-0.0027043595910072327,
0.05101611837744713,
-0.04134479537606239,
-0.026947487145662308,
-0.027438703924417496,
0.11467868089675903,
0.09526150673627853,
-0.09775664657354355,
0.07779604941606522,
0.05637235566973686,
-0.0183874499052763,
0.05111856386065483,
0.08338604867458344,
0.10868354141712189,
-0.018460573628544807,
0.023192791268229485,
-0.09187272936105728,
0.0522148534655571,
0.10550768673419952,
-0.002729859435930848,
-0.10765654593706131,
0.06323911249637604,
0.107317715883255,
0.08562743663787842,
0.2829594314098358,
0.12788648903369904,
0.03586834296584129,
0.06529750674962997,
0.00771061098203063,
-0.0842113196849823,
-0.0807250440120697,
-0.10170657932758331,
-0.03590451180934906,
0.14418736100196838,
-0.15305107831954956,
0.056602802127599716,
0.10164200514554977,
-0.005059631075710058,
-0.00220761657692492,
0.09956888109445572,
0.015348609536886215,
-0.032577645033597946,
0.09793657809495926,
-0.06443679332733154,
-0.02312101423740387,
0.010842598974704742,
0.07411061227321625,
0.10710951685905457,
0.13959762454032898,
0.10680116713047028,
-0.10473377257585526,
-0.0028924057260155678,
0.02328706532716751,
0.07809270918369293,
-0.05178646743297577,
0.05769188702106476,
0.14465245604515076,
-0.025304945185780525,
-0.14614537358283997,
0.19747337698936462,
0.009037137031555176,
-0.15308573842048645,
-0.022248076274991035,
-0.053982872515916824,
-0.13858114182949066,
-0.13009127974510193,
-0.013510038144886494,
-0.0019759985152632,
-0.24294835329055786,
-0.08486021310091019,
-0.0019005379872396588,
-0.07361559569835663,
0.1169290840625763,
0.02204442210495472,
0.13975074887275696,
-0.02271650731563568,
-0.03489846736192703,
-0.0816081240773201,
-0.03722835332155228,
0.019887149333953857,
0.01224428042769432,
0.06621038168668747,
-0.15998315811157227,
-0.21500861644744873,
0.01613551750779152,
0.1509610116481781,
-0.0813034400343895,
0.007975853979587555,
-0.08077580481767654,
0.09203259646892548,
-0.13180257380008698,
0.06755657494068146,
-0.06026376038789749,
0.0268597062677145,
-0.060783691704273224,
-0.030696716159582138,
-0.06360463052988052,
0.05531303957104683,
-0.09586887806653976,
-0.024302000179886818,
0.0014838308561593294,
-0.021138574928045273,
-0.11767838895320892,
-0.030417192727327347,
-0.021801738068461418,
0.007080960087478161,
0.13326580822467804,
0.08111780881881714,
-0.10484153777360916,
0.0931369736790657,
-0.14354851841926575,
-0.25595933198928833,
0.1556413173675537,
0.039455246180295944,
-0.018500661477446556,
0.06184939295053482,
0.055820759385824203,
0.07424615323543549,
-0.04877262935042381,
-0.014555108733475208,
0.006969897076487541,
-0.04714391008019447,
-0.017124686390161514,
-0.23401367664337158,
0.020489437505602837,
-0.02585102804005146,
0.002247859025374055,
0.08081390708684921,
0.10528143495321274,
0.11591736972332001,
-0.05825112760066986,
-0.0049665383994579315,
-0.04082168638706207,
-0.004608532413840294,
-0.009250106289982796,
-0.11347143352031708,
-0.008605066686868668,
-0.03784414380788803,
0.020886993035674095,
-0.0794011801481247,
0.16608694195747375,
-0.014744565822184086,
-0.09557057172060013,
0.02693353220820427,
0.22225771844387054,
-0.11745105683803558,
0.07965832203626633,
0.17870362102985382,
0.08471299707889557,
-0.005806950386613607,
0.05076675862073898,
0.04872910678386688,
0.09899885952472687,
0.18857689201831818,
0.001290513901039958,
0.24324975907802582,
0.0384662002325058,
0.11417335271835327,
0.07841764390468597,
-0.015675028786063194,
0.04977463558316231,
0.08862914144992828,
-0.09459378570318222,
0.10339350253343582,
0.008571579121053219,
-0.00021850652410648763,
0.13538120687007904,
0.030605899170041084,
0.008464991115033627,
0.016463879495859146,
-0.004822987597435713,
-0.08440810441970825,
-0.20424209535121918,
-0.0994885191321373,
-0.14376957714557648,
0.04786734655499458,
-0.0315607525408268,
-0.08413831889629364,
0.21669946610927582,
0.08164103329181671,
-0.04624015465378761,
0.04981165751814842,
-0.07826276868581772,
-0.05024388059973717,
0.16272316873073578,
-0.007684167940169573,
-0.10622568428516388,
0.13857251405715942,
-0.048904404044151306,
0.0640561580657959,
0.061720263212919235,
-0.01725178398191929,
0.0019893755670636892,
0.06553024798631668,
0.16132068634033203,
-0.038575779646635056,
-0.04228678345680237,
-0.032346222549676895,
0.02321389876306057,
-0.04414878413081169,
-0.0016500750789418817,
-0.016019554808735847,
0.029759744182229042,
0.04754052311182022,
0.23248682916164398,
-0.10221357643604279,
-0.07475889474153519,
-0.1126231849193573,
-0.0381355956196785,
-0.0016767211491242051,
0.05732806399464607,
-0.04896414279937744,
-0.036509208381175995,
-0.022027568891644478,
0.21103036403656006,
0.23181891441345215,
-0.1809857338666916,
0.0036115916445851326,
0.08237669616937637,
0.00295643275603652,
-0.03881070390343666,
0.030071690678596497,
0.17068953812122345,
0.1082858145236969,
-0.06575600057840347,
-0.12914849817752838,
-0.062089405953884125,
-0.05849618837237358,
-0.1210717260837555,
0.017142318189144135,
0.05896804481744766,
-0.07356797903776169,
-0.12840421497821808,
0.12274118512868881,
-0.17734409868717194,
-0.01671498641371727,
0.0787510797381401,
-0.0853494331240654,
-0.12138157337903976,
-0.051406171172857285,
0.07329875230789185,
0.1018935814499855,
0.00903222244232893,
-0.07785455882549286,
0.04318561777472496,
0.014073831960558891,
0.02787809818983078,
-0.15368665754795074,
-0.04248252883553505,
-0.06426951289176941,
-0.02106315642595291,
0.12896691262722015,
0.03828010335564613,
-0.009185871109366417,
0.042606912553310394,
0.028342951089143753,
-0.06875499337911606,
0.008978831581771374,
-0.04940902814269066,
-0.12226800620555878,
0.026106899604201317,
0.06378495693206787,
-0.03047916851937771,
-0.007524765562266111,
0.05869164317846298,
-0.1668347418308258,
-0.024590667337179184,
0.04501776769757271,
0.01154970470815897,
-0.11275368183851242,
0.014814568683505058,
-0.06845129281282425,
0.052471961826086044,
0.03971651941537857,
0.003817080520093441,
-0.003207996254786849,
-0.05252712592482567,
-0.031295157968997955,
0.007487901486456394,
-0.12097004801034927,
0.010698172263801098,
-0.02833576127886772,
-0.03477328643202782,
-0.23887363076210022,
0.05871821939945221,
0.012754385359585285,
-0.014852779917418957,
-0.034231457859277725,
-0.041603680700063705,
-0.08650173246860504,
0.07724209129810333,
0.11301469057798386,
-0.04449598863720894,
0.011380890384316444,
0.023147547617554665,
0.07211627066135406,
-0.002987536136060953,
-0.023065079003572464,
-0.10727699100971222
] |
null | null |
transformers
|
# resnet26
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with
# Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf):
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
import torch
from torch import nn
from functools import partial
# ResNet, SENetBasicBlock, ResNetStemC, ResNetBasicBlock and ResNetShorcutD
# are provided by the glasses package (their import paths are omitted in this card)

# change activation
ResNet.resnet18(activation=nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features: this activates the forward hooks and tells the model you'd like to get the intermediate features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
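For a quick end-to-end check, the sketch below builds the model and runs a forward pass on a random tensor. It is not part of the original card: the `from glasses.models import ResNet` import path and the 1000-class default head are assumptions inferred from the snippets above.
``` python
# Minimal inference sketch (illustrative, not from the original card).
# Assumption: ResNet is importable as `from glasses.models import ResNet`.
import torch
from glasses.models import ResNet  # assumed import path

model = ResNet.resnet26(n_classes=1000)  # ImageNet-sized head (assumed default)
model.eval()

x = torch.rand((1, 3, 224, 224))  # stand-in for a preprocessed 224x224 image
with torch.no_grad():
    logits = model(x)               # shape: (1, 1000)
    probs = logits.softmax(dim=-1)
print(probs.argmax(dim=-1))         # predicted class index
```
Replace the random tensor with a properly preprocessed image (resized, cropped and normalized with ImageNet statistics) to get meaningful predictions.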
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet26
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet26
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet26\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet26\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
25
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet26\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.10744696855545044,
0.06904590129852295,
-0.0022883068304508924,
0.08351658284664154,
0.09360119700431824,
0.0069902511313557625,
-0.014396166428923607,
0.08862730860710144,
-0.09012933820486069,
-0.015143324621021748,
0.10387680679559708,
0.10111113637685776,
0.0007344374898821115,
-0.056969840079545975,
-0.051522914320230484,
-0.2318657636642456,
0.08840201050043106,
0.06792847067117691,
-0.009744362905621529,
0.05858120322227478,
0.06854964792728424,
-0.08277694880962372,
0.0957566499710083,
0.00023002871603239328,
-0.17394426465034485,
0.007484584115445614,
-0.016370732337236404,
-0.06604522466659546,
0.07782573997974396,
0.02265040948987007,
0.09632889926433563,
0.007235366851091385,
0.09808646887540817,
-0.10887936502695084,
0.03212638571858406,
0.048604197800159454,
-0.07610014826059341,
0.05720343440771103,
0.10132841020822525,
-0.028057487681508064,
0.036999933421611786,
0.04587714746594429,
-0.07245166599750519,
-0.04545115679502487,
-0.00021353363990783691,
-0.06296145170927048,
0.005112644750624895,
0.16655375063419342,
0.08250875771045685,
0.09524770826101303,
0.030742289498448372,
0.08591732382774353,
-0.11022476851940155,
0.09178781509399414,
0.16527795791625977,
-0.28602585196495056,
-0.05270390585064888,
0.11240458488464355,
-0.056943606585264206,
-0.07834335416555405,
-0.07266394048929214,
0.026583556085824966,
0.021382464095950127,
0.036166898906230927,
-0.13119496405124664,
0.0013039857149124146,
-0.07745200395584106,
-0.007613510359078646,
-0.08729974925518036,
-0.030024681240320206,
0.05443035438656807,
0.09366820007562637,
0.06380660086870193,
-0.0017804482486099005,
-0.15224559605121613,
0.017513427883386612,
-0.10386259853839874,
0.04471311718225479,
0.021872693672776222,
0.062209226191043854,
-0.08914440870285034,
-0.008099004626274109,
-0.05984089896082878,
-0.0378132238984108,
-0.1444302201271057,
-0.040475524961948395,
0.05496782064437866,
0.13207276165485382,
-0.11410070955753326,
0.022206895053386688,
0.0408388152718544,
-0.04019246622920036,
-0.002331341616809368,
-0.07934761792421341,
0.022413698956370354,
0.04896184056997299,
-0.07576388120651245,
-0.027840984985232353,
0.09173272550106049,
0.19243481755256653,
-0.06604618579149246,
-0.0015798868844285607,
-0.021848106756806374,
0.1393323838710785,
-0.09315299987792969,
0.006743675097823143,
-0.24165935814380646,
0.12031009048223495,
0.048291951417922974,
-0.09013071656227112,
0.01992899551987648,
-0.014381181448698044,
-0.08291494846343994,
-0.022517304867506027,
0.11266638338565826,
-0.03898230940103531,
-0.0168070700019598,
0.06804415583610535,
-0.010426445864140987,
-0.0456528477370739,
0.1213960126042366,
-0.03535955026745796,
-0.03547082841396332,
-0.0031298042740672827,
-0.09874029457569122,
0.07653598487377167,
0.10348480194807053,
-0.00029961756081320345,
-0.09876095503568649,
-0.0002607134520076215,
-0.010559090413153172,
-0.0030155552085489035,
0.004730693995952606,
-0.03206470236182213,
0.04297233372926712,
0.012655642814934254,
0.05729982256889343,
-0.21344077587127686,
-0.14794011414051056,
0.003643638687208295,
0.11540413647890091,
-0.03873583301901817,
0.008473042398691177,
-0.015034547075629234,
-0.1252679079771042,
-0.0024764654226601124,
0.01681266911327839,
0.02764737419784069,
-0.07197141647338867,
0.01756354235112667,
-0.135483980178833,
0.1246185377240181,
-0.11149141192436218,
0.03555155172944069,
-0.09124935418367386,
-0.04369807615876198,
-0.07687441259622574,
0.09644585847854614,
-0.07589633762836456,
0.0930565595626831,
-0.13411425054073334,
-0.0790388360619545,
-0.0666482225060463,
-0.03044966794550419,
0.0061899651773273945,
0.06917499005794525,
-0.08580426871776581,
-0.008447648026049137,
0.14903303980827332,
-0.10623884946107864,
-0.08469248563051224,
0.07982068508863449,
-0.03217468783259392,
-0.1004142314195633,
-0.001093198312446475,
0.1571812629699707,
0.01709531992673874,
0.006874193903058767,
-0.05145940184593201,
0.02314300648868084,
-0.1300475299358368,
-0.12837378680706024,
-0.036085695028305054,
0.14554448425769806,
0.03304373845458031,
-0.0051946197636425495,
-0.09025321900844574,
0.11243043094873428,
-0.0817771852016449,
-0.12052428722381592,
-0.04144604131579399,
-0.09867282211780548,
-0.016080081462860107,
0.04802199453115463,
0.10555781424045563,
0.06542543321847916,
-0.04202444478869438,
-0.1580812633037567,
0.055452611297369,
-0.053148303180933,
0.10828372091054916,
-0.09242380410432816,
0.15706881880760193,
-0.03418854996562004,
0.03510281816124916,
-0.23253700137138367,
0.008906538598239422,
0.004650469403713942,
0.020859623327851295,
-0.03446942940354347,
-0.04851138964295387,
0.03774969279766083,
-0.054576825350522995,
-0.03631936013698578,
-0.07042361050844193,
0.06736674159765244,
-0.019812222570180893,
-0.005021634977310896,
-0.12454035878181458,
-0.0018282649107277393,
-0.0431230328977108,
0.016051525250077248,
-0.1253124475479126,
-0.057769209146499634,
0.024070654064416885,
0.1246233657002449,
-0.031133711338043213,
0.08348829299211502,
-0.029190734028816223,
-0.05972059443593025,
-0.019149374216794968,
-0.05668909102678299,
0.09330214560031891,
0.015836184844374657,
-0.015499871224164963,
0.06861531734466553,
0.09606349468231201,
0.14840663969516754,
0.1509447693824768,
-0.2830372452735901,
-0.03764178976416588,
0.08659368753433228,
-0.032482609152793884,
-0.006019480526447296,
-0.007400063797831535,
0.1619768589735031,
0.10977166146039963,
-0.02622511237859726,
0.06964018940925598,
-0.05669981241226196,
0.09570147097110748,
0.013407323509454727,
-0.025981616228818893,
0.004281977657228708,
0.04564238712191582,
0.0802447572350502,
-0.27300018072128296,
0.08759628236293793,
0.1103631928563118,
-0.16334377229213715,
-0.04428181052207947,
-0.04528682678937912,
-0.06402572244405746,
0.10136551409959793,
0.07085718959569931,
-0.012086225673556328,
0.06659276783466339,
0.03774789720773697,
-0.024881774559617043,
0.05876437574625015,
-0.09699558466672897,
0.023479735478758812,
-0.12670519948005676,
0.023164289072155952,
-0.05192742869257927,
0.003100428730249405,
0.018817467615008354,
-0.0031259595416486263,
-0.002380205551162362,
0.10675735026597977,
-0.009700505062937737,
-0.2360631823539734,
0.05095323547720909,
-0.0675242617726326,
-0.028724832460284233,
0.2089480757713318,
-0.02361411415040493,
-0.2356199324131012,
-0.006665813736617565,
-0.01582927443087101,
-0.07913210988044739,
-0.015069789253175259,
0.025666493922472,
-0.12147388607263565,
-0.0657106414437294,
-0.019909126684069633,
-0.11065506935119629,
0.11169010400772095,
0.027490125969052315,
0.017122318968176842,
0.027980919927358627,
0.035928208380937576,
-0.048684779554605484,
0.014315933920443058,
-0.14148923754692078,
-0.09193039685487747,
0.13183189928531647,
-0.08633673936128616,
0.08052439242601395,
0.08303699642419815,
-0.015818491578102112,
0.019314415752887726,
0.024552753195166588,
0.13840456306934357,
-0.05222611501812935,
0.021199727430939674,
0.09256977587938309,
-0.00886098574846983,
0.050984937697649,
0.084734708070755,
0.009581604972481728,
-0.07130718231201172,
-0.008221063762903214,
0.02961954101920128,
-0.04244043305516243,
-0.09497544169425964,
-0.13595224916934967,
-0.0378279946744442,
-0.014755284413695335,
0.158452570438385,
0.10475954413414001,
0.045887839049100876,
0.10088971257209778,
0.011394877918064594,
-0.029560279101133347,
0.05715494602918625,
0.06390875577926636,
0.10593293607234955,
0.03748495504260063,
0.1108049526810646,
-0.07953238487243652,
-0.06634921580553055,
0.03679565340280533,
0.10595148801803589,
0.25231656432151794,
-0.029667001217603683,
-0.12466418743133545,
0.046352922916412354,
0.24848389625549316,
0.10443224757909775,
0.1033686026930809,
-0.06325124949216843,
-0.014705597423017025,
0.0018097955035045743,
-0.00048177270218729973,
0.01196601428091526,
0.04671478271484375,
-0.015327364206314087,
-0.0677119642496109,
0.03558879718184471,
0.034862566739320755,
0.026499299332499504,
0.249240443110466,
0.09474293142557144,
-0.32048845291137695,
0.010239296592772007,
-0.042557425796985626,
-0.0017028814181685448,
-0.015002013184130192,
0.13692042231559753,
-0.006435183342546225,
-0.03655049577355385,
0.07834995537996292,
-0.13128136098384857,
0.10561752319335938,
-0.05795387923717499,
0.009822770021855831,
0.0473211333155632,
-0.11966030299663544,
0.08795664459466934,
0.0498916357755661,
-0.1347319334745407,
0.17479196190834045,
-0.0478525385260582,
-0.06816823035478592,
-0.06855451315641403,
-0.048118602484464645,
0.10839352756738663,
0.13539673388004303,
0.24696996808052063,
0.016690479591488838,
-0.010850254446268082,
0.0879601463675499,
-0.025232424959540367,
0.03934088349342346,
0.052689630538225174,
0.07649997621774673,
-0.05899002403020859,
0.034481436014175415,
-0.052866946905851364,
0.0398198664188385,
0.12960046529769897,
-0.021983444690704346,
-0.10171516239643097,
-0.058700431138277054,
0.07292798161506653,
-0.04926767200231552,
-0.033569030463695526,
-0.06584075093269348,
-0.07397886365652084,
0.1355414092540741,
0.030192527920007706,
-0.026211552321910858,
-0.08368439972400665,
0.09423509985208511,
0.06502584367990494,
-0.06135778874158859,
0.10413460433483124,
-0.04917410761117935,
0.027732515707612038,
-0.019386250525712967,
-0.24595887959003448,
0.17228904366493225,
-0.1154390200972557,
-0.004731600638478994,
0.03013092279434204,
-0.06586316227912903,
-0.008661050349473953,
-0.02702687494456768,
0.056064266711473465,
0.04279056191444397,
-0.1271173506975174,
-0.02423235960304737,
0.01709495671093464,
-0.044232990592718124,
0.01935693994164467,
0.08883974701166153,
-0.05527776852250099,
-0.07682837545871735,
-0.036980945616960526,
0.15122190117835999,
0.1576569378376007,
-0.0433829165995121,
-0.10309663414955139,
0.005824095103889704,
0.11374257504940033,
0.03207111358642578,
-0.3869397044181824,
-0.09171580523252487,
-0.02505921758711338,
-0.06744284182786942,
-0.09879971295595169,
-0.1307881474494934,
0.23615652322769165,
-0.006959684193134308,
-0.059930991381406784,
0.0008560330024920404,
-0.1391560286283493,
-0.03775596246123314,
0.25461772084236145,
0.07517210394144058,
0.2861258089542389,
-0.08883396536111832,
0.025453275069594383,
-0.05845293030142784,
0.025108199566602707,
0.09008731693029404,
-0.004988688975572586,
0.07253002375364304,
-0.0027146704960614443,
0.05161336809396744,
-0.04242442548274994,
-0.027906447649002075,
-0.02695634216070175,
0.1131201684474945,
0.09467501938343048,
-0.09696285426616669,
0.07672406733036041,
0.06674114614725113,
-0.01797335222363472,
0.05126509070396423,
0.08429577946662903,
0.11133768409490585,
-0.015543913468718529,
0.024260200560092926,
-0.09401245415210724,
0.05124438554048538,
0.1055263876914978,
0.00027400118415243924,
-0.10644172877073288,
0.06148465350270271,
0.10955177247524261,
0.0857534185051918,
0.2829989194869995,
0.12814262509346008,
0.03368355706334114,
0.06776335835456848,
0.006801007315516472,
-0.0783596783876419,
-0.08442137390375137,
-0.10211344808340073,
-0.03647009655833244,
0.1447150856256485,
-0.15101009607315063,
0.05572216957807541,
0.10205931216478348,
-0.0018157500308007002,
-0.0035089857410639524,
0.1014065146446228,
0.019577283412218094,
-0.03325967863202095,
0.09962768852710724,
-0.06661970913410187,
-0.025409581139683723,
0.010896540246903896,
0.0761917382478714,
0.10386428982019424,
0.1399383693933487,
0.10627073794603348,
-0.10398247838020325,
-0.0024618010502308607,
0.024485623463988304,
0.07517100125551224,
-0.05314343422651291,
0.058177631348371506,
0.14485765993595123,
-0.025867333635687828,
-0.14577017724514008,
0.20106041431427002,
0.007860911078751087,
-0.15634962916374207,
-0.023160260170698166,
-0.054417792707681656,
-0.14167672395706177,
-0.1291711926460266,
-0.009703093208372593,
-0.003332108026370406,
-0.24244451522827148,
-0.08363298326730728,
-0.004693466238677502,
-0.07164939492940903,
0.11733949929475784,
0.025166448205709457,
0.13861814141273499,
-0.021031063050031662,
-0.03661719337105751,
-0.07945320010185242,
-0.03846943378448486,
0.018913976848125458,
0.011723780073225498,
0.06660176068544388,
-0.1616573929786682,
-0.21363183856010437,
0.013958844356238842,
0.15110598504543304,
-0.08089461177587509,
0.007071467582136393,
-0.08162625879049301,
0.09072794020175934,
-0.12389998883008957,
0.06894999742507935,
-0.05862702429294586,
0.026637066155672073,
-0.06158250942826271,
-0.028145743533968925,
-0.06308282911777496,
0.056371863931417465,
-0.09653656929731369,
-0.026269370689988136,
-0.0005408708821050823,
-0.023506058380007744,
-0.117017962038517,
-0.028689205646514893,
-0.02015838585793972,
0.006523805670440197,
0.1316632181406021,
0.08282825350761414,
-0.10578665137290955,
0.09355971217155457,
-0.14170177280902863,
-0.25698912143707275,
0.1556512713432312,
0.04048388823866844,
-0.02051299437880516,
0.059178758412599564,
0.054800137877464294,
0.07536144554615021,
-0.044986989349126816,
-0.015637606382369995,
0.012717227451503277,
-0.046334024518728256,
-0.01953941397368908,
-0.2365904599428177,
0.01918552815914154,
-0.027041219174861908,
0.0038747910875827074,
0.08129977434873581,
0.10371624678373337,
0.11686643213033676,
-0.05840156972408295,
-0.0071630957536399364,
-0.03923705220222473,
-0.003741312539204955,
-0.00687831686809659,
-0.11360208690166473,
-0.015308456495404243,
-0.03750839829444885,
0.021657725796103477,
-0.07844021916389465,
0.1644858717918396,
-0.018295498564839363,
-0.0964442640542984,
0.025700882077217102,
0.2193782478570938,
-0.11253664642572403,
0.07985016703605652,
0.18610206246376038,
0.08595740050077438,
-0.008152458816766739,
0.050747789442539215,
0.04977960139513016,
0.09601109474897385,
0.18952150642871857,
0.00418998533859849,
0.2471097707748413,
0.041306085884571075,
0.1159195825457573,
0.07817456871271133,
-0.015268572606146336,
0.052771758288145065,
0.09242278337478638,
-0.09134594351053238,
0.1051529049873352,
0.005593913607299328,
0.003514004871249199,
0.13294558227062225,
0.0322100967168808,
0.008134705945849419,
0.014656124636530876,
-0.005593457259237766,
-0.08462353050708771,
-0.204203799366951,
-0.10057121515274048,
-0.14486679434776306,
0.04875024035573006,
-0.031066104769706726,
-0.08203542977571487,
0.21749520301818848,
0.08015964180231094,
-0.045576464384794235,
0.05122554302215576,
-0.08002632111310959,
-0.052691567689180374,
0.16411224007606506,
-0.008359287865459919,
-0.10711362957954407,
0.1382613182067871,
-0.047639813274145126,
0.066770039498806,
0.05856925994157791,
-0.01557912491261959,
0.0019961437210440636,
0.06766504049301147,
0.15987497568130493,
-0.039382100105285645,
-0.04002955183386803,
-0.032757628709077835,
0.025394592434167862,
-0.044211652129888535,
-0.003320094430819154,
-0.01698356680572033,
0.029873918741941452,
0.04727846384048462,
0.2323896288871765,
-0.10574555397033691,
-0.07572998106479645,
-0.1128677949309349,
-0.03624029830098152,
-0.002467453246936202,
0.05954906716942787,
-0.04903950169682503,
-0.03778450936079025,
-0.021324504166841507,
0.20859652757644653,
0.22922465205192566,
-0.18302331864833832,
0.004068491980433464,
0.0842617005109787,
0.003324494231492281,
-0.037758708000183105,
0.029735304415225983,
0.17052409052848816,
0.10927384346723557,
-0.06671486049890518,
-0.13570381700992584,
-0.06060635298490524,
-0.05648837238550186,
-0.11898429691791534,
0.013906740583479404,
0.058948274701833725,
-0.07613780349493027,
-0.13292452692985535,
0.11999525129795074,
-0.17654283344745636,
-0.011855706572532654,
0.07948452979326248,
-0.08594264835119247,
-0.12081120163202286,
-0.051676828414201736,
0.07019615173339844,
0.09917984902858734,
0.009273936040699482,
-0.07885241508483887,
0.04512840509414673,
0.018944580107927322,
0.027855481952428818,
-0.15693163871765137,
-0.04218674823641777,
-0.06457831710577011,
-0.02718990668654442,
0.12512196600437164,
0.038266900926828384,
-0.0054649231024086475,
0.044002681970596313,
0.027317024767398834,
-0.06928573548793793,
0.00972569826990366,
-0.04889630898833275,
-0.12530715763568878,
0.024429520592093468,
0.06483325362205505,
-0.030790427699685097,
-0.003481582272797823,
0.05575175583362579,
-0.165300652384758,
-0.023503165692090988,
0.05022266134619713,
0.013173170387744904,
-0.11644019931554794,
0.01398411300033331,
-0.0702197402715683,
0.05362347513437271,
0.04327251389622688,
0.00496812304481864,
-0.0017478411318734288,
-0.05221421644091606,
-0.030633438378572464,
0.008551651611924171,
-0.12219645828008652,
0.011811664327979088,
-0.026807283982634544,
-0.03588233143091202,
-0.23979616165161133,
0.05944765731692314,
0.006935045123100281,
-0.018006717786192894,
-0.03428909182548523,
-0.041990723460912704,
-0.08273885399103165,
0.07876519858837128,
0.11251866072416306,
-0.043642815202474594,
0.010340838693082333,
0.015536220744252205,
0.07201986759901047,
-0.0016804981278255582,
-0.02216409333050251,
-0.1097506582736969
] |
null | null |
transformers
|
# resnet26d
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with
# Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf):
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
import torch
from torch import nn
from functools import partial
# ResNet, SENetBasicBlock, ResNetStemC, ResNetBasicBlock and ResNetShorcutD
# are provided by the glasses package (their import paths are omitted in this card)

# change activation
ResNet.resnet18(activation=nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features: this activates the forward hooks and tells the model you'd like to get the intermediate features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
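As the `resnet101d` line above illustrates, the `-d` presets combine the deeper stem (`ResNetStemC`) with the average-pool shortcut (`ResNetShorcutD`) described in the Bag of Tricks paper. The sketch below simply compares the plain and `-d` presets by parameter count; it is not from the original card, and `from glasses.models import ResNet` is an assumed import path.
``` python
# Quick sanity check: compare the plain preset with its "d" variant.
# Assumption: ResNet is importable as `from glasses.models import ResNet`.
from glasses.models import ResNet  # assumed import path

plain = ResNet.resnet26()
deep_stem = ResNet.resnet26d()

def n_params(model):
    # total number of parameters in the module
    return sum(p.numel() for p in model.parameters())

print(f"resnet26 : {n_params(plain):,} parameters")
print(f"resnet26d: {n_params(deep_stem):,} parameters")
```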
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet26d
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet26d
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet26d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet26d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
26
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet26d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.10658662021160126,
0.06905526667833328,
-0.0024896860122680664,
0.07957018911838531,
0.09052454680204391,
0.007605852093547583,
-0.017499523237347603,
0.08282821625471115,
-0.09666583687067032,
-0.01545178797096014,
0.10364415496587753,
0.09498219937086105,
0.003624245524406433,
-0.06289677321910858,
-0.056367579847574234,
-0.22660216689109802,
0.08879246562719345,
0.07104138284921646,
-0.01033689547330141,
0.05763578414916992,
0.06925241649150848,
-0.07539515197277069,
0.09433097392320633,
0.0020018117502331734,
-0.17083758115768433,
0.008892307989299297,
-0.014630982652306557,
-0.0657598078250885,
0.07907469570636749,
0.02243478037416935,
0.1017458438873291,
0.012905209325253963,
0.09656047821044922,
-0.1209040954709053,
0.032367054373025894,
0.053239766508340836,
-0.07287812978029251,
0.058233652263879776,
0.10124292224645615,
-0.022808628156781197,
0.04567496478557587,
0.04814198240637779,
-0.07888012379407883,
-0.04692472144961357,
0.002205328084528446,
-0.0686926618218422,
0.0029725111089646816,
0.17442339658737183,
0.08222678303718567,
0.09386355429887772,
0.033619172871112823,
0.08351165801286697,
-0.10985086113214493,
0.09197793155908585,
0.15531349182128906,
-0.2883235812187195,
-0.05168340355157852,
0.12475688010454178,
-0.055335916578769684,
-0.08116699755191803,
-0.07614784687757492,
0.02528429590165615,
0.025595026090741158,
0.03798019513487816,
-0.1320396363735199,
0.003016637871041894,
-0.08507776260375977,
-0.009083965793251991,
-0.08740822225809097,
-0.03191455826163292,
0.05292462930083275,
0.09613290429115295,
0.06336227059364319,
0.008202709257602692,
-0.15647703409194946,
0.018130863085389137,
-0.10307986289262772,
0.042441532015800476,
0.02149241790175438,
0.067275770008564,
-0.08694674074649811,
-0.004259787499904633,
-0.06236248090863228,
-0.03700657933950424,
-0.14622560143470764,
-0.03756994381546974,
0.05694316327571869,
0.13426083326339722,
-0.11945349723100662,
0.02388521283864975,
0.042509231716394424,
-0.04344010725617409,
-0.0029546963050961494,
-0.08078747242689133,
0.025518620386719704,
0.04997631162405014,
-0.07385047525167465,
-0.02502470649778843,
0.09559834003448486,
0.19096587598323822,
-0.07777639478445053,
0.0001187514208140783,
-0.0227400790899992,
0.1358853280544281,
-0.09641478210687637,
0.004803012125194073,
-0.24019241333007812,
0.12341000884771347,
0.047787975519895554,
-0.08651825785636902,
0.01933131366968155,
-0.01360990945249796,
-0.08012346923351288,
-0.02154277265071869,
0.11409839987754822,
-0.03940955176949501,
-0.014007954858243465,
0.06998706609010696,
-0.0067107053473591805,
-0.046809300780296326,
0.13260303437709808,
-0.03522665798664093,
-0.03602919355034828,
-0.0014189488720148802,
-0.09538114070892334,
0.0831647515296936,
0.10425679385662079,
0.00047915717004798353,
-0.09727316349744797,
0.0012076866114512086,
-0.009992875158786774,
-0.0075620743446052074,
0.005454537458717823,
-0.03442119061946869,
0.04684619978070259,
0.01312637235969305,
0.05511101707816124,
-0.21489696204662323,
-0.13794296979904175,
0.007008381653577089,
0.11774508655071259,
-0.035103823989629745,
0.01038059126585722,
-0.014313616789877415,
-0.12717147171497345,
0.00088353524915874,
0.01826893724501133,
0.028877176344394684,
-0.07084662467241287,
0.01536470279097557,
-0.1302875131368637,
0.12408281117677689,
-0.11534235626459122,
0.03467843681573868,
-0.08917804807424545,
-0.03949877247214317,
-0.059021856635808945,
0.09360053390264511,
-0.08204589039087296,
0.08323214203119278,
-0.1347881704568863,
-0.07478741556406021,
-0.0721818134188652,
-0.03319288790225983,
0.009800100699067116,
0.06552385538816452,
-0.08868516236543655,
-0.007069632411003113,
0.1506652683019638,
-0.1080797016620636,
-0.08787131309509277,
0.0787888765335083,
-0.029739169403910637,
-0.10636372119188309,
-0.006061499938368797,
0.15861661732196808,
0.02009488083422184,
-0.00044696067925542593,
-0.057033244520425797,
0.03080037795007229,
-0.1264285296201706,
-0.12957990169525146,
-0.042226240038871765,
0.14414264261722565,
0.036545708775520325,
-0.003967799711972475,
-0.09106321632862091,
0.11369103938341141,
-0.08186923712491989,
-0.1199839785695076,
-0.04447270557284355,
-0.09281761199235916,
-0.017442218959331512,
0.046606648713350296,
0.10254108160734177,
0.06851201504468918,
-0.03964976966381073,
-0.16287793219089508,
0.05935221537947655,
-0.053767941892147064,
0.11264509707689285,
-0.09023580700159073,
0.14618809521198273,
-0.03631880506873131,
0.03942039981484413,
-0.2271261364221573,
0.0037451223470270634,
0.006692963652312756,
0.025654911994934082,
-0.04002070799469948,
-0.04804760590195656,
0.03472261503338814,
-0.06124555692076683,
-0.037881746888160706,
-0.07099372893571854,
0.06919199973344803,
-0.017336303368210793,
-0.0056592850014567375,
-0.12471749633550644,
-0.004258090164512396,
-0.04355792701244354,
0.009528310969471931,
-0.1335577368736267,
-0.0579373799264431,
0.026703525334596634,
0.1281375288963318,
-0.032961782068014145,
0.0849442407488823,
-0.02928367257118225,
-0.05969499051570892,
-0.02548728697001934,
-0.05772417411208153,
0.08932307362556458,
0.018509529531002045,
-0.019689591601490974,
0.06895270943641663,
0.09980149567127228,
0.1407838761806488,
0.14678868651390076,
-0.2790777385234833,
-0.0348324291408062,
0.08776640892028809,
-0.03440511226654053,
-0.011366543360054493,
-0.0063262502662837505,
0.16145378351211548,
0.10420236736536026,
-0.02123934030532837,
0.06882432103157043,
-0.05622074753046036,
0.09977924823760986,
0.013354779221117496,
-0.03261979669332504,
0.001565599930472672,
0.042327336966991425,
0.07457290589809418,
-0.27553266286849976,
0.08473141491413116,
0.11178620159626007,
-0.1633443832397461,
-0.047081828117370605,
-0.04709070920944214,
-0.06573440879583359,
0.10275769233703613,
0.0746748223900795,
-0.013180127367377281,
0.06280827522277832,
0.04453464224934578,
-0.020761745050549507,
0.05871095880866051,
-0.09012938290834427,
0.027295513078570366,
-0.12838539481163025,
0.027575237676501274,
-0.053989410400390625,
0.0013889839174225926,
0.014170208014547825,
0.0009559562313370407,
-0.006106038112193346,
0.10322964936494827,
-0.012323969975113869,
-0.24336908757686615,
0.05388518422842026,
-0.0692710131406784,
-0.026610393077135086,
0.20889420807361603,
-0.02321300096809864,
-0.23199371993541718,
-0.0055656167678534985,
-0.019664252176880836,
-0.08573207259178162,
-0.01714733988046646,
0.023969881236553192,
-0.12102287262678146,
-0.06320902705192566,
-0.018146121874451637,
-0.11812791228294373,
0.10130295902490616,
0.026430504396557808,
0.018606381490826607,
0.028723251074552536,
0.03749115765094757,
-0.05091865360736847,
0.019159525632858276,
-0.1391773670911789,
-0.09264305233955383,
0.1328350007534027,
-0.09187841415405273,
0.08127465099096298,
0.0894044041633606,
-0.015435841865837574,
0.018583057448267937,
0.02966197207570076,
0.1383998543024063,
-0.04676185920834541,
0.02628416381776333,
0.08772417157888412,
-0.007851332426071167,
0.05552775785326958,
0.08889970928430557,
0.010262190364301205,
-0.0671021044254303,
-0.007917865179479122,
0.029484594240784645,
-0.04345069080591202,
-0.10153865814208984,
-0.13409581780433655,
-0.03630233183503151,
-0.019164355471730232,
0.15264610946178436,
0.10590662807226181,
0.046543776988983154,
0.09944257140159607,
0.007047552149742842,
-0.02443081885576248,
0.049620579928159714,
0.06330501288175583,
0.10723187774419785,
0.04181775078177452,
0.10946711897850037,
-0.08099622279405594,
-0.06149454414844513,
0.03447727859020233,
0.1182028129696846,
0.2536126673221588,
-0.027221105992794037,
-0.11770214885473251,
0.051459718495607376,
0.2582510709762573,
0.10464436560869217,
0.09837324917316437,
-0.061470821499824524,
-0.0138448067009449,
0.0020618862472474575,
0.0018666799878701568,
0.017770258709788322,
0.0483766533434391,
-0.020649533718824387,
-0.07001943141222,
0.03887388855218887,
0.03695204481482506,
0.02390369586646557,
0.258247047662735,
0.09255616366863251,
-0.31363338232040405,
0.011298607103526592,
-0.04211970791220665,
0.0011315766023471951,
-0.013224882073700428,
0.13608911633491516,
-0.008856251835823059,
-0.03034311532974243,
0.0808427631855011,
-0.12929107248783112,
0.105548195540905,
-0.0596928596496582,
0.00908302515745163,
0.04564570263028145,
-0.11670666188001633,
0.08839692175388336,
0.05234584957361221,
-0.13692234456539154,
0.17888860404491425,
-0.04694308340549469,
-0.06804317981004715,
-0.0642307698726654,
-0.046314917504787445,
0.1065632775425911,
0.13876701891422272,
0.24046020209789276,
0.018666589632630348,
-0.020379992201924324,
0.09647641330957413,
-0.02241791971027851,
0.04029789939522743,
0.05599869787693024,
0.08281910419464111,
-0.05826065316796303,
0.03588256612420082,
-0.05268377065658569,
0.04219059646129608,
0.13466566801071167,
-0.028682613745331764,
-0.09895479679107666,
-0.05515173450112343,
0.09060158580541611,
-0.06060768663883209,
-0.03689161315560341,
-0.06343993544578552,
-0.07349005341529846,
0.1445467174053192,
0.02920627035200596,
-0.030155770480632782,
-0.0816371738910675,
0.08539882302284241,
0.06577175110578537,
-0.06421241164207458,
0.09870188683271408,
-0.05356317386031151,
0.03062264993786812,
-0.02040497399866581,
-0.2447350025177002,
0.17237316071987152,
-0.11618134379386902,
-0.002498186891898513,
0.02998235821723938,
-0.06463093310594559,
-0.010767578147351742,
-0.03234918415546417,
0.057523321360349655,
0.03823699802160263,
-0.12830853462219238,
-0.02827286720275879,
0.014363748021423817,
-0.043072547763586044,
0.007536548189818859,
0.08256740123033524,
-0.055902767926454544,
-0.07641460001468658,
-0.03684517741203308,
0.14975322782993317,
0.1525842547416687,
-0.03809528052806854,
-0.10362689197063446,
-0.0007729498320259154,
0.12559010088443756,
0.027014898136258125,
-0.38367536664009094,
-0.09660807251930237,
-0.02532103843986988,
-0.07208330184221268,
-0.1023724302649498,
-0.12772279977798462,
0.232082799077034,
-0.010238884948194027,
-0.06255071610212326,
0.009342599660158157,
-0.13344381749629974,
-0.03258803114295006,
0.2550565004348755,
0.07270651310682297,
0.2832612991333008,
-0.08478701859712601,
0.021171892061829567,
-0.0577959306538105,
0.02073069103062153,
0.08877568691968918,
-0.010730192996561527,
0.07228962332010269,
-0.005984586197882891,
0.05475492775440216,
-0.044135212898254395,
-0.02763454243540764,
-0.02596324495971203,
0.12057945877313614,
0.0955696776509285,
-0.0920691192150116,
0.0809861421585083,
0.06900393962860107,
-0.02016534097492695,
0.05261588469147682,
0.08318708837032318,
0.1131911501288414,
-0.0072629707865417,
0.02559034153819084,
-0.09668603539466858,
0.05036020651459694,
0.10124357044696808,
-0.0007186643779277802,
-0.10830526798963547,
0.057724252343177795,
0.11235657334327698,
0.08556625247001648,
0.2851227819919586,
0.12998051941394806,
0.0344356931746006,
0.08323144167661667,
-0.0005977783584967256,
-0.07938606292009354,
-0.08088691532611847,
-0.10363292694091797,
-0.03593085706233978,
0.14728516340255737,
-0.15633894503116608,
0.05648471415042877,
0.10251164436340332,
-0.0018438659608364105,
-0.006929542403668165,
0.10181069374084473,
0.012489527463912964,
-0.0388045497238636,
0.09822870045900345,
-0.06143118068575859,
-0.02728710137307644,
0.009140142239630222,
0.0771537721157074,
0.09767443686723709,
0.14628265798091888,
0.10874959826469421,
-0.10395081341266632,
0.0015316439094021916,
0.024928158149123192,
0.07173808664083481,
-0.05264761298894882,
0.057210348546504974,
0.1485542207956314,
-0.024875974282622337,
-0.14182400703430176,
0.2024713158607483,
0.007716131396591663,
-0.1532416194677353,
-0.02681300789117813,
-0.05688345432281494,
-0.14532528817653656,
-0.12669965624809265,
-0.019124627113342285,
-0.014368802309036255,
-0.24249330163002014,
-0.08756792545318604,
-0.0013886266387999058,
-0.06896759569644928,
0.11139054596424103,
0.018829097971320152,
0.1348402053117752,
-0.017084313556551933,
-0.03691637143492699,
-0.08343848586082458,
-0.03974944353103638,
0.01640848070383072,
0.011089473031461239,
0.07234101742506027,
-0.16317753493785858,
-0.21914057433605194,
0.016441499814391136,
0.15094470977783203,
-0.0770365372300148,
0.006736293900758028,
-0.0789196789264679,
0.09099479019641876,
-0.13514070212841034,
0.06894467025995255,
-0.05451934412121773,
0.026459258049726486,
-0.06115120276808739,
-0.026064623147249222,
-0.06154735013842583,
0.06033255159854889,
-0.0943601205945015,
-0.026242859661579132,
-0.0015038367127999663,
-0.02314211241900921,
-0.1170811653137207,
-0.028367381542921066,
-0.022388439625501633,
0.006144732236862183,
0.12467288970947266,
0.07836145162582397,
-0.10857654362916946,
0.09486746042966843,
-0.14739204943180084,
-0.2536733150482178,
0.15601123869419098,
0.04346269741654396,
-0.017867498099803925,
0.06017952784895897,
0.05076020583510399,
0.0780133605003357,
-0.04209841042757034,
-0.01907995156943798,
0.007165964227169752,
-0.04690227285027504,
-0.0287084449082613,
-0.24342355132102966,
0.019098417833447456,
-0.023638533428311348,
0.002177300164476037,
0.08405354619026184,
0.11291868984699249,
0.11838517338037491,
-0.056890975683927536,
-0.010702491737902164,
-0.03993707522749901,
-0.005133545026183128,
-0.005640697665512562,
-0.11139405518770218,
-0.009133181534707546,
-0.03522372618317604,
0.018449895083904266,
-0.07966573536396027,
0.16003455221652985,
-0.029215194284915924,
-0.09377360343933105,
0.021121300756931305,
0.20555411279201508,
-0.11971604079008102,
0.07453149557113647,
0.18945564329624176,
0.08543279767036438,
-0.010536270216107368,
0.05221593752503395,
0.042608872056007385,
0.09339313209056854,
0.19225643575191498,
0.006310656201094389,
0.24126020073890686,
0.04098702594637871,
0.12033035606145859,
0.07364070415496826,
-0.01805208995938301,
0.05692019313573837,
0.08576041460037231,
-0.09104881435632706,
0.10925757884979248,
0.010386821813881397,
-0.00026791475829668343,
0.13661837577819824,
0.03388603404164314,
0.003228752873837948,
0.012962564826011658,
-0.007726127281785011,
-0.0847293958067894,
-0.20259709656238556,
-0.10213108360767365,
-0.14240491390228271,
0.049576543271541595,
-0.030573664233088493,
-0.08014582842588425,
0.2144494503736496,
0.07954949885606766,
-0.04536881297826767,
0.048803865909576416,
-0.07060349732637405,
-0.0540546178817749,
0.16457046568393707,
-0.007596293464303017,
-0.10938747227191925,
0.14583732187747955,
-0.04033123701810837,
0.06775540113449097,
0.056563135236501694,
-0.01761476881802082,
0.0011049993336200714,
0.07413921505212784,
0.16372694075107574,
-0.037285007536411285,
-0.03915184736251831,
-0.03414053097367287,
0.027607912197709084,
-0.039629168808460236,
-0.00129211088642478,
-0.013559923507273197,
0.030781634151935577,
0.04801228269934654,
0.22920255362987518,
-0.10748083144426346,
-0.07911211997270584,
-0.11101612448692322,
-0.027552444487810135,
-0.0026486190035939217,
0.05918236821889877,
-0.047884322702884674,
-0.03441857546567917,
-0.021118419244885445,
0.20806078612804413,
0.2245340794324875,
-0.17976014316082,
0.003782497951760888,
0.07835979759693146,
0.004762518685311079,
-0.03985817730426788,
0.030241185799241066,
0.17143115401268005,
0.10579080879688263,
-0.06500456482172012,
-0.14078858494758606,
-0.06207782402634621,
-0.0547403059899807,
-0.11943667382001877,
0.017431167885661125,
0.05581012740731239,
-0.0789310410618782,
-0.1346222311258316,
0.11709317564964294,
-0.17880751192569733,
-0.013573181815445423,
0.07330986857414246,
-0.0724058449268341,
-0.11610129475593567,
-0.052212413400411606,
0.06766258180141449,
0.0945345088839531,
0.007894919253885746,
-0.0814443826675415,
0.04626024141907692,
0.025780200958251953,
0.023648468777537346,
-0.15046773850917816,
-0.03922741115093231,
-0.06622949987649918,
-0.02916972152888775,
0.12595254182815552,
0.04176109656691551,
-0.003283771686255932,
0.04386168345808983,
0.030787037685513496,
-0.06961338967084885,
0.01509823463857174,
-0.04830000549554825,
-0.11856786906719208,
0.0236346535384655,
0.05974964797496796,
-0.03216352313756943,
-0.006230559665709734,
0.05588853731751442,
-0.1672637015581131,
-0.023775266483426094,
0.061149027198553085,
0.013297474011778831,
-0.12113084644079208,
0.010515404865145683,
-0.06542761623859406,
0.0523252934217453,
0.042175039649009705,
0.0036929110065102577,
-0.002187030855566263,
-0.05263791233301163,
-0.031131984665989876,
0.007241427432745695,
-0.11730106174945831,
0.014715731143951416,
-0.022439535707235336,
-0.03526249900460243,
-0.23237381875514984,
0.06083716079592705,
0.010530458763241768,
-0.015237677842378616,
-0.03212392330169678,
-0.04222071170806885,
-0.07960522919893265,
0.07889780402183533,
0.11215606331825256,
-0.03934643417596817,
0.010694161057472229,
0.020177381113171577,
0.0675596222281456,
-0.004270496778190136,
-0.02292538806796074,
-0.11381756514310837
] |
null | null |
transformers
|
# resnet34
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with
# Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf):
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
import torch
from torch import nn
from functools import partial
# ResNet, SENetBasicBlock, ResNetStemC, ResNetBasicBlock and ResNetShorcutD
# are provided by the glasses package (their import paths are omitted in this card)

# change activation
ResNet.resnet18(activation=nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features: this activates the forward hooks and tells the model you'd like to get the intermediate features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
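To run the model on a real photo rather than random noise, a standard ImageNet preprocessing pipeline can be used. The sketch below is illustrative only: torchvision and PIL are assumed to be available (they are not stated dependencies of this card), `cat.jpg` is a hypothetical file name, and `from glasses.models import ResNet` is an assumed import path.
``` python
# Preprocessing + inference sketch with standard ImageNet statistics.
# Assumptions: torchvision/PIL are installed, "cat.jpg" is a placeholder,
# and ResNet is importable as `from glasses.models import ResNet`.
import torch
from PIL import Image
from torchvision import transforms
from glasses.models import ResNet  # assumed import path

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),  # ImageNet statistics
])

model = ResNet.resnet34()
model.eval()

img = Image.open("cat.jpg")       # hypothetical input image
x = preprocess(img).unsqueeze(0)  # (1, 3, 224, 224)
with torch.no_grad():
    probs = model(x).softmax(dim=-1)
print(probs.argmax(dim=-1))       # predicted ImageNet class index
```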
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet34
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet34
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet34\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet34\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
25
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet34\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.1078517958521843,
0.06416585296392441,
-0.00234150025062263,
0.08502304553985596,
0.09366682171821594,
0.007787173613905907,
-0.017818961292505264,
0.09075593203306198,
-0.09371036291122437,
-0.016322480514645576,
0.10387150943279266,
0.1036374643445015,
-0.00033664805232547224,
-0.05787540227174759,
-0.051756322383880615,
-0.23315881192684174,
0.08685462921857834,
0.06876730918884277,
-0.00691399909555912,
0.05769328027963638,
0.0693548321723938,
-0.08268929272890091,
0.09771660715341568,
0.0017359757330268621,
-0.17023710906505585,
0.007424066308885813,
-0.01630958914756775,
-0.06746795028448105,
0.07904238253831863,
0.02260967716574669,
0.09542319178581238,
0.005540897604078054,
0.09836136549711227,
-0.10961392521858215,
0.03222227096557617,
0.047420185059309006,
-0.07690152525901794,
0.0564880333840847,
0.09822343289852142,
-0.03226352855563164,
0.033462777733802795,
0.044883377850055695,
-0.0702739730477333,
-0.045364413410425186,
-0.0008410612936131656,
-0.07001597434282303,
0.004006607923656702,
0.16664491593837738,
0.08339382708072662,
0.0928485244512558,
0.032229233533144,
0.08709703385829926,
-0.11129921674728394,
0.09464201331138611,
0.16512274742126465,
-0.2876516878604889,
-0.05067875236272812,
0.1167699471116066,
-0.05501958727836609,
-0.0779646709561348,
-0.0709504708647728,
0.03014417365193367,
0.023071691393852234,
0.03564592823386192,
-0.12899723649024963,
-0.00025461631594225764,
-0.07149730622768402,
-0.006202190648764372,
-0.08683089911937714,
-0.028597233816981316,
0.053789082914590836,
0.09208669513463974,
0.06093524023890495,
-0.003751296317204833,
-0.150999054312706,
0.014971509575843811,
-0.10535909980535507,
0.048673491925001144,
0.02277657389640808,
0.062340088188648224,
-0.08853217959403992,
-0.008906364440917969,
-0.05859871953725815,
-0.037132080644369125,
-0.1429237574338913,
-0.04098385199904442,
0.05493505671620369,
0.13181166350841522,
-0.11309897899627686,
0.02358618564903736,
0.037360262125730515,
-0.04164297133684158,
-0.002624138258397579,
-0.07902245223522186,
0.02414860762655735,
0.04797358810901642,
-0.07634858787059784,
-0.024975253269076347,
0.09329458326101303,
0.20097211003303528,
-0.062374603003263474,
-0.001741955173201859,
-0.02523859404027462,
0.1389109343290329,
-0.09384404122829437,
0.010668880306184292,
-0.24026744067668915,
0.11297357827425003,
0.05136837810277939,
-0.0911683440208435,
0.021786900237202644,
-0.013482954353094101,
-0.08391447365283966,
-0.025641709566116333,
0.11054809391498566,
-0.03649674355983734,
-0.015742680057883263,
0.06972351670265198,
-0.009649358689785004,
-0.04314875230193138,
0.11967823654413223,
-0.0361110121011734,
-0.03316495940089226,
-0.005663897842168808,
-0.09895879030227661,
0.07731424272060394,
0.10561257600784302,
0.0003779275284614414,
-0.0976417139172554,
-0.0008241095929406583,
-0.009768558666110039,
-0.0010763156460598111,
0.004273626487702131,
-0.033310502767562866,
0.0423954539000988,
0.010286402888596058,
0.05906542390584946,
-0.2140941172838211,
-0.1525944471359253,
0.004351734649389982,
0.11483287811279297,
-0.04052777588367462,
0.010549039579927921,
-0.013441065326333046,
-0.12409289926290512,
-0.0037781938444823027,
0.016294335946440697,
0.02688021771609783,
-0.0729392021894455,
0.01777729019522667,
-0.13296400010585785,
0.12725672125816345,
-0.11230617761611938,
0.035007573664188385,
-0.0898716002702713,
-0.04185055196285248,
-0.07998884469270706,
0.09806983917951584,
-0.0749129131436348,
0.09593844413757324,
-0.13247551023960114,
-0.08018574863672256,
-0.06373870372772217,
-0.029111938551068306,
0.0064213452860713005,
0.06821128726005554,
-0.08691052347421646,
-0.007172157522290945,
0.14916929602622986,
-0.1057942807674408,
-0.08496440947055817,
0.08166230469942093,
-0.03407201170921326,
-0.10014966875314713,
0.001462846645154059,
0.15770110487937927,
0.017858140170574188,
0.003206998808309436,
-0.04834890738129616,
0.02381957322359085,
-0.1318516582250595,
-0.12755846977233887,
-0.036667753010988235,
0.14460664987564087,
0.029080575332045555,
-0.004262408707290888,
-0.08918670564889908,
0.11381614953279495,
-0.08128838986158371,
-0.12088748067617416,
-0.04114444553852081,
-0.09931882470846176,
-0.01588146574795246,
0.04523458704352379,
0.10662057250738144,
0.06344133615493774,
-0.041468679904937744,
-0.156510591506958,
0.05725562572479248,
-0.05345756560564041,
0.11024875938892365,
-0.0930134654045105,
0.16201846301555634,
-0.0323033444583416,
0.034158576279878616,
-0.23148319125175476,
0.013428816571831703,
0.004283322487026453,
0.022709928452968597,
-0.035179175436496735,
-0.05085311457514763,
0.03927731141448021,
-0.05182044953107834,
-0.03743662312626839,
-0.07073657214641571,
0.07058019936084747,
-0.019099170342087746,
-0.004565475974231958,
-0.12370716780424118,
-0.0016948806587606668,
-0.04261615499854088,
0.01583726890385151,
-0.12407778948545456,
-0.05651893839240074,
0.02104995772242546,
0.12481289356946945,
-0.03132827579975128,
0.08436566591262817,
-0.028810959309339523,
-0.05812137573957443,
-0.018336106091737747,
-0.05646185949444771,
0.0942455604672432,
0.014794635586440563,
-0.016939962282776833,
0.06944122165441513,
0.09531162679195404,
0.15113858878612518,
0.1529388278722763,
-0.2848644256591797,
-0.03844517841935158,
0.08412814140319824,
-0.0303720161318779,
-0.00500837666913867,
-0.005008958745747805,
0.15935750305652618,
0.11156634986400604,
-0.028345957398414612,
0.0731881782412529,
-0.05794014409184456,
0.09521468728780746,
0.011933470144867897,
-0.02735542505979538,
0.005865208804607391,
0.04621107131242752,
0.07892981171607971,
-0.26903679966926575,
0.0872034952044487,
0.10712949186563492,
-0.16329942643642426,
-0.04421592131257057,
-0.04598335921764374,
-0.06274720281362534,
0.10031771659851074,
0.06884995102882385,
-0.011481831781566143,
0.06610095500946045,
0.03617393970489502,
-0.02468104287981987,
0.05913214758038521,
-0.09769948571920395,
0.022731216624379158,
-0.1259291023015976,
0.023206664249300957,
-0.050631195306777954,
0.0024609726388007402,
0.01851615123450756,
-0.0011870256857946515,
-0.002183160511776805,
0.10695970803499222,
-0.010802571661770344,
-0.23544320464134216,
0.05149246007204056,
-0.06535456329584122,
-0.027081314474344254,
0.20728640258312225,
-0.025664271786808968,
-0.23591814935207367,
-0.008699852973222733,
-0.0202044490724802,
-0.07878505438566208,
-0.014902805909514427,
0.025398731231689453,
-0.11984258145093918,
-0.06608543545007706,
-0.019947988912463188,
-0.10918786376714706,
0.11169357597827911,
0.029063450172543526,
0.01822505332529545,
0.02718714438378811,
0.03645120933651924,
-0.050010472536087036,
0.013278418220579624,
-0.1412339210510254,
-0.09104181826114655,
0.1318555325269699,
-0.08613263815641403,
0.08068094402551651,
0.07988331466913223,
-0.015798751264810562,
0.019346358254551888,
0.023931991308927536,
0.14230050146579742,
-0.05200604349374771,
0.022157609462738037,
0.09464943408966064,
-0.009097786620259285,
0.05057407170534134,
0.08514080196619034,
0.008916812017560005,
-0.07367321103811264,
-0.009684515185654163,
0.02879285253584385,
-0.04375317692756653,
-0.09107618033885956,
-0.13568799197673798,
-0.03861597925424576,
-0.012195101007819176,
0.15807025134563446,
0.10356388986110687,
0.049366939812898636,
0.09906798601150513,
0.013551903888583183,
-0.026549244299530983,
0.05985688045620918,
0.06495383381843567,
0.11084293574094772,
0.03504382446408272,
0.11171076446771622,
-0.0786479189991951,
-0.06680470705032349,
0.03736544027924538,
0.10525161027908325,
0.24877765774726868,
-0.031344056129455566,
-0.12547142803668976,
0.04473013058304787,
0.24797303974628448,
0.10417084395885468,
0.10483837872743607,
-0.06638702005147934,
-0.014160490594804287,
0.002490339567884803,
0.00004925124812871218,
0.011409011669456959,
0.04546140506863594,
-0.011215273290872574,
-0.06968680024147034,
0.03705185279250145,
0.03230525553226471,
0.027809830382466316,
0.2434394806623459,
0.09378432482481003,
-0.32230344414711,
0.012198571115732193,
-0.042200516909360886,
-0.0015604153741151094,
-0.01487369928508997,
0.13633736968040466,
-0.009923468343913555,
-0.037534236907958984,
0.08092306554317474,
-0.12983299791812897,
0.10552115738391876,
-0.057495538145303726,
0.010228960774838924,
0.045186497271060944,
-0.12084759026765823,
0.08535774052143097,
0.04803455248475075,
-0.13493525981903076,
0.17135600745677948,
-0.046638790518045425,
-0.06685841828584671,
-0.06721781939268112,
-0.047734200954437256,
0.10890094935894012,
0.14026866853237152,
0.24706755578517914,
0.01635017991065979,
-0.005117442458868027,
0.08296952396631241,
-0.027923962101340294,
0.036792706698179245,
0.05086570605635643,
0.07688184827566147,
-0.058215852826833725,
0.03493623808026314,
-0.05415339395403862,
0.038201574236154556,
0.12668359279632568,
-0.020488906651735306,
-0.09994836896657944,
-0.06112697720527649,
0.07279560714960098,
-0.047729842364788055,
-0.033726099878549576,
-0.06523486226797104,
-0.07675198465585709,
0.13231493532657623,
0.035135459154844284,
-0.0273011215031147,
-0.08591755479574203,
0.09314460307359695,
0.0670660138130188,
-0.06067563220858574,
0.10685458034276962,
-0.04926742985844612,
0.023610306903719902,
-0.019824983552098274,
-0.2458852082490921,
0.17003285884857178,
-0.11489400267601013,
-0.0041056424379348755,
0.03108295053243637,
-0.06237322837114334,
-0.010872450657188892,
-0.027224112302064896,
0.05744864046573639,
0.042957909405231476,
-0.12524113059043884,
-0.024153342470526695,
0.016254808753728867,
-0.04475977644324303,
0.022259991616010666,
0.09042064845561981,
-0.05582931637763977,
-0.0782679095864296,
-0.0352356918156147,
0.15202446281909943,
0.15939359366893768,
-0.039877232164144516,
-0.1018470898270607,
0.008326358161866665,
0.11014033108949661,
0.031494684517383575,
-0.38498184084892273,
-0.09200738370418549,
-0.02591560035943985,
-0.0681241974234581,
-0.0971449613571167,
-0.13062684237957,
0.23503029346466064,
-0.009496727958321571,
-0.05807320773601532,
0.003252220805734396,
-0.1382996141910553,
-0.039965707808732986,
0.2539325952529907,
0.07455570250749588,
0.29065844416618347,
-0.08797235786914825,
0.025546399876475334,
-0.05702536553144455,
0.021143024787306786,
0.0860883966088295,
-0.0017172540538012981,
0.07347901165485382,
-0.0034125158563256264,
0.050988756120204926,
-0.04149466007947922,
-0.0288444384932518,
-0.025301309302449226,
0.11303262412548065,
0.09659125655889511,
-0.09955478459596634,
0.0749763622879982,
0.0611366331577301,
-0.017158038914203644,
0.0533350370824337,
0.08276082575321198,
0.11059486120939255,
-0.019377848133444786,
0.022276269271969795,
-0.092162124812603,
0.05030854418873787,
0.10542643815279007,
-0.00035592843778431416,
-0.10677848756313324,
0.061950765550136566,
0.10850336402654648,
0.08454927057027817,
0.2808890640735626,
0.12792345881462097,
0.03086087293922901,
0.06480798870325089,
0.005679367110133171,
-0.077435202896595,
-0.08571431785821915,
-0.10081524401903152,
-0.03671952337026596,
0.14402027428150177,
-0.1495334655046463,
0.057048145681619644,
0.10117295384407043,
-0.0027724318206310272,
-0.0019002350745722651,
0.10019248723983765,
0.02007550559937954,
-0.033331338316202164,
0.1006491631269455,
-0.06470560282468796,
-0.02010725624859333,
0.011026601307094097,
0.06505969911813736,
0.10847407579421997,
0.13895247876644135,
0.10744685679674149,
-0.10332892090082169,
-0.0023942766711115837,
0.024128051474690437,
0.07618660479784012,
-0.05322893336415291,
0.05764170363545418,
0.14432735741138458,
-0.026609618216753006,
-0.14650462567806244,
0.19945357739925385,
0.006920773535966873,
-0.15114715695381165,
-0.024085568264126778,
-0.05378938093781471,
-0.14225532114505768,
-0.12879052758216858,
-0.010297232307493687,
0.002540611196309328,
-0.24350813031196594,
-0.08308210223913193,
-0.0055212895385921,
-0.07110468298196793,
0.11667172610759735,
0.026235148310661316,
0.14029920101165771,
-0.021981967613101006,
-0.037032973021268845,
-0.08070900291204453,
-0.03730613365769386,
0.01867966540157795,
0.013353073969483376,
0.06495564430952072,
-0.15921998023986816,
-0.21225352585315704,
0.015131731517612934,
0.14910763502120972,
-0.08114264160394669,
0.006523645482957363,
-0.08282437175512314,
0.09121250361204147,
-0.12807254493236542,
0.0683450773358345,
-0.05915055423974991,
0.028551356866955757,
-0.0609150268137455,
-0.028936073184013367,
-0.06301652640104294,
0.056733280420303345,
-0.09662361443042755,
-0.026949485763907433,
0.00022908428218215704,
-0.023416142910718918,
-0.1171523779630661,
-0.031211167573928833,
-0.020037878304719925,
0.006629975512623787,
0.13182432949543,
0.08306774497032166,
-0.10507158190011978,
0.09268009662628174,
-0.14096896350383759,
-0.25741422176361084,
0.15605710446834564,
0.03999160975217819,
-0.02073906734585762,
0.058503128588199615,
0.05654039978981018,
0.07451257854700089,
-0.04594647139310837,
-0.014607124961912632,
0.014894380234181881,
-0.047207411378622055,
-0.018830522894859314,
-0.23502954840660095,
0.01915322244167328,
-0.02731771022081375,
0.0025174652691930532,
0.08024532347917557,
0.1042420044541359,
0.11625420302152634,
-0.058673519641160965,
-0.006841644644737244,
-0.039494045078754425,
-0.0045454297214746475,
-0.009046745486557484,
-0.1156606674194336,
-0.01345959771424532,
-0.03883248195052147,
0.022059671580791473,
-0.07743718475103378,
0.16718752682209015,
-0.016274170950055122,
-0.09989791363477707,
0.02539321407675743,
0.22099782526493073,
-0.11076832562685013,
0.07880803197622299,
0.1880061775445938,
0.08407921344041824,
-0.007453425321727991,
0.04956819862127304,
0.05298878252506256,
0.0999390035867691,
0.18572980165481567,
-0.0009344131103716791,
0.24642902612686157,
0.041285861283540726,
0.11455964297056198,
0.07700397074222565,
-0.017034299671649933,
0.05122702568769455,
0.08684340119361877,
-0.09287778288125992,
0.1050441786646843,
0.0045053171925246716,
0.0033234895672649145,
0.13132096827030182,
0.030101316049695015,
0.00787555892020464,
0.01676560752093792,
-0.006440748926252127,
-0.084710031747818,
-0.20542214810848236,
-0.10008502006530762,
-0.14449577033519745,
0.04941685497760773,
-0.030844727531075478,
-0.08182225376367569,
0.21365511417388916,
0.08004872500896454,
-0.044611137360334396,
0.05065540224313736,
-0.08524183928966522,
-0.0521015040576458,
0.1618679016828537,
-0.00746593251824379,
-0.10709381848573685,
0.1411702036857605,
-0.04840584471821785,
0.06319332122802734,
0.056392498314380646,
-0.017261186614632607,
0.001796058495528996,
0.06663236767053604,
0.15990477800369263,
-0.03958315774798393,
-0.04014969244599342,
-0.03309986740350723,
0.024831464514136314,
-0.04205085337162018,
0.0010783624602481723,
-0.015722930431365967,
0.029710881412029266,
0.046195127069950104,
0.23287121951580048,
-0.10322661697864532,
-0.07730865478515625,
-0.11425979435443878,
-0.038379210978746414,
-0.000944469531532377,
0.05825576186180115,
-0.04915105924010277,
-0.03739920258522034,
-0.019834578037261963,
0.21473003923892975,
0.23361168801784515,
-0.1815827488899231,
0.003792667528614402,
0.08348725736141205,
0.0032148512545973063,
-0.03477222099900246,
0.030058175325393677,
0.17133331298828125,
0.10702500492334366,
-0.06601510941982269,
-0.1339905858039856,
-0.061861466616392136,
-0.05891899764537811,
-0.12068799138069153,
0.014891120605170727,
0.057775020599365234,
-0.07664979994297028,
-0.13206547498703003,
0.12080417573451996,
-0.17646853625774384,
-0.014058238826692104,
0.08517751842737198,
-0.08706455677747726,
-0.12048816680908203,
-0.05168529972434044,
0.07121478766202927,
0.09925652295351028,
0.008151611313223839,
-0.07808956503868103,
0.04450925067067146,
0.01953057013452053,
0.02768966183066368,
-0.15836085379123688,
-0.043381527066230774,
-0.06387636810541153,
-0.02257734350860119,
0.12505720555782318,
0.03794407099485397,
-0.004796248860657215,
0.04436364024877548,
0.027754096314311028,
-0.0671442523598671,
0.01218322105705738,
-0.05020962655544281,
-0.126357764005661,
0.02419084496796131,
0.06330031901597977,
-0.03133649006485939,
-0.004876499064266682,
0.05349989980459213,
-0.1618708223104477,
-0.02231539413332939,
0.05163424089550972,
0.010943755507469177,
-0.115318164229393,
0.012001347728073597,
-0.07097861915826797,
0.05372319743037224,
0.04271674528717995,
0.004605445545166731,
-0.0020652723032981157,
-0.05198301747441292,
-0.03038272075355053,
0.009204634465277195,
-0.13008476793766022,
0.008718626573681831,
-0.027051802724599838,
-0.036572184413671494,
-0.23943059146404266,
0.05829143524169922,
0.012110799551010132,
-0.019062599167227745,
-0.035969313234090805,
-0.040985655039548874,
-0.08599618077278137,
0.07875609397888184,
0.11215661466121674,
-0.04302123934030533,
0.012138733640313148,
0.014194143004715443,
0.07268233597278595,
-0.0029958600644022226,
-0.02326280251145363,
-0.11015208810567856
] |
null | null |
transformers
|
# resnet34d
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
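For completeness, a minimal classification sketch is given below. The `from glasses.models import ResNet` import path and the dummy input are assumptions made for illustration; real images should go through the usual ImageNet preprocessing before being fed to the model.

``` python
# minimal inference sketch for resnet34d (import path assumed, not taken from this card)
import torch
from glasses.models import ResNet  # assumed import location

model = ResNet.resnet34d()
model.eval()
x = torch.rand((1, 3, 224, 224))  # dummy batch standing in for a preprocessed image
with torch.no_grad():
    logits = model(x)  # shape [1, 1000] with the default classifier head
probs = torch.softmax(logits, dim=-1)
print(probs.argmax(dim=-1))
```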
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet34d
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet34d
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet34d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet34d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
26
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet34d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.10663411766290665,
0.06266716867685318,
-0.0025797979906201363,
0.08240601420402527,
0.09026387333869934,
0.00850965827703476,
-0.020855458453297615,
0.08491533249616623,
-0.10096833854913712,
-0.017272258177399635,
0.10417962074279785,
0.09696290642023087,
0.0017954673385247588,
-0.062395453453063965,
-0.055916108191013336,
-0.22810620069503784,
0.08666086196899414,
0.07146931439638138,
-0.006728045642375946,
0.056231096386909485,
0.07008690387010574,
-0.07531458884477615,
0.09665065258741379,
0.003381091170012951,
-0.1654333472251892,
0.00833373423665762,
-0.014870937913656235,
-0.06686452031135559,
0.0804000273346901,
0.023197613656520844,
0.10051446408033371,
0.01065147202461958,
0.09701023250818253,
-0.12178745865821838,
0.03247091546654701,
0.05163164436817169,
-0.0743408277630806,
0.057653188705444336,
0.09775275737047195,
-0.027548594400286674,
0.04142870008945465,
0.04701175540685654,
-0.07663067430257797,
-0.046808455139398575,
0.0012556869769468904,
-0.07696612924337387,
0.0010126415872946382,
0.1750352531671524,
0.08178213238716125,
0.0901111364364624,
0.03508037328720093,
0.08421376347541809,
-0.11072792857885361,
0.09576356410980225,
0.1555246114730835,
-0.29127106070518494,
-0.04993903636932373,
0.1294516623020172,
-0.05343438312411308,
-0.08012812584638596,
-0.07362467795610428,
0.030125439167022705,
0.028052784502506256,
0.03764399141073227,
-0.12889844179153442,
0.0008590296492911875,
-0.07776016741991043,
-0.007170213386416435,
-0.08670881390571594,
-0.029892368242144585,
0.052002862095832825,
0.09430528432130814,
0.05977023020386696,
0.0052799503318965435,
-0.15408539772033691,
0.013986174017190933,
-0.10496940463781357,
0.04786541312932968,
0.02293376997113228,
0.06675469130277634,
-0.08637342602014542,
-0.0061091287061572075,
-0.06026177108287811,
-0.03548738732933998,
-0.1438739001750946,
-0.03721560165286064,
0.05715852603316307,
0.13406632840633392,
-0.1174483373761177,
0.024882560595870018,
0.03727061673998833,
-0.04442346468567848,
-0.003455567406490445,
-0.08032555878162384,
0.028504706919193268,
0.04843803495168686,
-0.07505153119564056,
-0.020925166085362434,
0.09753691405057907,
0.20273371040821075,
-0.07278715074062347,
7.704577598133255e-8,
-0.026993894949555397,
0.1353864222764969,
-0.09648687392473221,
0.008980742655694485,
-0.23844987154006958,
0.11389152705669403,
0.05219205468893051,
-0.0862988531589508,
0.02226787619292736,
-0.013027098961174488,
-0.08214025944471359,
-0.024261659011244774,
0.11195523291826248,
-0.0359184592962265,
-0.013230246491730213,
0.07217402011156082,
-0.006085786037147045,
-0.043998654931783676,
0.12953919172286987,
-0.03594839200377464,
-0.033581748604774475,
-0.004753801506012678,
-0.09667541831731796,
0.08333476632833481,
0.10755088180303574,
0.0010759257711470127,
-0.09605655819177628,
-0.0004116283089388162,
-0.009257188998162746,
-0.0053159017115831375,
0.004602136090397835,
-0.03556114435195923,
0.045996084809303284,
0.01055858377367258,
0.05736031383275986,
-0.21638064086437225,
-0.14485839009284973,
0.007649707607924938,
0.11772217601537704,
-0.03726200386881828,
0.012168968096375465,
-0.011894594877958298,
-0.12519030272960663,
-0.0013226880691945553,
0.017375288531184196,
0.027322815731167793,
-0.07218582183122635,
0.01586104743182659,
-0.12668797373771667,
0.12714655697345734,
-0.1166638433933258,
0.03397528454661369,
-0.08715125173330307,
-0.03728339076042175,
-0.06383568048477173,
0.0957135409116745,
-0.0809738039970398,
0.08763927966356277,
-0.13268719613552094,
-0.07686236500740051,
-0.06747809052467346,
-0.03126152232289314,
0.010807090438902378,
0.06374724209308624,
-0.09054949879646301,
-0.0051462589763104916,
0.14961549639701843,
-0.10661565512418747,
-0.08835494518280029,
0.08193126320838928,
-0.03220784664154053,
-0.10583214461803436,
-0.002788968151435256,
0.1592472344636917,
0.020567089319229126,
-0.004025466740131378,
-0.05268765613436699,
0.030382508412003517,
-0.12738247215747833,
-0.12824909389019012,
-0.04337916523218155,
0.143508180975914,
0.030533170327544212,
-0.0026114643551409245,
-0.08929619193077087,
0.11506260186433792,
-0.0814124271273613,
-0.1207139790058136,
-0.04357542097568512,
-0.0946323424577713,
-0.016963202506303787,
0.042996641248464584,
0.10384852439165115,
0.06595370173454285,
-0.039194539189338684,
-0.16014017164707184,
0.0617159865796566,
-0.05405658483505249,
0.11470291763544083,
-0.09165727347135544,
0.15302206575870514,
-0.03298540413379669,
0.03860859200358391,
-0.22628039121627808,
0.00868472084403038,
0.006272733677178621,
0.02761356718838215,
-0.040086086839437485,
-0.049960244446992874,
0.036576662212610245,
-0.05768560245633125,
-0.038899585604667664,
-0.07108283042907715,
0.0717100203037262,
-0.01707218959927559,
-0.00555753568187356,
-0.12403235584497452,
-0.004687022417783737,
-0.04278644174337387,
0.009836627170443535,
-0.13318295776844025,
-0.05629671737551689,
0.02176009677350521,
0.1282583326101303,
-0.03271099179983139,
0.08575421571731567,
-0.02887284941971302,
-0.05761990323662758,
-0.02405974455177784,
-0.057441603392362595,
0.09048927575349808,
0.016810981556773186,
-0.021294288337230682,
0.06767066568136215,
0.09864785522222519,
0.14430293440818787,
0.14876002073287964,
-0.2814581096172333,
-0.03539619594812393,
0.08491768687963486,
-0.03195199742913246,
-0.00977583322674036,
-0.0031261579133570194,
0.15905755758285522,
0.10588086396455765,
-0.02381320483982563,
0.07333049178123474,
-0.058115795254707336,
0.09877080470323563,
0.011412372812628746,
-0.03404882922768593,
0.003444389207288623,
0.04262740910053253,
0.07268993556499481,
-0.26970207691192627,
0.08360566943883896,
0.10770080238580704,
-0.1644284725189209,
-0.04628980532288551,
-0.04738771542906761,
-0.06407830119132996,
0.1008235290646553,
0.07204553484916687,
-0.012593165971338749,
0.06268662959337234,
0.04345965385437012,
-0.02070915512740612,
0.0597408264875412,
-0.09162929654121399,
0.026576917618513107,
-0.12697312235832214,
0.027410538867115974,
-0.05291253700852394,
0.0004550526791717857,
0.013954860158264637,
0.0028207916766405106,
-0.00572828808799386,
0.10327432304620743,
-0.013128659687936306,
-0.2427389770746231,
0.05435066297650337,
-0.06663716584444046,
-0.02450689487159252,
0.20686839520931244,
-0.026340432465076447,
-0.2333219200372696,
-0.00812257919460535,
-0.025164658203721046,
-0.08635555952787399,
-0.016884643584489822,
0.023714249953627586,
-0.11775031685829163,
-0.06410988420248032,
-0.01874004863202572,
-0.11578202992677689,
0.10177434235811234,
0.028115814551711082,
0.020690031349658966,
0.027771467342972755,
0.03809254243969917,
-0.052996937185525894,
0.01818116381764412,
-0.13909178972244263,
-0.0916367843747139,
0.1336081475019455,
-0.09077799320220947,
0.08086255192756653,
0.08465695381164551,
-0.01561637595295906,
0.018847214058041573,
0.02862333133816719,
0.1438281387090683,
-0.046541567891836166,
0.02756035141646862,
0.09086232632398605,
-0.008669505827128887,
0.05509725585579872,
0.08956002444028854,
0.009871375747025013,
-0.07072296738624573,
-0.009356296621263027,
0.02921168878674507,
-0.04498228058218956,
-0.09732070565223694,
-0.13358880579471588,
-0.03733151778578758,
-0.01509855780750513,
0.15246844291687012,
0.10444171726703644,
0.05223323032259941,
0.09705822914838791,
0.009540400467813015,
-0.01966770552098751,
0.052745915949344635,
0.06384242326021194,
0.11330311000347137,
0.03839455917477608,
0.11110398918390274,
-0.07993345707654953,
-0.06186263635754585,
0.035715650767087936,
0.11703848093748093,
0.24970203638076782,
-0.029611803591251373,
-0.11802702397108078,
0.050053562968969345,
0.25743335485458374,
0.10424941033124924,
0.10023844242095947,
-0.06518525630235672,
-0.013458236120641232,
0.0029295890126377344,
0.0020051314495503902,
0.016078416258096695,
0.04710938036441803,
-0.0151828583329916,
-0.07221406698226929,
0.041223157197237015,
0.03378244861960411,
0.025850730016827583,
0.25204360485076904,
0.09047306329011917,
-0.31579843163490295,
0.013303140178322792,
-0.041317168623209,
0.001115666818805039,
-0.013518699444830418,
0.13549846410751343,
-0.012810569256544113,
-0.03157666698098183,
0.08437461405992508,
-0.12704682350158691,
0.10585933178663254,
-0.05950066074728966,
0.009131534025073051,
0.04352932423353195,
-0.11675748974084854,
0.08536087721586227,
0.05010621249675751,
-0.1381482183933258,
0.1738159954547882,
-0.04538165032863617,
-0.0661570131778717,
-0.06299344450235367,
-0.04515937343239784,
0.10717643052339554,
0.14459119737148285,
0.2393876016139984,
0.018568027764558792,
-0.01238631084561348,
0.08987193554639816,
-0.025097204372286797,
0.037073805928230286,
0.05352447181940079,
0.08328204602003098,
-0.056895043700933456,
0.03599392995238304,
-0.05419522523880005,
0.03950351104140282,
0.12978409230709076,
-0.02754281833767891,
-0.09699511528015137,
-0.05752978473901749,
0.08927419781684875,
-0.058573149144649506,
-0.0379679836332798,
-0.06317681819200516,
-0.0774725005030632,
0.14059403538703918,
0.03678009659051895,
-0.03207731619477272,
-0.084086112678051,
0.08390861004590988,
0.06864505261182785,
-0.06323539465665817,
0.10206756740808487,
-0.053268566727638245,
0.02554726041853428,
-0.021409323439002037,
-0.2446519285440445,
0.1696479171514511,
-0.11591935157775879,
-0.0026274232659488916,
0.03209784999489784,
-0.06033959239721298,
-0.01362787839025259,
-0.032423779368400574,
0.05861835181713104,
0.03785013034939766,
-0.12621545791625977,
-0.02840626984834671,
0.013129308819770813,
-0.0437743216753006,
0.01028844341635704,
0.08428390324115753,
-0.05634355545043945,
-0.07730869948863983,
-0.03448837250471115,
0.15064369142055511,
0.15419402718544006,
-0.032993707805871964,
-0.10190680623054504,
0.002215571003034711,
0.12198209762573242,
0.02699892967939377,
-0.38117456436157227,
-0.0972355306148529,
-0.026425676420331,
-0.07289024442434311,
-0.10000443458557129,
-0.12858203053474426,
0.23032833635807037,
-0.01181001402437687,
-0.06002086400985718,
0.0133829889819026,
-0.13119107484817505,
-0.03594842180609703,
0.2542762756347656,
0.0717364251613617,
0.28821590542793274,
-0.08434607088565826,
0.021502502262592316,
-0.05664222314953804,
0.017226360738277435,
0.08440021425485611,
-0.006454742979258299,
0.07352755218744278,
-0.006387354806065559,
0.05453796684741974,
-0.04260491579771042,
-0.02761930227279663,
-0.02363644912838936,
0.1196756362915039,
0.09787290543317795,
-0.09518947452306747,
0.07980943471193314,
0.06112094968557358,
-0.019599972292780876,
0.0552191361784935,
0.08116167783737183,
0.1115986630320549,
-0.013125208206474781,
0.023629004135727882,
-0.09441519528627396,
0.049225639551877975,
0.10094085335731506,
-0.0016560080694034696,
-0.10889913886785507,
0.058693110942840576,
0.11041615158319473,
0.08473145216703415,
0.282647967338562,
0.12966403365135193,
0.030323507264256477,
0.08018336445093155,
-0.0014775533927604556,
-0.07936140149831772,
-0.08195724338293076,
-0.10190777480602264,
-0.03652529790997505,
0.14625351130962372,
-0.15476760268211365,
0.058046888560056686,
0.10142460465431213,
-0.003515344811603427,
-0.004511717241257429,
0.09963853657245636,
0.013201853260397911,
-0.03938370570540428,
0.09975063800811768,
-0.05949321761727333,
-0.02159920334815979,
0.00933679100126028,
0.06390702724456787,
0.10265210270881653,
0.14505618810653687,
0.11073655635118484,
-0.10330956429243088,
0.0013941333163529634,
0.02437627501785755,
0.07289507240056992,
-0.05299433693289757,
0.056293610483407974,
0.14836385846138,
-0.02558731473982334,
-0.142761692404747,
0.2004031538963318,
0.007153174374252558,
-0.14591045677661896,
-0.02773299627006054,
-0.05580175295472145,
-0.14544843137264252,
-0.12652911245822906,
-0.02002773806452751,
-0.0070371669717133045,
-0.24284972250461578,
-0.08725205063819885,
-0.0026494869962334633,
-0.06840402632951736,
0.11082568019628525,
0.020517803728580475,
0.13654565811157227,
-0.018877241760492325,
-0.03724747151136398,
-0.08409211039543152,
-0.03826636075973511,
0.016808979213237762,
0.012757709249854088,
0.07043676823377609,
-0.15998119115829468,
-0.21586491167545319,
0.01779991388320923,
0.1492941975593567,
-0.07748519629240036,
0.0061237202025949955,
-0.08046205341815948,
0.09135448187589645,
-0.14075103402137756,
0.06808174401521683,
-0.05579008162021637,
0.02959297224879265,
-0.060313355177640915,
-0.02738124690949917,
-0.0622473806142807,
0.06036174297332764,
-0.09471488744020462,
-0.026415294036269188,
0.000012402507309161592,
-0.02325165458023548,
-0.11711729317903519,
-0.03168699890375137,
-0.02274339646100998,
0.006244945339858532,
0.12623895704746246,
0.07787534594535828,
-0.10739535838365555,
0.0935598611831665,
-0.14473195374011993,
-0.25329819321632385,
0.15572145581245422,
0.043212950229644775,
-0.01821015402674675,
0.05900263413786888,
0.05261820927262306,
0.07649558037519455,
-0.043117620050907135,
-0.0177951380610466,
0.009368974715471268,
-0.04833832010626793,
-0.02743338979780674,
-0.24106992781162262,
0.01986059918999672,
-0.024270744994282722,
0.0005723746144212782,
0.08145328611135483,
0.11357534676790237,
0.1178043782711029,
-0.05676089599728584,
-0.010694949887692928,
-0.04015904292464256,
-0.00628627510741353,
-0.008014784194529057,
-0.11335990577936172,
-0.0074537936598062515,
-0.03720979392528534,
0.019213996827602386,
-0.07870681583881378,
0.16414302587509155,
-0.026452939957380295,
-0.09839363396167755,
0.02022711932659149,
0.20878969132900238,
-0.1183675080537796,
0.0729454755783081,
0.19041520357131958,
0.08282139152288437,
-0.008985204622149467,
0.04981527477502823,
0.04567864164710045,
0.09822385758161545,
0.18759122490882874,
0.0006843663286417723,
0.23957215249538422,
0.0419287346303463,
0.11846556514501572,
0.07206334173679352,
-0.01989566534757614,
0.05501877889037132,
0.08052743971347809,
-0.0924445390701294,
0.10888862609863281,
0.008880934678018093,
-0.00016961923392955214,
0.13438661396503448,
0.03152930364012718,
0.0036826336290687323,
0.015770815312862396,
-0.008549893274903297,
-0.08409658074378967,
-0.20381750166416168,
-0.10121655464172363,
-0.1419173628091812,
0.0499177947640419,
-0.030972806736826897,
-0.07978028059005737,
0.20989318192005157,
0.07996361702680588,
-0.044545795768499374,
0.047311004251241684,
-0.07971575856208801,
-0.05293315649032593,
0.16194580495357513,
-0.006579478271305561,
-0.10842814296483994,
0.14905963838100433,
-0.04262187331914902,
0.0630495548248291,
0.05411782115697861,
-0.019507981836795807,
0.0010246996534988284,
0.07382596284151077,
0.16453810036182404,
-0.03735506534576416,
-0.039316773414611816,
-0.034465041011571884,
0.026430999860167503,
-0.03616274893283844,
0.0040198503993451595,
-0.012813916429877281,
0.03054066374897957,
0.047079410403966904,
0.22940649092197418,
-0.10418298095464706,
-0.08116281777620316,
-0.11339039355516434,
-0.03233417123556137,
-0.0016533068846911192,
0.057887014001607895,
-0.04838891327381134,
-0.03394397348165512,
-0.019399255514144897,
0.2163688987493515,
0.23005901277065277,
-0.1784253716468811,
0.003403998911380768,
0.07783377170562744,
0.00459303380921483,
-0.03641972690820694,
0.030095838010311127,
0.1720006763935089,
0.10257065296173096,
-0.06338154524564743,
-0.139459028840065,
-0.06337332725524902,
-0.05682053416967392,
-0.1219799742102623,
0.018620919436216354,
0.0543178953230381,
-0.07939878851175308,
-0.13332286477088928,
0.1173991933465004,
-0.1793241649866104,
-0.016505282372236252,
0.08038211613893509,
-0.07392954081296921,
-0.11640121787786484,
-0.052594318985939026,
0.06999798119068146,
0.09480347484350204,
0.006487789563834667,
-0.08023468405008316,
0.04480746015906334,
0.027089226990938187,
0.02295546606183052,
-0.15180692076683044,
-0.04127948731184006,
-0.06449252367019653,
-0.020337071269750595,
0.12539608776569366,
0.041229840368032455,
-0.0020161771681159735,
0.0442793108522892,
0.030828215181827545,
-0.06689956039190292,
0.01768985576927662,
-0.04918045178055763,
-0.11991960555315018,
0.023081546649336815,
0.057860855013132095,
-0.03269415348768234,
-0.006737724412232637,
0.053662609308958054,
-0.16235144436359406,
-0.023165393620729446,
0.0634247362613678,
0.01032467745244503,
-0.11967051029205322,
0.008318452164530754,
-0.06670427322387695,
0.05190761387348175,
0.04145975038409233,
0.0030990950763225555,
-0.0029674030374735594,
-0.05261784791946411,
-0.030305378139019012,
0.008733028545975685,
-0.12592953443527222,
0.011123795993626118,
-0.022913217544555664,
-0.036236509680747986,
-0.2333022952079773,
0.059410080313682556,
0.01700194925069809,
-0.015627853572368622,
-0.03435743227601051,
-0.040365856140851974,
-0.08416574448347092,
0.07968061417341232,
0.11234153062105179,
-0.039115626364946365,
0.013071536086499691,
0.01932278461754322,
0.06790951639413834,
-0.00627303309738636,
-0.024042915552854538,
-0.11382149904966354
] |
null | null |
transformers
|
# resnet50
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
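Building on the feature-extraction snippet above, the sketch below shows one way to turn the resnet50 feature maps into a single pooled vector for downstream use; the glasses import path is an assumption rather than something documented in this card.

``` python
# hypothetical: use resnet50 as a frozen feature extractor (import path assumed)
import torch
from glasses.models import ResNet  # assumed import location

model = ResNet.resnet50()
model.eval()
model.encoder.features  # register the forward hooks, as in the snippet above
with torch.no_grad():
    model(torch.randn((1, 3, 224, 224)))
deepest = model.encoder.features[-1]  # last stored feature map
pooled = torch.flatten(torch.nn.functional.adaptive_avg_pool2d(deepest, 1), 1)
print(pooled.shape)
```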
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet50
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet50
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet50\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet50\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
25
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet50\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.11057435721158981,
0.06110876053571701,
-0.002370847389101982,
0.08305384963750839,
0.09363015741109848,
0.008916938677430153,
-0.016629531979560852,
0.08732534945011139,
-0.09587264060974121,
-0.01693911664187908,
0.10304729640483856,
0.10274473577737808,
0.0036121357697993517,
-0.051843490451574326,
-0.054280295968055725,
-0.22855201363563538,
0.08617226034402847,
0.06894126534461975,
-0.010205981321632862,
0.05874195694923401,
0.0672886073589325,
-0.08398712426424026,
0.09836944192647934,
0.00044743288890458643,
-0.17384620010852814,
0.00724453292787075,
-0.015325023792684078,
-0.06735949218273163,
0.07630950957536697,
0.02299652062356472,
0.09868544340133667,
0.007834143005311489,
0.0988115668296814,
-0.10621637850999832,
0.033065564930438995,
0.04801742359995842,
-0.0765245333313942,
0.058749862015247345,
0.09815714508295059,
-0.02882036752998829,
0.03171780705451965,
0.047256942838430405,
-0.07236875593662262,
-0.045730866491794586,
0.0013588331639766693,
-0.05981668457388878,
0.00579294515773654,
0.16765989363193512,
0.082844078540802,
0.09178705513477325,
0.03266944736242294,
0.08544698357582092,
-0.10886497050523758,
0.09432120621204376,
0.16537348926067352,
-0.285129189491272,
-0.05067320913076401,
0.12002696841955185,
-0.05803721398115158,
-0.07528962939977646,
-0.06937790662050247,
0.029848482459783554,
0.022322067990899086,
0.03399904444813728,
-0.13022299110889435,
0.0015042201848700643,
-0.0702357217669487,
-0.00833648256957531,
-0.08776015788316727,
-0.026851428672671318,
0.05570752173662186,
0.09513194859027863,
0.06133588030934334,
-0.002616697922348976,
-0.15311218798160553,
0.01173115149140358,
-0.10694798827171326,
0.04313192889094353,
0.021752221509814262,
0.060959819704294205,
-0.09011740237474442,
-0.007049486506730318,
-0.057787977159023285,
-0.03846004232764244,
-0.14489339292049408,
-0.04680173844099045,
0.055034950375556946,
0.13332298398017883,
-0.11646410077810287,
0.02088809758424759,
0.032671138644218445,
-0.039116423577070236,
-0.0030598649755120277,
-0.08007938414812088,
0.01888432726264,
0.045319706201553345,
-0.07609034329652786,
-0.03639479726552963,
0.09115209430456161,
0.19530048966407776,
-0.06565526127815247,
-0.0009217634215019643,
-0.02265094593167305,
0.13964532315731049,
-0.0934877097606659,
0.006057173479348421,
-0.24335135519504547,
0.10937432199716568,
0.049754898995161057,
-0.09103279560804367,
0.020357556641101837,
-0.014348645694553852,
-0.082007996737957,
-0.02562868222594261,
0.11309292912483215,
-0.03992823511362076,
-0.018111873418092728,
0.06996358931064606,
-0.007959662936627865,
-0.04464735835790634,
0.11955094337463379,
-0.034291934221982956,
-0.03416840732097626,
-0.006009568925946951,
-0.09743790328502655,
0.08057480305433273,
0.10342540591955185,
0.00006920219311723486,
-0.1001247838139534,
0.003007955849170685,
-0.008349901996552944,
-0.0038608969189226627,
0.0019904260989278555,
-0.0345754474401474,
0.0407676063477993,
0.014529092237353325,
0.05739889293909073,
-0.21220183372497559,
-0.15208785235881805,
0.0027280135545879602,
0.11690965294837952,
-0.038269106298685074,
0.009942564181983471,
-0.014834990724921227,
-0.12586872279644012,
-0.003106295131146908,
0.01727442629635334,
0.030839279294013977,
-0.07050121575593948,
0.015706386417150497,
-0.1338430494070053,
0.12814727425575256,
-0.11249783635139465,
0.035205163061618805,
-0.0871734470129013,
-0.045443568378686905,
-0.08270318806171417,
0.10050103068351746,
-0.07247528433799744,
0.0926034227013588,
-0.13315513730049133,
-0.07768500596284866,
-0.07192523777484894,
-0.03253323584794998,
0.007796092890202999,
0.0671079158782959,
-0.09231191128492355,
-0.009036104194819927,
0.1525260955095291,
-0.10730720311403275,
-0.08527527004480362,
0.07827960699796677,
-0.03204121068120003,
-0.10266973823308945,
0.0010178430238738656,
0.15438884496688843,
0.013830064795911312,
0.0027533448301255703,
-0.049923546612262726,
0.020926086232066154,
-0.13461770117282867,
-0.13219477236270905,
-0.039984334260225296,
0.14600475132465363,
0.03077097050845623,
-0.006394379306584597,
-0.08914828300476074,
0.11215228587388992,
-0.08175133913755417,
-0.121660515666008,
-0.04223698750138283,
-0.10066410899162292,
-0.022084636613726616,
0.044506367295980453,
0.10724126547574997,
0.0687427818775177,
-0.04016562178730965,
-0.16074995696544647,
0.0543670617043972,
-0.0526922270655632,
0.1114555150270462,
-0.09438393265008926,
0.15735402703285217,
-0.03233962133526802,
0.037544135004282,
-0.2334451526403427,
0.015728242695331573,
0.004661776591092348,
0.014507968910038471,
-0.036011070013046265,
-0.051585789769887924,
0.039526134729385376,
-0.052906617522239685,
-0.03899231553077698,
-0.07067462801933289,
0.06822367012500763,
-0.018291527405381203,
-0.005778072401881218,
-0.12503734230995178,
-0.0003210203431081027,
-0.04061923921108246,
0.012453213334083557,
-0.12312589585781097,
-0.05498277768492699,
0.020486943423748016,
0.12299712747335434,
-0.028654733672738075,
0.08201127499341965,
-0.029794098809361458,
-0.059644944965839386,
-0.01948322169482708,
-0.055188797414302826,
0.09185962378978729,
0.013748654164373875,
-0.016932429745793343,
0.06189814209938049,
0.0983743965625763,
0.1490049958229065,
0.15145248174667358,
-0.28209686279296875,
-0.04134281724691391,
0.08477777987718582,
-0.029015135020017624,
-0.005880036391317844,
-0.010171339847147465,
0.16262243688106537,
0.10398929566144943,
-0.024999666959047318,
0.06953219324350357,
-0.056375470012426376,
0.09743473678827286,
0.013290339149534702,
-0.02277267724275589,
0.007546048145741224,
0.04406674578785896,
0.08469508588314056,
-0.2696918547153473,
0.0843084454536438,
0.1098283901810646,
-0.16608871519565582,
-0.047581057995557785,
-0.04376834258437157,
-0.06434053927659988,
0.10042010992765427,
0.06813864409923553,
-0.01042772177606821,
0.06705936044454575,
0.038402628153562546,
-0.024169905111193657,
0.05960957333445549,
-0.09907650202512741,
0.020625188946723938,
-0.12341239303350449,
0.024997275322675705,
-0.05218677595257759,
0.003924868535250425,
0.013652498833835125,
-0.002571340184658766,
-0.0026511100586503744,
0.1072186529636383,
-0.011291561648249626,
-0.2343510389328003,
0.05049818009138107,
-0.06736823916435242,
-0.02519099973142147,
0.20697946846485138,
-0.021275101229548454,
-0.23714233934879303,
-0.0023459785152226686,
-0.009875355288386345,
-0.07865087687969208,
-0.01601773500442505,
0.023276394233107567,
-0.12206621468067169,
-0.06595312058925629,
-0.019387656822800636,
-0.11033911257982254,
0.11346433311700821,
0.03152226284146309,
0.018982132896780968,
0.027705121785402298,
0.034335896372795105,
-0.04713920131325722,
0.01565987803041935,
-0.14429739117622375,
-0.08958631753921509,
0.134237140417099,
-0.08522836863994598,
0.08080718666315079,
0.08183486014604568,
-0.018334748223423958,
0.022400202229619026,
0.025182364508509636,
0.14242598414421082,
-0.053502555936574936,
0.024051502346992493,
0.08917859196662903,
-0.00481907045468688,
0.05139172077178955,
0.08887139707803726,
0.010073483921587467,
-0.0707792118191719,
-0.009617376141250134,
0.029901789501309395,
-0.04193395748734474,
-0.09094997495412827,
-0.13218720257282257,
-0.03870818391442299,
-0.014412220567464828,
0.15652693808078766,
0.10248380154371262,
0.0514657236635685,
0.1013571172952652,
0.012413885444402695,
-0.029351573437452316,
0.06130119785666466,
0.06393162906169891,
0.10369168967008591,
0.03937603160738945,
0.11220325529575348,
-0.07712441682815552,
-0.06679026037454605,
0.03548608720302582,
0.10686542093753815,
0.2517772316932678,
-0.02728000096976757,
-0.12653231620788574,
0.04302870109677315,
0.24795417487621307,
0.1071787178516388,
0.10781723260879517,
-0.06476612389087677,
-0.015461957082152367,
0.004052072297781706,
0.0013169990852475166,
0.013685528188943863,
0.04467802867293358,
-0.012220704928040504,
-0.0680510401725769,
0.04191054031252861,
0.03975171595811844,
0.02737625502049923,
0.2476390153169632,
0.08967793732881546,
-0.3178331255912781,
0.014077506959438324,
-0.04563356935977936,
-0.002170677063986659,
-0.014764118008315563,
0.13490161299705505,
-0.010480270721018314,
-0.03679652884602547,
0.08400950580835342,
-0.1325051486492157,
0.1075475811958313,
-0.055267926305532455,
0.009510025382041931,
0.041340894997119904,
-0.11724348366260529,
0.08430223166942596,
0.045858174562454224,
-0.14139725267887115,
0.1684866100549698,
-0.04650413617491722,
-0.06590882688760757,
-0.06678328663110733,
-0.049304623156785965,
0.1067797988653183,
0.14076288044452667,
0.24722151458263397,
0.016175130382180214,
-0.009463467635214329,
0.08937432616949081,
-0.025469906628131866,
0.036327000707387924,
0.0512634739279747,
0.07979979366064072,
-0.06066567823290825,
0.03714223951101303,
-0.05466289073228836,
0.03902142122387886,
0.12906292080879211,
-0.027726715430617332,
-0.10135727375745773,
-0.05970463156700134,
0.07754530757665634,
-0.04856919124722481,
-0.033487141132354736,
-0.0663217231631279,
-0.07374840974807739,
0.13085678219795227,
0.02321264147758484,
-0.027138441801071167,
-0.08656205981969833,
0.09874700009822845,
0.06365169584751129,
-0.06104379519820213,
0.10412514209747314,
-0.05067809298634529,
0.026727600023150444,
-0.020741060376167297,
-0.24518124759197235,
0.1710207164287567,
-0.11496090143918991,
-0.0017653867835178971,
0.029922939836978912,
-0.06815991550683975,
-0.00890851579606533,
-0.025872904807329178,
0.05599632486701012,
0.042100559920072556,
-0.1254725307226181,
-0.02340213768184185,
0.016938190907239914,
-0.046793170273303986,
0.018173007294535637,
0.09129753708839417,
-0.05511399358510971,
-0.0852094367146492,
-0.03590357303619385,
0.15016694366931915,
0.15844173729419708,
-0.04822835326194763,
-0.10121719539165497,
0.005961747840046883,
0.11279258877038956,
0.03412611037492752,
-0.38591763377189636,
-0.09069553762674332,
-0.020840244367718697,
-0.06812196224927902,
-0.09998053312301636,
-0.13098767399787903,
0.23924846947193146,
-0.00835910439491272,
-0.05781365558505058,
0.0019008716335520148,
-0.13649457693099976,
-0.04049983248114586,
0.2547994554042816,
0.07443545013666153,
0.2898275554180145,
-0.08661329746246338,
0.026200316846370697,
-0.06025071069598198,
0.027319634333252907,
0.09145048260688782,
-0.002183191478252411,
0.07218459248542786,
-0.0035784519277513027,
0.05309772863984108,
-0.043211501091718674,
-0.02914038486778736,
-0.03230411931872368,
0.11713922768831253,
0.0973319560289383,
-0.09580893069505692,
0.07457905262708664,
0.06447568535804749,
-0.01657583750784397,
0.05274677649140358,
0.08495622873306274,
0.1092836856842041,
-0.006973343435674906,
0.023495592176914215,
-0.09248892217874527,
0.05185490474104881,
0.10570494085550308,
0.00022536961478181183,
-0.10676421970129013,
0.061611372977495193,
0.10957784950733185,
0.08455520123243332,
0.28612273931503296,
0.1288166642189026,
0.029437355697155,
0.06526925414800644,
0.006381248589605093,
-0.08356381952762604,
-0.08496030420064926,
-0.09973497688770294,
-0.03611249104142189,
0.14590692520141602,
-0.14515668153762817,
0.0579538457095623,
0.10121101140975952,
-0.0026939131785184145,
-0.003979512490332127,
0.10081344842910767,
0.020701218396425247,
-0.03697602450847626,
0.09985209256410599,
-0.06267818063497543,
-0.02369348518550396,
0.0135270394384861,
0.07148730754852295,
0.10466545075178146,
0.14458172023296356,
0.10450264811515808,
-0.10384752601385117,
-0.0006409338675439358,
0.023844962939620018,
0.07599367201328278,
-0.04996068403124809,
0.06214142218232155,
0.1459653526544571,
-0.025940045714378357,
-0.14723718166351318,
0.19921383261680603,
0.007703766226768494,
-0.15362991392612457,
-0.024242429062724113,
-0.05664427950978279,
-0.1461056023836136,
-0.12677431106567383,
-0.00984749011695385,
0.0016829227097332478,
-0.24825966358184814,
-0.08354926109313965,
-0.0028052590787410736,
-0.06965681910514832,
0.11841939389705658,
0.023903874680399895,
0.13808134198188782,
-0.02466263435781002,
-0.036827269941568375,
-0.08153674751520157,
-0.03754005581140518,
0.020540280267596245,
0.009699733927845955,
0.06835024058818817,
-0.15821076929569244,
-0.21495775878429413,
0.016007686033844948,
0.1484939306974411,
-0.0819639042019844,
0.008703581057488918,
-0.08000131696462631,
0.09154799580574036,
-0.1196097806096077,
0.06857402622699738,
-0.05707267299294472,
0.026831451803445816,
-0.061873659491539,
-0.0303728599101305,
-0.06414040923118591,
0.05668967217206955,
-0.09561898559331894,
-0.02640458382666111,
-0.0007337297429330647,
-0.026523329317569733,
-0.11689674109220505,
-0.027731267735362053,
-0.022431163117289543,
0.008937041275203228,
0.12932416796684265,
0.08248884230852127,
-0.1064169779419899,
0.09231410920619965,
-0.1427963376045227,
-0.25753989815711975,
0.15355323255062103,
0.04272597283124924,
-0.020533937960863113,
0.061094287782907486,
0.05701110139489174,
0.07654395699501038,
-0.04248789697885513,
-0.015607049688696861,
0.01402067020535469,
-0.04555386304855347,
-0.01895076408982277,
-0.24003657698631287,
0.020314885303378105,
-0.028725331649184227,
0.004891878925263882,
0.0790984258055687,
0.1050933301448822,
0.11939425766468048,
-0.05668111890554428,
-0.009109794162213802,
-0.03822692856192589,
-0.0024180645123124123,
-0.005507031921297312,
-0.11255349963903427,
-0.012921446934342384,
-0.03792883828282356,
0.024611158296465874,
-0.07815030962228775,
0.16761553287506104,
-0.019275017082691193,
-0.09771624207496643,
0.026384666562080383,
0.21998552978038788,
-0.1159711629152298,
0.07594031095504761,
0.18753959238529205,
0.08964227885007858,
-0.008613601326942444,
0.053309567272663116,
0.05202756077051163,
0.09845338761806488,
0.18747130036354065,
0.0046854265965521336,
0.24382343888282776,
0.033509768545627594,
0.11348364502191544,
0.07479789108037949,
-0.016467107459902763,
0.05900097265839577,
0.0943262055516243,
-0.09692385047674179,
0.10347729176282883,
0.0065419599413871765,
0.008635831996798515,
0.13484372198581696,
0.033782292157411575,
0.008241761475801468,
0.01788397692143917,
-0.0075412024743855,
-0.0863887146115303,
-0.20913170278072357,
-0.10135328024625778,
-0.14526152610778809,
0.049039077013731,
-0.030109738931059837,
-0.08283232152462006,
0.21329014003276825,
0.08191082626581192,
-0.0444047674536705,
0.05041904002428055,
-0.08229702711105347,
-0.05182911828160286,
0.16193436086177826,
-0.00799791794270277,
-0.10890422761440277,
0.14349503815174103,
-0.04735071212053299,
0.06745811551809311,
0.05856436863541603,
-0.015377763658761978,
0.0008887018193490803,
0.0649518296122551,
0.16105009615421295,
-0.035814475268125534,
-0.03949003294110298,
-0.03376667574048042,
0.022412043064832687,
-0.04488307982683182,
-0.003723498433828354,
-0.01734836772084236,
0.03213300555944443,
0.0457436628639698,
0.231280118227005,
-0.10527919977903366,
-0.0796126201748848,
-0.11437960714101791,
-0.04921618103981018,
0.003307838225737214,
0.060388050973415375,
-0.04900201037526131,
-0.03926939144730568,
-0.02140144631266594,
0.21373619139194489,
0.23200015723705292,
-0.18036162853240967,
0.0029470433946698904,
0.08515207469463348,
0.0037311429623514414,
-0.0359521359205246,
0.025137610733509064,
0.17068380117416382,
0.10243489593267441,
-0.0639943778514862,
-0.12725548446178436,
-0.06172699108719826,
-0.059238553047180176,
-0.11875715106725693,
0.01597895845770836,
0.05887097865343094,
-0.07684276252985,
-0.132748544216156,
0.12311342358589172,
-0.17163318395614624,
-0.020538272336125374,
0.08128272742033005,
-0.08173912018537521,
-0.1192740723490715,
-0.052617188543081284,
0.07144375890493393,
0.10029935091733932,
0.004722044337540865,
-0.0762564092874527,
0.04704541340470314,
0.017213396728038788,
0.028615260496735573,
-0.1580985188484192,
-0.04625185206532478,
-0.06574802100658417,
-0.03154773637652397,
0.12106268107891083,
0.03983557969331741,
-0.007094154600054026,
0.0441833958029747,
0.026375316083431244,
-0.06543345004320145,
0.014637075364589691,
-0.052136585116386414,
-0.12164755910634995,
0.026299603283405304,
0.06647562980651855,
-0.03177488222718239,
-0.0037715863436460495,
0.053026385605335236,
-0.1625436246395111,
-0.02627970091998577,
0.04895251244306564,
0.011873782612383366,
-0.11632055789232254,
0.017472166568040848,
-0.06673784554004669,
0.05301910266280174,
0.042184680700302124,
0.005609584040939808,
0.0014141976134851575,
-0.05560048297047615,
-0.0313647985458374,
0.010851002298295498,
-0.12668976187705994,
0.009415562264621258,
-0.027494745329022408,
-0.03393568843603134,
-0.2453015148639679,
0.060204971581697464,
0.016764119267463684,
-0.014101377688348293,
-0.0381845161318779,
-0.04182219132781029,
-0.08568400889635086,
0.07637161761522293,
0.11443252861499786,
-0.04191189259290695,
0.01145494170486927,
0.01692335307598114,
0.07320648431777954,
-0.0030721838120371103,
-0.024510227143764496,
-0.10847880691289902
] |
null | null |
transformers
|
# resnet50d
Implementation of ResNet proposed in [Deep Residual Learning for Image
Recognition](https://arxiv.org/abs/1512.03385)
``` python
ResNet.resnet18()
ResNet.resnet26()
ResNet.resnet34()
ResNet.resnet50()
ResNet.resnet101()
ResNet.resnet152()
ResNet.resnet200()
# Variants (d) proposed in "Bag of Tricks for Image Classification with Convolutional Neural Networks" (https://arxiv.org/pdf/1812.01187.pdf)
ResNet.resnet26d()
ResNet.resnet34d()
ResNet.resnet50d()
# You can construct your own one by changing `stem` and `block`
resnet101d = ResNet.resnet101(stem=ResNetStemC, block=partial(ResNetBottleneckBlock, shortcut=ResNetShorcutD))
```
Examples:
``` python
# change activation
ResNet.resnet18(activation = nn.SELU)
# change number of classes (default is 1000)
ResNet.resnet18(n_classes=100)
# pass a different block
ResNet.resnet18(block=SENetBasicBlock)
# change the stem
model = ResNet.resnet18(stem=ResNetStemC)
# change the shortcut
model = ResNet.resnet18(block=partial(ResNetBasicBlock, shortcut=ResNetShorcutD))
# store each feature
x = torch.rand((1, 3, 224, 224))
# get features
model = ResNet.resnet18()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
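As a quick sanity check on the d-variant constructors listed above, the following hedged sketch runs a forward pass and prints the output shape; the import path is assumed rather than documented here.

``` python
# hypothetical shape check for resnet50d (import path assumed)
import torch
from glasses.models import ResNet  # assumed import location

model = ResNet.resnet50d()
model.eval()
with torch.no_grad():
    out = model(torch.rand((1, 3, 224, 224)))
print(out.shape)  # expected torch.Size([1, 1000]) with the default number of classes
```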
|
{"license": "apache-2.0", "tags": ["image-classification"], "datasets": ["imagenet"]}
|
image-classification
|
glasses/resnet50d
|
[
"transformers",
"pytorch",
"image-classification",
"dataset:imagenet",
"arxiv:1512.03385",
"arxiv:1812.01187",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1512.03385",
"1812.01187"
] |
[] |
TAGS
#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us
|
# resnet50d
Implementation of ResNet proposed in Deep Residual Learning for Image
Recognition
Examples:
|
[
"# resnet50d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# resnet50d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
58,
26
] |
[
"passage: TAGS\n#transformers #pytorch #image-classification #dataset-imagenet #arxiv-1512.03385 #arxiv-1812.01187 #license-apache-2.0 #endpoints_compatible #region-us \n# resnet50d\nImplementation of ResNet proposed in Deep Residual Learning for Image\nRecognition\n\n \n\n Examples:"
] |
[
-0.11013618111610413,
0.058866456151008606,
-0.0025584178511053324,
0.08036098629236221,
0.09021938592195511,
0.008882137015461922,
-0.018993953242897987,
0.08281222730875015,
-0.10184378921985626,
-0.016679344698786736,
0.1033768579363823,
0.09755716472864151,
0.006327469367533922,
-0.05575801432132721,
-0.058336492627859116,
-0.2250393033027649,
0.08625666797161102,
0.07196581363677979,
-0.010509785264730453,
0.057082805782556534,
0.06830887496471405,
-0.07817383110523224,
0.09731023013591766,
0.002389611443504691,
-0.16950513422489166,
0.007241262588649988,
-0.014495478011667728,
-0.0676562711596489,
0.07743435353040695,
0.024199604988098145,
0.10235138237476349,
0.01351426262408495,
0.09696593880653381,
-0.11823657155036926,
0.033470526337623596,
0.05195653438568115,
-0.074016273021698,
0.06008748710155487,
0.09926985949277878,
-0.0230576042085886,
0.03793264180421829,
0.04832003265619278,
-0.07760335505008698,
-0.047311101108789444,
0.0032555111683905125,
-0.06686419993638992,
0.003676336957141757,
0.17555294930934906,
0.08366633206605911,
0.08809398859739304,
0.03492303565144539,
0.08233031630516052,
-0.10800287127494812,
0.09508511424064636,
0.15606121718883514,
-0.2898436486721039,
-0.050611063838005066,
0.1335255652666092,
-0.05752557888627052,
-0.07769700884819031,
-0.07168760895729065,
0.03046269342303276,
0.027401315048336983,
0.035054221749305725,
-0.12785126268863678,
0.003257647855207324,
-0.07581596821546555,
-0.009078777395188808,
-0.08794311434030533,
-0.028942346572875977,
0.05459093675017357,
0.09712275862693787,
0.060959719121456146,
0.003684513969346881,
-0.15612851083278656,
0.010873609222471714,
-0.10700933635234833,
0.041585274040699005,
0.02262459509074688,
0.06529225409030914,
-0.08742669224739075,
-0.004497324116528034,
-0.0601780079305172,
-0.03587799891829491,
-0.14612101018428802,
-0.04380853474140167,
0.05666597932577133,
0.13447429239749908,
-0.1217445433139801,
0.02316221594810486,
0.030067777261137962,
-0.042445775121450424,
-0.003967743366956711,
-0.08125124871730804,
0.023041995242238045,
0.04557614028453827,
-0.07363557815551758,
-0.03368255868554115,
0.09528405219316483,
0.1963794231414795,
-0.0710662305355072,
0.0004003799695055932,
-0.025756118819117546,
0.13649247586727142,
-0.09514295309782028,
0.006305375602096319,
-0.24239081144332886,
0.10964379459619522,
0.05108898878097534,
-0.08454709500074387,
0.020605795085430145,
-0.014334868639707565,
-0.08126374334096909,
-0.025987161323428154,
0.11578483134508133,
-0.03884299099445343,
-0.014141022227704525,
0.07240292429924011,
-0.004668113309890032,
-0.04589882120490074,
0.12776756286621094,
-0.03351142629981041,
-0.03488510474562645,
-0.005172558594495058,
-0.09489426016807556,
0.08544399589300156,
0.10440967231988907,
0.0016927301185205579,
-0.0983976349234581,
0.005048348568379879,
-0.007748846895992756,
-0.007579405326396227,
0.0023074664641171694,
-0.03655180707573891,
0.04429752007126808,
0.013749317266047001,
0.05501324310898781,
-0.21427088975906372,
-0.14422105252742767,
0.006564343348145485,
0.11890041083097458,
-0.034929309040308,
0.011754854582250118,
-0.014430826529860497,
-0.12643522024154663,
-0.000608885835390538,
0.018560107797384262,
0.03146406635642052,
-0.0701201856136322,
0.01458902657032013,
-0.12840873003005981,
0.12770336866378784,
-0.115522600710392,
0.03404203802347183,
-0.08508177846670151,
-0.04043007642030716,
-0.06621413677930832,
0.09802058339118958,
-0.07778587937355042,
0.08447006344795227,
-0.1335202306509018,
-0.07505080103874207,
-0.07675281167030334,
-0.03508443757891655,
0.012239482253789902,
0.06402824074029922,
-0.09779489040374756,
-0.007530574221163988,
0.15127032995224,
-0.10773646086454391,
-0.08944281935691833,
0.07944732159376144,
-0.03035222925245762,
-0.10753653198480606,
-0.003230958478525281,
0.1570502668619156,
0.015872638672590256,
-0.004285202361643314,
-0.053732022643089294,
0.02735043130815029,
-0.12960079312324524,
-0.13097955286502838,
-0.045725252479314804,
0.14341136813163757,
0.03151419758796692,
-0.003892164444550872,
-0.09122609347105026,
0.11317813396453857,
-0.08210745453834534,
-0.12173013389110565,
-0.045410238206386566,
-0.09562284499406815,
-0.02301075868308544,
0.042371716350317,
0.10466304421424866,
0.07088357955217361,
-0.03689604252576828,
-0.16397376358509064,
0.05920518934726715,
-0.05301700904965401,
0.11498146504163742,
-0.09293324500322342,
0.15073825418949127,
-0.03421701490879059,
0.041589196771383286,
-0.2277975231409073,
0.013343456201255322,
0.007432924117892981,
0.018651096150279045,
-0.041149962693452835,
-0.052725650370121,
0.03674997761845589,
-0.05885487422347069,
-0.03957076370716095,
-0.07213937491178513,
0.06931445747613907,
-0.01651032455265522,
-0.008504834026098251,
-0.12540017068386078,
-0.0033968540374189615,
-0.04043594375252724,
0.006499027833342552,
-0.13137464225292206,
-0.05442264303565025,
0.020468203350901604,
0.12664808332920074,
-0.029959220439195633,
0.08318504691123962,
-0.028783785179257393,
-0.05961045250296593,
-0.025445325300097466,
-0.056147363036870956,
0.08873787522315979,
0.015451252460479736,
-0.022647114470601082,
0.060279086232185364,
0.10016681998968124,
0.14417487382888794,
0.14785581827163696,
-0.2791288197040558,
-0.03610103204846382,
0.08318453282117844,
-0.03046068362891674,
-0.010360256768763065,
-0.008590630255639553,
0.16227436065673828,
0.09728638827800751,
-0.020843757316470146,
0.07029058784246445,
-0.05733630433678627,
0.10052717477083206,
0.013293103314936161,
-0.028557972982525826,
0.003230944275856018,
0.04051607474684715,
0.0790020301938057,
-0.26994478702545166,
0.08154607564210892,
0.11348576843738556,
-0.16502545773983002,
-0.048949360847473145,
-0.04436739906668663,
-0.06611815840005875,
0.10097832977771759,
0.07151468098163605,
-0.011682022362947464,
0.06450260430574417,
0.04288865625858307,
-0.019747000187635422,
0.06063434109091759,
-0.09298107028007507,
0.023934870958328247,
-0.12490696460008621,
0.02889920026063919,
-0.0533871091902256,
0.0025020132306963205,
0.01136727537959814,
0.001969820586964488,
-0.005711964797228575,
0.10452274978160858,
-0.01317919697612524,
-0.24025116860866547,
0.053138259798288345,
-0.06863868981599808,
-0.023259669542312622,
0.20709538459777832,
-0.02253665216267109,
-0.23709897696971893,
-0.0012773909838870168,
-0.012632977217435837,
-0.08661480247974396,
-0.018623441457748413,
0.0218474343419075,
-0.11934877187013626,
-0.06403521448373795,
-0.018411308526992798,
-0.1164906769990921,
0.10261902213096619,
0.031293634325265884,
0.020940862596035004,
0.028116164728999138,
0.03565181791782379,
-0.05050681531429291,
0.02004294842481613,
-0.14157770574092865,
-0.09024613350629807,
0.13590306043624878,
-0.0887993723154068,
0.08146604895591736,
0.0866362452507019,
-0.017376311123371124,
0.02198989875614643,
0.029252277687191963,
0.14430148899555206,
-0.04864242300391197,
0.028499139472842216,
0.08507245779037476,
-0.00464752409607172,
0.055156409740448,
0.09405902028083801,
0.01085165236145258,
-0.06864354014396667,
-0.009535660035908222,
0.029354404658079147,
-0.0423581637442112,
-0.09555590897798538,
-0.13068416714668274,
-0.03749823570251465,
-0.015297556295990944,
0.15286245942115784,
0.10326950997114182,
0.05375361815094948,
0.09904046356678009,
0.007519393693655729,
-0.023209331557154655,
0.05520624294877052,
0.06398613750934601,
0.10555625706911087,
0.0423981174826622,
0.11164025962352753,
-0.07702305912971497,
-0.061430152505636215,
0.03414154797792435,
0.11678522080183029,
0.25172874331474304,
-0.025120822712779045,
-0.12016415596008301,
0.04792468622326851,
0.2581348717212677,
0.10710182040929794,
0.10234980285167694,
-0.06356934458017349,
-0.01435105875134468,
0.003933298401534557,
0.001986251212656498,
0.018748808652162552,
0.04639533907175064,
-0.01580573059618473,
-0.0706828311085701,
0.044078171253204346,
0.04021504893898964,
0.026434557512402534,
0.25624334812164307,
0.08673045784235,
-0.31039193272590637,
0.015224340371787548,
-0.044764455407857895,
0.0006678320933133364,
-0.014101340435445309,
0.1339186728000641,
-0.013752535916864872,
-0.03173369541764259,
0.08624032139778137,
-0.13019046187400818,
0.10801737755537033,
-0.056699372828006744,
0.008714049123227596,
0.03792354837059975,
-0.11422809958457947,
0.0835791528224945,
0.04793596267700195,
-0.147930309176445,
0.17103137075901031,
-0.044564589858055115,
-0.06464369595050812,
-0.06333084404468536,
-0.04731345549225807,
0.10526358336210251,
0.1459021419286728,
0.24016989767551422,
0.01786646991968155,
-0.014611069113016129,
0.0960930660367012,
-0.024293916299939156,
0.0364249087870121,
0.054115794599056244,
0.08450630307197571,
-0.06003527715802193,
0.03821689635515213,
-0.05449036508798599,
0.040679603815078735,
0.13269861042499542,
-0.035664886236190796,
-0.09789157658815384,
-0.055079828947782516,
0.09145376831293106,
-0.057638105005025864,
-0.037867363542318344,
-0.06448346376419067,
-0.07537859678268433,
0.13796180486679077,
0.022556958720088005,
-0.030754828825592995,
-0.08463621884584427,
0.09157833456993103,
0.06431333720684052,
-0.06333547085523605,
0.10108828544616699,
-0.05473686754703522,
0.03030758537352085,
-0.022672777995467186,
-0.24506515264511108,
0.1706928163766861,
-0.11656863987445831,
-0.0014105222653597593,
0.030834907665848732,
-0.06636442989110947,
-0.012324702925980091,
-0.030884921550750732,
0.05695987120270729,
0.03706052154302597,
-0.12585461139678955,
-0.02671845071017742,
0.014204328879714012,
-0.0415431372821331,
0.005877264309674501,
0.08553599566221237,
-0.055032823234796524,
-0.08697111159563065,
-0.03577720373868942,
0.14844463765621185,
0.15304940938949585,
-0.038475945591926575,
-0.1014939621090889,
0.000378021621145308,
0.12451021373271942,
0.02902880869805813,
-0.38172057271003723,
-0.09378284960985184,
-0.021719126030802727,
-0.07220946252346039,
-0.1015581414103508,
-0.12862223386764526,
0.23349285125732422,
-0.00983565766364336,
-0.06022379547357559,
0.011994882486760616,
-0.13448329269886017,
-0.037619028240442276,
0.25545212626457214,
0.0724317878484726,
0.28854745626449585,
-0.08329257369041443,
0.022086165845394135,
-0.05860389396548271,
0.022345468401908875,
0.0911136344075203,
-0.005119471810758114,
0.07175254076719284,
-0.007088341284543276,
0.05330482870340347,
-0.04303666576743126,
-0.028109300881624222,
-0.029886234551668167,
0.12169938534498215,
0.09971115738153458,
-0.09192460030317307,
0.07904079556465149,
0.06277627497911453,
-0.019517283886671066,
0.054228488355875015,
0.08338416367769241,
0.11039991676807404,
-0.0014956410741433501,
0.024954332038760185,
-0.09468600153923035,
0.05186142772436142,
0.10096395760774612,
-0.0021818203385919333,
-0.1095815971493721,
0.05821122229099274,
0.11145371943712234,
0.08504587411880493,
0.2879163920879364,
0.13028526306152344,
0.027957553043961525,
0.07753931730985641,
0.0004324228793848306,
-0.08470003306865692,
-0.08146607130765915,
-0.10148530453443527,
-0.03578236326575279,
0.14923611283302307,
-0.1508295238018036,
0.05878316983580589,
0.10200051963329315,
-0.0031146055553108454,
-0.0057748425751924515,
0.0999978706240654,
0.014677220024168491,
-0.042934659868478775,
0.09871091693639755,
-0.05848903954029083,
-0.02681938372552395,
0.011312843300402164,
0.0729362964630127,
0.10012257844209671,
0.14911207556724548,
0.10658128559589386,
-0.10447868704795837,
0.002205262193456292,
0.02471749298274517,
0.07277620583772659,
-0.04833606630563736,
0.06058976426720619,
0.14834462106227875,
-0.02520046941936016,
-0.14313581585884094,
0.2019582837820053,
0.007632486056536436,
-0.15103885531425476,
-0.026815883815288544,
-0.05893099308013916,
-0.14938007295131683,
-0.12462624907493591,
-0.018598072230815887,
-0.006014574784785509,
-0.2470603585243225,
-0.08669088780879974,
0.0006324631976895034,
-0.06578341871500015,
0.112404465675354,
0.019119588658213615,
0.13490250706672668,
-0.02176036313176155,
-0.0366152785718441,
-0.08418815582990646,
-0.037311870604753494,
0.019594090059399605,
0.009333913214504719,
0.0739486888051033,
-0.1583561897277832,
-0.2176860272884369,
0.018937615677714348,
0.1481594741344452,
-0.07823261618614197,
0.007707078941166401,
-0.07716809213161469,
0.09082259237766266,
-0.131085604429245,
0.06776636093854904,
-0.05415451154112816,
0.027218900620937347,
-0.060895659029483795,
-0.028847239911556244,
-0.06242755427956581,
0.0593748539686203,
-0.09377464652061462,
-0.026408923789858818,
-0.0012886314652860165,
-0.025648076087236404,
-0.11600160598754883,
-0.028948497027158737,
-0.025015978142619133,
0.008327526040375233,
0.12505078315734863,
0.07746497541666031,
-0.10801319777965546,
0.09220395237207413,
-0.1475486159324646,
-0.2531208395957947,
0.15262539684772491,
0.045347776263952255,
-0.018229372799396515,
0.0615205317735672,
0.05348142609000206,
0.07800742238759995,
-0.0409693568944931,
-0.01861322671175003,
0.01024825219064951,
-0.04761109501123428,
-0.02891220524907112,
-0.24515365064144135,
0.02249797247350216,
-0.025752337649464607,
0.0032101362012326717,
0.07911625504493713,
0.11435563862323761,
0.12096795439720154,
-0.05502966791391373,
-0.013237512670457363,
-0.0393531508743763,
-0.004375155549496412,
-0.005227414425462484,
-0.11123646050691605,
-0.008093427866697311,
-0.03621121868491173,
0.02180069126188755,
-0.0800333097577095,
0.16336625814437866,
-0.028501184657216072,
-0.09591451287269592,
0.021376390010118484,
0.20754417777061462,
-0.12202168256044388,
0.07089513540267944,
0.19184008240699768,
0.08782284706830978,
-0.009901578538119793,
0.05308016762137413,
0.046803805977106094,
0.09763600677251816,
0.19036492705345154,
0.005372023209929466,
0.23732484877109528,
0.0327802449464798,
0.11760777980089188,
0.0694991871714592,
-0.02002793736755848,
0.0602288693189621,
0.08744221180677414,
-0.09637675434350967,
0.10965149849653244,
0.010306998156011105,
0.005965168587863445,
0.13880231976509094,
0.03438246622681618,
0.003800951177254319,
0.017264699563384056,
-0.008929767645895481,
-0.08553645759820938,
-0.21005688607692719,
-0.10288126766681671,
-0.14294451475143433,
0.04979744181036949,
-0.030871408060193062,
-0.08018810302019119,
0.20770083367824554,
0.08222769945859909,
-0.045243918895721436,
0.047605887055397034,
-0.07515936344861984,
-0.05295809358358383,
0.16428275406360626,
-0.006842600181698799,
-0.11036784946918488,
0.14947271347045898,
-0.042599666863679886,
0.06665370613336563,
0.05593976750969887,
-0.017891237512230873,
-0.00044942207750864327,
0.07199283689260483,
0.16404055058956146,
-0.03301185742020607,
-0.039323315024375916,
-0.03438938036561012,
0.02285214513540268,
-0.03997766971588135,
0.0005470202886499465,
-0.013740196824073792,
0.032855257391929626,
0.04659022390842438,
0.22916188836097717,
-0.10574419796466827,
-0.08196371793746948,
-0.11251561343669891,
-0.043311476707458496,
0.0018124858615919948,
0.05931038782000542,
-0.04786229506134987,
-0.036041684448719025,
-0.020729869604110718,
0.2163572460412979,
0.22642937302589417,
-0.1771082729101181,
0.002580474130809307,
0.0789632648229599,
0.004705097060650587,
-0.03799835965037346,
0.025455355644226074,
0.17103268206119537,
0.09758871048688889,
-0.06240743026137352,
-0.13137321174144745,
-0.06352594494819641,
-0.05739686265587807,
-0.11966175585985184,
0.019985603168606758,
0.055891621857881546,
-0.07979697734117508,
-0.13418938219547272,
0.11916472762823105,
-0.17306894063949585,
-0.021495020017027855,
0.07615558803081512,
-0.06930844485759735,
-0.11549129337072372,
-0.05371662229299545,
0.06984230875968933,
0.09532063454389572,
0.002539136214181781,
-0.0790344700217247,
0.04748440906405449,
0.022474346682429314,
0.024047769606113434,
-0.15056867897510529,
-0.045357778668403625,
-0.06652650982141495,
-0.02909630350768566,
0.12221219390630722,
0.04384328052401543,
-0.005198295693844557,
0.043683573603630066,
0.03050115704536438,
-0.0656513050198555,
0.018895279616117477,
-0.05026441812515259,
-0.11305084079504013,
0.024537725374102592,
0.062247321009635925,
-0.03220849111676216,
-0.0050835381262004375,
0.054083872586488724,
-0.16348256170749664,
-0.02630307711660862,
0.059384264051914215,
0.01124292891472578,
-0.12094633281230927,
0.014982982538640499,
-0.062175750732421875,
0.0513276681303978,
0.040476202964782715,
0.0040135979652404785,
-0.0007945335819385946,
-0.05650101602077484,
-0.031069207936525345,
0.009213238023221493,
-0.12004505842924118,
0.011820893734693527,
-0.02335837297141552,
-0.03263574466109276,
-0.23869328200817108,
0.06119728460907936,
0.022527044638991356,
-0.011665504425764084,
-0.03580925986170769,
-0.041841309517621994,
-0.08339565247297287,
0.07821428030729294,
0.11546347290277481,
-0.03931960090994835,
0.012047328986227512,
0.021741287782788277,
0.06868306547403336,
-0.005657196510583162,
-0.0248735249042511,
-0.1118214800953865
] |
null | null |
transformers
|
# resnext101_32x8d
Implementation of ResNetXt proposed in [\"Aggregated Residual
Transformation for Deep Neural
Networks\"](https://arxiv.org/pdf/1611.05431.pdf)
Create a default model
``` python
ResNetXt.resnext50_32x4d()
ResNetXt.resnext101_32x8d()
# create a resnetxt18_32x4d
ResNetXt.resnet18(block=ResNetXtBottleNeckBlock, groups=32, base_width=4)
```
Examples:
: ``` python
# change activation
ResNetXt.resnext50_32x4d(activation = nn.SELU)
# change number of classes (default is 1000 )
ResNetXt.resnext50_32x4d(n_classes=100)
# pass a different block
ResNetXt.resnext50_32x4d(block=SENetBasicBlock)
# change the initial convolution
model = ResNetXt.resnext50_32x4d()
model.encoder.gate.conv1 = nn.Conv2d(3, 64, kernel_size=3)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = ResNetXt.resnext50_32x4d()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
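As a side note, the `32x8d` suffix encodes the cardinality (32 groups) and the base width (8 channels per group) of the grouped 3x3 convolutions inside each bottleneck. A small, library-independent illustration in plain PyTorch (the 256-channel width is the usual first-stage example, not something taken from this card):
``` python
from torch import nn

# Grouped 3x3 convolution as used by ResNeXt: 32 groups with base width 8
# gives 32 * 8 = 256 channels in the first bottleneck stage.
width = 32 * 8
grouped = nn.Conv2d(width, width, kernel_size=3, padding=1, groups=32, bias=False)
dense = nn.Conv2d(width, width, kernel_size=3, padding=1, bias=False)

# The grouped version uses far fewer weights for the same width.
print(sum(p.numel() for p in grouped.parameters()))  # 256 * 8 * 3 * 3 = 18432
print(sum(p.numel() for p in dense.parameters()))    # 256 * 256 * 3 * 3 = 589824
```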
|
{}
| null |
glasses/resnext101_32x8d
|
[
"transformers",
"pytorch",
"arxiv:1611.05431",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1611.05431"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1611.05431 #endpoints_compatible #region-us
|
# resnext101_32x8d
Implementation of ResNetXt proposed in \"Aggregated Residual
Transformation for Deep Neural
Networks\"
Create a default model
Examples:
:
|
[
"# resnext101_32x8d\nImplementation of ResNetXt proposed in \\\"Aggregated Residual\nTransformation for Deep Neural\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:\n\n :"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1611.05431 #endpoints_compatible #region-us \n",
"# resnext101_32x8d\nImplementation of ResNetXt proposed in \\\"Aggregated Residual\nTransformation for Deep Neural\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:\n\n :"
] |
[
30,
47
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1611.05431 #endpoints_compatible #region-us \n# resnext101_32x8d\nImplementation of ResNetXt proposed in \\\"Aggregated Residual\nTransformation for Deep Neural\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:\n\n :"
] |
[
-0.07091588526964188,
-0.021861789748072624,
-0.0020389158744364977,
0.01925642415881157,
0.10363481193780899,
0.06276014447212219,
0.04625668004155159,
0.0510755218565464,
-0.14151513576507568,
-0.0033786362037062645,
0.14526627957820892,
0.1509125977754593,
-0.026772107928991318,
0.019909432157874107,
-0.03351235017180443,
-0.2233579009771347,
0.08534117043018341,
0.0992112010717392,
-0.10442908108234406,
0.062072623521089554,
0.07937769591808319,
-0.07434696704149246,
0.07663077861070633,
0.02051587402820587,
-0.187839537858963,
-0.019752278923988342,
0.009136291220784187,
-0.01643524132668972,
0.10128460079431534,
0.08737519383430481,
0.20991285145282745,
-0.00743893114849925,
0.034441858530044556,
-0.09595953673124313,
0.0413888543844223,
0.04434641823172569,
-0.036468181759119034,
0.11022230982780457,
0.0758637934923172,
0.013909611850976944,
0.08723114430904388,
0.0420437827706337,
-0.07304245233535767,
-0.0012757788645103574,
-0.09053950011730194,
-0.07489115744829178,
-0.030967438593506813,
0.14129167795181274,
0.015768425539135933,
0.10992246866226196,
-0.02489747293293476,
0.16804605722427368,
-0.09657386690378189,
0.09374203532934189,
0.14030736684799194,
-0.31717777252197266,
-0.024087004363536835,
0.0946715772151947,
0.0006429008208215237,
-0.044258590787649155,
0.023903770372271538,
0.008986352942883968,
0.07507380843162537,
0.028062717989087105,
-0.1045738160610199,
-0.019105952233076096,
0.0001986958523048088,
0.007589605171233416,
-0.1718405932188034,
0.0627962052822113,
0.015213213860988617,
0.03712356463074684,
0.05840024724602699,
0.0066185989417135715,
-0.18297460675239563,
-0.03797651082277298,
-0.055998578667640686,
-0.04335111007094383,
-0.04164251685142517,
0.021852238103747368,
0.01768822781741619,
-0.039194680750370026,
-0.04900712892413139,
-0.019824467599391937,
-0.14411857724189758,
0.13970056176185608,
0.09759748727083206,
0.05510660633444786,
-0.10425484925508499,
0.02129253000020981,
-0.0034801894798874855,
-0.0016689648618921638,
0.06760803610086441,
-0.13997912406921387,
-0.1294236183166504,
-0.04561774432659149,
-0.12433447688817978,
-0.13223733007907867,
0.07452894002199173,
0.13104552030563354,
-0.06905820965766907,
-0.02736430987715721,
0.06845621764659882,
0.08115073293447495,
-0.06499993056058884,
-0.0265751201659441,
-0.293055921792984,
0.10857681930065155,
0.027859261259436607,
-0.09840775281190872,
0.005760591011494398,
0.01217885036021471,
-0.11063116788864136,
-0.0033679185435175896,
0.0433313213288784,
0.00636200699955225,
-0.03239097073674202,
0.0806736871600151,
-0.04286038503050804,
-0.1166810393333435,
0.09595565497875214,
-0.004434344824403524,
-0.060238320380449295,
-0.009674740023911,
-0.032385941594839096,
0.12296376377344131,
0.005630023777484894,
0.011798858642578125,
-0.07299452275037766,
0.010146232321858406,
-0.07757004350423813,
-0.04693109542131424,
-0.06025707721710205,
-0.1274963617324829,
-0.00016553039313293993,
0.03357645124197006,
0.0848812386393547,
-0.1949881613254547,
-0.12484538555145264,
0.02542734518647194,
0.08315915614366531,
-0.03333938121795654,
-0.007622020319104195,
-0.025673501193523407,
-0.06626272946596146,
-0.04306071251630783,
0.005612772423774004,
0.034726276993751526,
-0.046233437955379486,
0.0011938302777707577,
-0.025617899373173714,
0.06289131194353104,
-0.09931842237710953,
0.011962383054196835,
-0.05838480591773987,
-0.03289753198623657,
-0.06046181544661522,
0.0758851170539856,
-0.02885361574590206,
0.09879173338413239,
-0.13189548254013062,
-0.0786481574177742,
-0.07230061292648315,
0.01242954470217228,
0.06308992207050323,
0.0693306177854538,
-0.10584709793329239,
-0.06863832473754883,
0.19229823350906372,
-0.09774142503738403,
-0.1190061941742897,
0.1597042977809906,
-0.027593672275543213,
-0.06724949926137924,
0.04515111446380615,
0.12799891829490662,
0.06387143582105637,
0.01385509967803955,
-0.017678391188383102,
-0.039854299277067184,
-0.15577030181884766,
-0.11508117616176605,
-0.06095156818628311,
0.17069754004478455,
-0.04005389288067818,
-0.04831021651625633,
0.04531223326921463,
0.11437059193849564,
-0.1372361183166504,
-0.05979624018073082,
-0.03513631969690323,
-0.13734856247901917,
0.08814339339733124,
0.019033290445804596,
0.1130029633641243,
0.07556534558534622,
-0.06742272526025772,
-0.10242721438407898,
0.06436055153608322,
-0.044326942414045334,
0.0683840811252594,
-0.12110000103712082,
0.08379809558391571,
-0.1006120890378952,
0.04779423773288727,
-0.24228793382644653,
-0.11400459706783295,
0.01210560742765665,
-0.002524559386074543,
0.03677605837583542,
0.22868521511554718,
0.03567693382501602,
-0.0433400459587574,
-0.05643059313297272,
-0.034347329288721085,
0.1019544005393982,
0.004341010469943285,
-0.049747731536626816,
-0.13239526748657227,
0.01668541133403778,
-0.09428401291370392,
-0.08032651990652084,
-0.17089806497097015,
-0.03556564077734947,
-0.0552070215344429,
0.15802761912345886,
-0.018184378743171692,
0.06559212505817413,
0.031146053224802017,
0.030553340911865234,
-0.04282749444246292,
-0.024452319368720055,
0.040335506200790405,
-0.0270315520465374,
-0.14228205382823944,
0.0696725994348526,
0.009649425745010376,
0.16525040566921234,
0.09355815500020981,
-0.15499475598335266,
-0.08217929303646088,
0.11087506264448166,
-0.0716577023267746,
0.027179600670933723,
0.044277872890233994,
0.09470731019973755,
0.19322341680526733,
0.015502563677728176,
0.0647260844707489,
-0.02915874496102333,
0.07189812511205673,
0.004624912980943918,
-0.03123195469379425,
0.05765216797590256,
0.07611978054046631,
0.07394247502088547,
-0.33135294914245605,
0.05722084641456604,
-0.02868383564054966,
-0.0956726223230362,
0.01257767342031002,
-0.02230004593729973,
-0.054577574133872986,
0.02980816178023815,
0.04525025561451912,
0.016895931214094162,
0.004466908983886242,
0.08626929670572281,
0.02862529642879963,
0.05954597517848015,
-0.08998528867959976,
0.0675845816731453,
-0.07277734577655792,
0.002479143440723419,
-0.01222055871039629,
0.011565342545509338,
-0.02228648029267788,
0.09033093601465225,
0.008839395828545094,
0.06376849114894867,
-0.023516574874520302,
-0.15060096979141235,
0.02431020885705948,
-0.004649006295949221,
-0.0447155125439167,
0.18990696966648102,
-0.07682838290929794,
-0.19709686934947968,
-0.06492920964956284,
-0.05876634642481804,
-0.016767382621765137,
0.004053697921335697,
0.00416930578649044,
-0.13416838645935059,
-0.06188959628343582,
0.05284923315048218,
0.10008519142866135,
0.06392291933298111,
0.08124114573001862,
-0.02677847631275654,
0.022404415532946587,
-0.0020467888098210096,
-0.03186442330479622,
0.03293108940124512,
-0.18656162917613983,
-0.05069994553923607,
0.03788575157523155,
-0.11433415114879608,
0.04742071032524109,
0.17142057418823242,
0.041573893278837204,
-0.011885765008628368,
-0.0033014435321092606,
0.1784237027168274,
-0.0671287402510643,
0.04448891431093216,
0.09439071267843246,
-0.01782459393143654,
0.04415198788046837,
0.13192549347877502,
0.0300005991011858,
-0.0635460838675499,
0.028491126373410225,
-0.004079279024153948,
-0.014194388873875141,
-0.23718991875648499,
-0.16187182068824768,
-0.09736859053373337,
-0.05645633488893509,
0.06013726443052292,
0.03174980357289314,
0.014254874549806118,
0.122378870844841,
0.011655159294605255,
-0.01905512809753418,
-0.09396947175264359,
0.06000620126724243,
0.16694538295269012,
0.04382925108075142,
0.1141316294670105,
-0.08453506231307983,
-0.12601755559444427,
0.05115868151187897,
0.009807690978050232,
0.2712997496128082,
-0.017640408128499985,
0.03693588823080063,
0.05918767303228378,
0.14921560883522034,
0.09865137189626694,
0.15946397185325623,
-0.010113714262843132,
-0.08086325228214264,
-0.0412728413939476,
0.01651855558156967,
-0.020080361515283585,
0.09757069498300552,
0.0734366923570633,
-0.0852673277258873,
0.04777982831001282,
0.0752234160900116,
0.03189743310213089,
0.169599249958992,
0.14364571869373322,
-0.34090790152549744,
-0.08368270099163055,
-0.08073180168867111,
-0.042736537754535675,
0.014052211306989193,
0.06038450449705124,
0.03258105367422104,
0.00852136965841055,
0.07388432323932648,
-0.0994773656129837,
0.10102195292711258,
-0.03806227073073387,
0.05931030213832855,
-0.026071202009916306,
0.06443502008914948,
0.026134466752409935,
0.04533466696739197,
-0.18882633745670319,
0.2708919644355774,
-0.010233364067971706,
-0.050238825380802155,
-0.006832757033407688,
-0.018507448956370354,
0.02320803515613079,
0.1179761067032814,
0.13551156222820282,
-0.0002972234506160021,
-0.011367592960596085,
0.05819806456565857,
-0.014799586497247219,
0.01921233907341957,
0.05759956315159798,
0.08522230386734009,
0.008496634662151337,
0.03760320320725441,
-0.0571524016559124,
0.04573217034339905,
0.04954656586050987,
-0.03213389590382576,
-0.1029847115278244,
0.02595309540629387,
0.09856832772493362,
-0.09030485153198242,
-0.014804732985794544,
-0.060319527983665466,
-0.016610439866781235,
0.18272241950035095,
0.069498710334301,
-0.031537823379039764,
-0.0913858488202095,
0.0660247802734375,
0.19302691519260406,
-0.09433207660913467,
0.07774672657251358,
-0.03811906650662422,
0.006939969025552273,
-0.024221453815698624,
-0.16913814842700958,
0.16452965140342712,
-0.15020698308944702,
-0.01886151172220707,
-0.024400917813181877,
0.007382584270089865,
-0.003000493161380291,
0.024007849395275116,
0.020225703716278076,
-0.009095742367208004,
-0.14118455350399017,
-0.06165558472275734,
-0.06312257051467896,
-0.03563974052667618,
-0.052516743540763855,
0.08328662812709808,
-0.10101054608821869,
0.06510807573795319,
-0.05805647373199463,
0.06831087917089462,
0.16758914291858673,
-0.03211486339569092,
-0.06288154423236847,
-0.012318122200667858,
0.12197959423065186,
0.012124376371502876,
-0.23125097155570984,
-0.09288844466209412,
0.033831074833869934,
0.0023447528947144747,
-0.05304276943206787,
-0.15765129029750824,
0.1775110960006714,
-0.04347265139222145,
0.02434217557311058,
-0.014707493595778942,
-0.04523087292909622,
-0.06340467929840088,
0.21590878069400787,
0.03079107403755188,
0.26966413855552673,
-0.0283226165920496,
0.009640495292842388,
-0.028412893414497375,
0.11146306246519089,
0.09864714741706848,
0.014668717980384827,
0.11599830538034439,
0.033636242151260376,
0.1347091645002365,
0.010326473042368889,
-0.03764627128839493,
-0.03446834161877632,
0.10673630237579346,
0.012507918290793896,
-0.02923492155969143,
0.03873352333903313,
0.07023104280233383,
0.025871267542243004,
-0.022344378754496574,
0.1632072776556015,
0.05794413387775421,
-0.0037900309544056654,
-0.03493442386388779,
-0.08612418174743652,
-0.0023618838749825954,
0.09151844680309296,
-0.04571111872792244,
-0.10063962638378143,
0.09736054390668869,
0.12299153953790665,
0.031034834682941437,
0.2071368396282196,
0.0485854372382164,
0.09092864394187927,
0.040344592183828354,
0.06868461519479752,
-0.09461471438407898,
-0.05041149631142616,
-0.023133227601647377,
-0.03086705319583416,
0.09019085019826889,
-0.09796565026044846,
0.020103320479393005,
0.1549282968044281,
-0.05697586014866829,
-0.011304867453873158,
0.0962548777461052,
-0.032567333430051804,
-0.045148689299821854,
0.09624819457530975,
-0.10519494116306305,
-0.13995936512947083,
0.0013092657318338752,
-0.003065998200327158,
0.011526450514793396,
0.18500258028507233,
0.1666557788848877,
-0.06604103744029999,
-0.009495622478425503,
0.03126389533281326,
0.01894795149564743,
-0.0845988467335701,
0.11242838203907013,
0.10622421652078629,
0.04551191255450249,
-0.1265576034784317,
0.04001181945204735,
0.009784710593521595,
-0.10747067630290985,
-0.03648357838392258,
-0.044859517365694046,
-0.11835463345050812,
-0.09943939745426178,
-0.015996865928173065,
0.016809845343232155,
-0.19808988273143768,
-0.057490620762109756,
-0.11476226896047592,
-0.062141213566064835,
0.08470125496387482,
0.08643700182437897,
0.10691922903060913,
0.02192274108529091,
-0.09963209927082062,
-0.05711769312620163,
-0.09996558725833893,
0.0048740156926214695,
0.0748552456498146,
0.06664692610502243,
-0.177016481757164,
-0.060391321778297424,
-0.03660980984568596,
0.11491978913545609,
-0.09682505577802658,
0.05074463412165642,
-0.10691322386264801,
0.06567665934562683,
-0.07246532291173935,
-0.031021762639284134,
-0.091962531208992,
-0.033659741282463074,
-0.049504756927490234,
-0.08170385658740997,
-0.08085623383522034,
0.0376882441341877,
-0.10169323533773422,
0.0584867037832737,
0.04741412028670311,
-0.030011653900146484,
-0.04404429718852043,
-0.008273512125015259,
-0.04119244962930679,
0.005890883971005678,
0.12241422384977341,
0.03691859915852547,
-0.10242264717817307,
0.12241680175065994,
-0.08709277957677841,
-0.13185691833496094,
0.09963130950927734,
0.07456469535827637,
0.07841205596923828,
-0.01740342192351818,
0.053659357130527496,
0.02808239683508873,
0.0344063900411129,
-0.02158135175704956,
-0.07751704752445221,
-0.003284909762442112,
0.0442199744284153,
-0.1027224138379097,
0.003279268043115735,
-0.032099924981594086,
-0.03748318925499916,
-0.013465373776853085,
0.12397301197052002,
0.09340906143188477,
-0.02728894352912903,
0.04723941162228584,
0.01852600835263729,
0.032511454075574875,
-0.017377778887748718,
-0.08354467898607254,
0.10014235973358154,
-0.08967195451259613,
0.04094752296805382,
-0.042805567383766174,
0.2264070361852646,
-0.11562834680080414,
-0.016945786774158478,
0.013185447081923485,
0.11559386551380157,
-0.11007118225097656,
0.05571373179554939,
0.15142317116260529,
0.09731332212686539,
-0.013889414258301258,
-0.021628176793456078,
0.05696322023868561,
0.07332649827003479,
0.17244724929332733,
0.12916024029254913,
0.12741298973560333,
0.08399268984794617,
0.07210556417703629,
0.01239754818379879,
-0.030327381566166878,
0.052839238196611404,
0.012650413438677788,
-0.11267579346895218,
0.04242921620607376,
0.05191785469651222,
0.034530576318502426,
0.07469098269939423,
0.07824434340000153,
0.020412743091583252,
0.0541028156876564,
-0.05076953023672104,
-0.09938782453536987,
-0.10026517510414124,
-0.09108763188123703,
-0.1459307223558426,
-0.01862216740846634,
-0.07759897410869598,
-0.1299889087677002,
0.09741812944412231,
0.09885527938604355,
-0.02277541533112526,
0.14686767756938934,
0.10247679054737091,
-0.034990787506103516,
0.07541225105524063,
-0.004382286686450243,
-0.058363307267427444,
0.1463962346315384,
-0.07290419191122055,
-0.046681102365255356,
0.05648307502269745,
-0.03405802324414253,
-0.016676099970936775,
-0.043243128806352615,
0.13291187584400177,
-0.024150656536221504,
-0.039259638637304306,
-0.0327884815633297,
0.045538973063230515,
-0.0325472429394722,
-0.035857073962688446,
-0.03502305597066879,
-0.023462500423192978,
-0.011175566352903843,
0.17079226672649384,
-0.06343838572502136,
-0.07330755889415741,
-0.1614687144756317,
0.11910783499479294,
0.02389351651072502,
0.08255745470523834,
-0.04403451830148697,
0.025520924478769302,
-0.046634282916784286,
0.23666852712631226,
0.3082476556301117,
-0.1707984358072281,
-0.005569417495280504,
0.09733808040618896,
0.0037751176860183477,
-0.031860318034887314,
0.05461672320961952,
0.10959798097610474,
0.12224546074867249,
-0.02614760957658291,
-0.1371787041425705,
-0.05330216884613037,
-0.011373928748071194,
-0.06618504971265793,
-0.046811457723379135,
0.12055567651987076,
-0.02937839739024639,
-0.05290904641151428,
0.14534342288970947,
-0.2585947811603546,
-0.04244862496852875,
-0.0599602609872818,
-0.12191285192966461,
-0.12610140442848206,
-0.11116103082895279,
0.07372806966304779,
0.05149540305137634,
0.12695513665676117,
-0.06624762713909149,
-0.06363704800605774,
0.08353608846664429,
0.01984443888068199,
-0.1353655755519867,
-0.056056421250104904,
0.034481510519981384,
0.03550935536623001,
0.01367043238133192,
-0.0012013603700324893,
-0.026965096592903137,
0.11634605377912521,
-0.03123658150434494,
-0.0561358667910099,
0.01819164678454399,
-0.015079034492373466,
-0.1632409542798996,
0.03966117650270462,
0.015912827104330063,
-0.02174057811498642,
0.01624170131981373,
0.019473766908049583,
-0.3014852702617645,
0.0027345134876668453,
0.012014655396342278,
0.011507445015013218,
-0.0536024309694767,
0.02630520984530449,
-0.08903083950281143,
0.08672542124986649,
0.10343264043331146,
-0.030398786067962646,
0.03363719955086708,
-0.027956334874033928,
0.015246868133544922,
0.06361550837755203,
0.07160511612892151,
-0.035278648138046265,
-0.17067617177963257,
-0.043215446174144745,
-0.12658771872520447,
0.0281120166182518,
-0.0732063353061676,
-0.013872319832444191,
-0.04945453628897667,
0.015126500278711319,
-0.08930530399084091,
0.0901174396276474,
0.06559940427541733,
-0.016824059188365936,
0.017054947093129158,
0.025073489174246788,
-0.026461925357580185,
-0.03946845978498459,
-0.1268736571073532,
-0.12259917706251144
] |
null | null |
transformers
|
# resnext50_32x4d
Implementation of ResNetXt proposed in [\"Aggregated Residual
Transformation for Deep Neural
Networks\"](https://arxiv.org/pdf/1611.05431.pdf)
Create a default model
``` python
ResNetXt.resnext50_32x4d()
ResNetXt.resnext101_32x8d()
# create a resnetxt18_32x4d
ResNetXt.resnet18(block=ResNetXtBottleNeckBlock, groups=32, base_width=4)
```
Examples:
: ``` python
# change activation
ResNetXt.resnext50_32x4d(activation = nn.SELU)
# change number of classes (default is 1000 )
ResNetXt.resnext50_32x4d(n_classes=100)
# pass a different block
ResNetXt.resnext50_32x4d(block=SENetBasicBlock)
# change the initial convolution
model = ResNetXt.resnext50_32x4d()
model.encoder.gate.conv1 = nn.Conv2d(3, 64, kernel_size=3)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = ResNetXt.resnext50_32x4d()
# first call .features; this will activate the forward hooks and tell the model you'd like to get the features
model.encoder.features
model(torch.randn((1,3,224,224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
#[torch.Size([1, 64, 112, 112]), torch.Size([1, 64, 56, 56]), torch.Size([1, 128, 28, 28]), torch.Size([1, 256, 14, 14])]
```
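A further (hypothetical) sketch: adapting the stem to 1-channel (grayscale) input by swapping the initial convolution, following the `.encoder.gate.conv1` path shown above; the 7x7 / stride-2 / padding-3 shape is the standard ResNet stem and is assumed here rather than taken from this card:
``` python
import torch
from torch import nn

# Replace the 3-channel stem conv with a 1-channel one for grayscale images.
model = ResNetXt.resnext50_32x4d()
model.encoder.gate.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)

out = model(torch.randn(1, 1, 224, 224))
print(out.shape)  # e.g. torch.Size([1, 1000]) with the default head
```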
|
{}
| null |
glasses/resnext50_32x4d
|
[
"transformers",
"pytorch",
"arxiv:1611.05431",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1611.05431"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1611.05431 #endpoints_compatible #region-us
|
# resnext50_32x4d
Implementation of ResNetXt proposed in \"Aggregated Residual
Transformation for Deep Neural
Networks\"
Create a default model
Examples:
:
|
[
"# resnext50_32x4d\nImplementation of ResNetXt proposed in \\\"Aggregated Residual\nTransformation for Deep Neural\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:\n\n :"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1611.05431 #endpoints_compatible #region-us \n",
"# resnext50_32x4d\nImplementation of ResNetXt proposed in \\\"Aggregated Residual\nTransformation for Deep Neural\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:\n\n :"
] |
[
30,
47
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1611.05431 #endpoints_compatible #region-us \n# resnext50_32x4d\nImplementation of ResNetXt proposed in \\\"Aggregated Residual\nTransformation for Deep Neural\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:\n\n :"
] |
[
-0.07260572910308838,
-0.020825961604714394,
-0.002030271105468273,
0.01994342729449272,
0.10591845214366913,
0.06416765600442886,
0.04615098237991333,
0.05094854161143303,
-0.14691892266273499,
-0.0031327693723142147,
0.14592565596103668,
0.1509939730167389,
-0.026742950081825256,
0.023458750918507576,
-0.03246388956904411,
-0.2219715118408203,
0.08047596365213394,
0.09917470812797546,
-0.11186278611421585,
0.06214424967765808,
0.07668782770633698,
-0.0752187967300415,
0.07947774231433868,
0.017584312707185745,
-0.1945660263299942,
-0.015225519426167011,
0.00989625882357359,
-0.015958135947585106,
0.10095725953578949,
0.08975531905889511,
0.20776325464248657,
-0.010514948517084122,
0.03467601537704468,
-0.0921030342578888,
0.04267575591802597,
0.04340536892414093,
-0.03732307627797127,
0.11098181456327438,
0.07514708489179611,
0.012824004516005516,
0.08572917431592941,
0.03681587055325508,
-0.07750602066516876,
-0.0022740315180271864,
-0.08938111364841461,
-0.07642094045877457,
-0.029886240139603615,
0.13921596109867096,
0.01389920711517334,
0.11136376112699509,
-0.024653330445289612,
0.16636794805526733,
-0.09603547304868698,
0.09471316635608673,
0.1340147703886032,
-0.3219648599624634,
-0.02469811402261257,
0.09718810021877289,
-0.002574996557086706,
-0.03902371972799301,
0.0234526339918375,
0.013330773450434208,
0.0748634785413742,
0.030357837677001953,
-0.10403497517108917,
-0.0195327065885067,
0.007243954110890627,
0.010682694613933563,
-0.16994287073612213,
0.06418415904045105,
0.019887372851371765,
0.03812575340270996,
0.058615218847990036,
0.006152060814201832,
-0.18181559443473816,
-0.03250798583030701,
-0.05804220214486122,
-0.03939088061451912,
-0.040815677493810654,
0.022144095972180367,
0.008627273142337799,
-0.038174550980329514,
-0.048499904572963715,
-0.01840924099087715,
-0.14406627416610718,
0.1367584615945816,
0.0973123237490654,
0.057101283222436905,
-0.10414774715900421,
0.018126240000128746,
-0.003830785397440195,
-0.00023956822406034917,
0.06917811930179596,
-0.14336369931697845,
-0.12968868017196655,
-0.04385775327682495,
-0.1268126368522644,
-0.12507112324237823,
0.07512766122817993,
0.12757673859596252,
-0.07272646576166153,
-0.025889568030834198,
0.0686795711517334,
0.07991039007902145,
-0.0684647336602211,
-0.032562486827373505,
-0.29471859335899353,
0.10212206095457077,
0.028164975345134735,
-0.10125720500946045,
0.007036672905087471,
0.014105921611189842,
-0.11054997146129608,
-0.0018371546175330877,
0.03542502224445343,
0.007369684521108866,
-0.033852893859148026,
0.08144792914390564,
-0.04076038673520088,
-0.11382463574409485,
0.09128803759813309,
-0.0050963652320206165,
-0.059991683810949326,
-0.008435714058578014,
-0.029398024082183838,
0.12208130955696106,
0.004733072593808174,
0.008949370123445988,
-0.07290519028902054,
0.005513044539839029,
-0.07516393810510635,
-0.04813574254512787,
-0.06067580357193947,
-0.12816134095191956,
-0.003244483843445778,
0.042238179594278336,
0.08634981513023376,
-0.1951279640197754,
-0.12226611375808716,
0.02669002115726471,
0.08332954347133636,
-0.03401656076312065,
-0.010237216018140316,
-0.02367691695690155,
-0.06798165291547775,
-0.042519401758909225,
0.006615541409701109,
0.03623366355895996,
-0.04563581198453903,
-0.0003321183321531862,
-0.020437192171812057,
0.06500936299562454,
-0.09773880988359451,
0.010805688798427582,
-0.05547179654240608,
-0.03560706973075867,
-0.06174825131893158,
0.08005352318286896,
-0.02636301890015602,
0.09517434984445572,
-0.13054217398166656,
-0.08182616531848907,
-0.0744333416223526,
0.012270130217075348,
0.06164262816309929,
0.07156367599964142,
-0.1066286563873291,
-0.07264658063650131,
0.19684772193431854,
-0.09521797299385071,
-0.11668038368225098,
0.15781843662261963,
-0.027089575305581093,
-0.07176098972558975,
0.04744725674390793,
0.12284448742866516,
0.059796061366796494,
0.011579057201743126,
-0.016398269683122635,
-0.0356745719909668,
-0.1568428874015808,
-0.11576785892248154,
-0.06302903592586517,
0.17219990491867065,
-0.0445929579436779,
-0.04816571623086929,
0.043163005262613297,
0.1159096211194992,
-0.13728517293930054,
-0.05993641912937164,
-0.03548432141542435,
-0.13732632994651794,
0.08773494511842728,
0.017675355076789856,
0.11358301341533661,
0.07417435199022293,
-0.06667649000883102,
-0.10307329893112183,
0.06461212038993835,
-0.04741307348012924,
0.06784749031066895,
-0.12221533805131912,
0.0848604068160057,
-0.10374151915311813,
0.04956576228141785,
-0.24274559319019318,
-0.11675915867090225,
0.011184423230588436,
0.003607151797041297,
0.03783605247735977,
0.2299167960882187,
0.0386819913983345,
-0.039804257452487946,
-0.05974241718649864,
-0.03679424896836281,
0.10076940804719925,
0.002249735174700618,
-0.049645110964775085,
-0.1344616562128067,
0.01762486808001995,
-0.0935489684343338,
-0.07913265377283096,
-0.17465835809707642,
-0.03763924911618233,
-0.052954960614442825,
0.16023534536361694,
-0.017273372039198875,
0.06608381867408752,
0.031590744853019714,
0.030072195455431938,
-0.0406646691262722,
-0.024412186816334724,
0.04138913378119469,
-0.027077380567789078,
-0.14072762429714203,
0.06978095322847366,
0.011336230672895908,
0.16798241436481476,
0.09817293286323547,
-0.15614007413387299,
-0.08125731348991394,
0.11260856688022614,
-0.07155639678239822,
0.02754090540111065,
0.04321180656552315,
0.09409023821353912,
0.19063542783260345,
0.013705732300877571,
0.06481553614139557,
-0.027767004445195198,
0.07044675201177597,
0.0037662475369870663,
-0.027729395776987076,
0.05936053395271301,
0.07580072432756424,
0.07260105013847351,
-0.32895228266716003,
0.0587419830262661,
-0.02620980143547058,
-0.09487733244895935,
0.01308174803853035,
-0.02117583528161049,
-0.05619530752301216,
0.029738442972302437,
0.04314224049448967,
0.01585807092487812,
0.005368561949580908,
0.09206297993659973,
0.028626153245568275,
0.060121625661849976,
-0.0925782099366188,
0.06590898334980011,
-0.07113371044397354,
0.004450914449989796,
-0.014117058366537094,
0.010541338473558426,
-0.024999506771564484,
0.0908065065741539,
0.008447169326245785,
0.0630335658788681,
-0.022404443472623825,
-0.14816677570343018,
0.028093388304114342,
-0.0046655163168907166,
-0.043440740555524826,
0.18690668046474457,
-0.07951762527227402,
-0.20023606717586517,
-0.06381484866142273,
-0.058786146342754364,
-0.01749478094279766,
0.004745653364807367,
0.002156079513952136,
-0.14028789103031158,
-0.06335638463497162,
0.056060485541820526,
0.09958673268556595,
0.06161215156316757,
0.08322608470916748,
-0.026585102081298828,
0.023463664576411247,
-0.00015828473260626197,
-0.028363097459077835,
0.035592030733823776,
-0.18555518984794617,
-0.048842862248420715,
0.03912591189146042,
-0.11361341178417206,
0.04693334922194481,
0.164808452129364,
0.04189655929803848,
-0.009948564693331718,
-0.0030518241692334414,
0.18676356971263885,
-0.06777679920196533,
0.0423206090927124,
0.0939096137881279,
-0.014189327135682106,
0.042696159332990646,
0.1306641697883606,
0.030745109543204308,
-0.06446076184511185,
0.027210291475057602,
-0.002433964516967535,
-0.013016353361308575,
-0.23463785648345947,
-0.15985572338104248,
-0.10020846873521805,
-0.0594438873231411,
0.05965593084692955,
0.030118687078356743,
0.008076794445514679,
0.12071394175291061,
0.010861673392355442,
-0.015252534300088882,
-0.094261035323143,
0.06013265624642372,
0.1691080778837204,
0.04287894442677498,
0.11590413749217987,
-0.08549248427152634,
-0.12900413572788239,
0.05397505313158035,
0.010593467392027378,
0.2706092298030853,
-0.014790870249271393,
0.03385322540998459,
0.05700051784515381,
0.1521918922662735,
0.0999840646982193,
0.15706303715705872,
-0.0091696260496974,
-0.08056902885437012,
-0.039547670632600784,
0.01587439700961113,
-0.016043925657868385,
0.0983642190694809,
0.0778690055012703,
-0.08887418359518051,
0.05153348669409752,
0.07673265039920807,
0.03415428474545479,
0.17300643026828766,
0.1412305384874344,
-0.3449978828430176,
-0.08414757996797562,
-0.07996176183223724,
-0.04061700403690338,
0.015231434255838394,
0.05986076965928078,
0.035659320652484894,
0.008518151938915253,
0.07732975482940674,
-0.1020016223192215,
0.09963914006948471,
-0.035473283380270004,
0.05966084077954292,
-0.02425338141620159,
0.06712332367897034,
0.02795756608247757,
0.04430687427520752,
-0.18571045994758606,
0.2644634246826172,
-0.007993035018444061,
-0.051666758954524994,
-0.004343315493315458,
-0.018796376883983612,
0.02593761868774891,
0.12074754387140274,
0.13669829070568085,
0.0002671622787602246,
-0.017707083374261856,
0.060521114617586136,
-0.012476220726966858,
0.021275553852319717,
0.0604812316596508,
0.08591485023498535,
0.009340385906398296,
0.03821991756558418,
-0.06146355718374252,
0.04373006895184517,
0.0417659617960453,
-0.03598860651254654,
-0.10575146973133087,
0.02320198528468609,
0.10183475911617279,
-0.08663395792245865,
-0.013947522267699242,
-0.061652541160583496,
-0.01622670888900757,
0.18901826441287994,
0.06446582823991776,
-0.03258669376373291,
-0.0936698392033577,
0.07119828462600708,
0.19117745757102966,
-0.09274035692214966,
0.0780683159828186,
-0.036531828343868256,
0.012627869844436646,
-0.02070740796625614,
-0.17264805734157562,
0.16249877214431763,
-0.1531660407781601,
-0.015245308168232441,
-0.022096412256360054,
0.008156988769769669,
-0.0039758882485330105,
0.0238489992916584,
0.019324662163853645,
-0.010104457847774029,
-0.1408817172050476,
-0.06160365045070648,
-0.060820095241069794,
-0.04001378268003464,
-0.04675198346376419,
0.084927037358284,
-0.10332030057907104,
0.0672607347369194,
-0.05475160852074623,
0.06973952054977417,
0.1674170047044754,
-0.04020123556256294,
-0.06315097212791443,
-0.013480573892593384,
0.11696578562259674,
0.013866506516933441,
-0.23282581567764282,
-0.09295818209648132,
0.03410011902451515,
0.0048365285620093346,
-0.054038673639297485,
-0.1622428297996521,
0.17869657278060913,
-0.05115045607089996,
0.02605963684618473,
-0.016537101939320564,
-0.03894202411174774,
-0.060414981096982956,
0.21094998717308044,
0.03226156160235405,
0.26637980341911316,
-0.024012310430407524,
0.00831790454685688,
-0.030329857021570206,
0.1207890436053276,
0.09921562671661377,
0.003792051924392581,
0.1148860827088356,
0.03185392916202545,
0.13860464096069336,
0.00909498892724514,
-0.03555000573396683,
-0.035730648785829544,
0.10337278246879578,
0.013468950055539608,
-0.027273593470454216,
0.047585465013980865,
0.06983822584152222,
0.02800089307129383,
-0.025625422596931458,
0.1593237668275833,
0.0589875653386116,
-0.0016162366373464465,
-0.03260009363293648,
-0.08514447510242462,
-0.004566669464111328,
0.09116186201572418,
-0.043289635330438614,
-0.09833601862192154,
0.09628447145223618,
0.12137165665626526,
0.032474637031555176,
0.2036311775445938,
0.048500869423151016,
0.08770538121461868,
0.03685041517019272,
0.06973345577716827,
-0.08693884313106537,
-0.0574663020670414,
-0.022227827459573746,
-0.03067023679614067,
0.09079238772392273,
-0.09730564802885056,
0.020751606673002243,
0.15296226739883423,
-0.059436291456222534,
-0.013643437065184116,
0.09656085819005966,
-0.030748829245567322,
-0.0467941015958786,
0.0989735797047615,
-0.10759104043245316,
-0.14176057279109955,
-0.0003172509605064988,
-0.01019226387143135,
0.012209024280309677,
0.18201494216918945,
0.1696370393037796,
-0.061687588691711426,
-0.01014741137623787,
0.03088875487446785,
0.018233956769108772,
-0.08536612242460251,
0.1140773594379425,
0.10634322464466095,
0.04236050322651863,
-0.12954698503017426,
0.03777090832591057,
0.006431268528103828,
-0.10686898976564407,
-0.03620798513293266,
-0.04181169718503952,
-0.12189944088459015,
-0.09876842051744461,
-0.012676553800702095,
0.01960783638060093,
-0.19875535368919373,
-0.060814645141363144,
-0.11722280830144882,
-0.06198559328913689,
0.08688322454690933,
0.07964202016592026,
0.10504059493541718,
0.0174998939037323,
-0.099889375269413,
-0.057161230593919754,
-0.09837975353002548,
0.0075411368161439896,
0.07138480991125107,
0.0663578137755394,
-0.17681090533733368,
-0.06312621384859085,
-0.03590996935963631,
0.1165655180811882,
-0.09741276502609253,
0.05279875919222832,
-0.10673137754201889,
0.06559302657842636,
-0.07267073541879654,
-0.03030378185212612,
-0.08715086430311203,
-0.03324269875884056,
-0.052105892449617386,
-0.07928420603275299,
-0.08150212466716766,
0.03931548446416855,
-0.10153967887163162,
0.05856931209564209,
0.049293890595436096,
-0.028813378885388374,
-0.04373414069414139,
-0.006695738527923822,
-0.04193056374788284,
0.00451883627101779,
0.12280037999153137,
0.04034319892525673,
-0.10259883850812912,
0.12201467156410217,
-0.07706672698259354,
-0.13544723391532898,
0.10101929306983948,
0.07758533954620361,
0.07925500720739365,
-0.019431540742516518,
0.056311290711164474,
0.029400916770100594,
0.03683114051818848,
-0.021226301789283752,
-0.07087624818086624,
-0.0017071084585040808,
0.04544323682785034,
-0.10714804381132126,
0.005710948258638382,
-0.032339390367269516,
-0.03573072701692581,
-0.014300604350864887,
0.12596052885055542,
0.09276102483272552,
-0.02925332449376583,
0.043530724942684174,
0.01826687715947628,
0.03213449940085411,
-0.0182221420109272,
-0.08525306731462479,
0.0969870388507843,
-0.08868007361888885,
0.04393875226378441,
-0.04391833767294884,
0.2257688045501709,
-0.11687097698450089,
-0.01613316871225834,
0.013679595664143562,
0.11540098488330841,
-0.1079988032579422,
0.05428463593125343,
0.1482713520526886,
0.09813085198402405,
-0.014482212252914906,
-0.019752293825149536,
0.05781806632876396,
0.07449372857809067,
0.17125879228115082,
0.12618127465248108,
0.12405707687139511,
0.08756133913993835,
0.07160277664661407,
0.013894581235945225,
-0.03182630613446236,
0.053285762667655945,
0.015714021399617195,
-0.11461306363344193,
0.0446089543402195,
0.050604842603206635,
0.031968794763088226,
0.07720872759819031,
0.07816064357757568,
0.02075536735355854,
0.0524635836482048,
-0.052498769015073776,
-0.09992611408233643,
-0.09781236201524734,
-0.09271587431430817,
-0.14799325168132782,
-0.01770315133035183,
-0.07907694578170776,
-0.13143934309482574,
0.09465897083282471,
0.09750308841466904,
-0.02254527062177658,
0.14788782596588135,
0.10545813292264938,
-0.03637700155377388,
0.07300406694412231,
-0.005811022128909826,
-0.058404333889484406,
0.15364544093608856,
-0.07265901565551758,
-0.04599869251251221,
0.052897993475198746,
-0.033651575446128845,
-0.017446888610720634,
-0.043537553399801254,
0.1358921080827713,
-0.024463245645165443,
-0.03639746829867363,
-0.03452426195144653,
0.047660280019044876,
-0.03285195305943489,
-0.03161676600575447,
-0.03551953658461571,
-0.025588730350136757,
-0.01107448898255825,
0.16743187606334686,
-0.06757990270853043,
-0.0717863067984581,
-0.1633463203907013,
0.11850285530090332,
0.02262144908308983,
0.08223304897546768,
-0.04377514496445656,
0.026734448969364166,
-0.04893805459141731,
0.24295400083065033,
0.31121528148651123,
-0.17127373814582825,
-0.005892971996217966,
0.09687074273824692,
0.003934322856366634,
-0.029678739607334137,
0.053360745310783386,
0.1123734787106514,
0.11947909742593765,
-0.028343848884105682,
-0.13818447291851044,
-0.0539671815931797,
-0.010405893437564373,
-0.06684442609548569,
-0.044226374477148056,
0.11859394609928131,
-0.0318598710000515,
-0.04926415905356407,
0.1456352174282074,
-0.25633129477500916,
-0.044711485505104065,
-0.05663347244262695,
-0.11759492754936218,
-0.1251785308122635,
-0.11124344915151596,
0.068759024143219,
0.05127257481217384,
0.12427090853452682,
-0.06707732379436493,
-0.06317984312772751,
0.08679450303316116,
0.018650416284799576,
-0.13838109374046326,
-0.05739876627922058,
0.03508300706744194,
0.02825208380818367,
0.006494459696114063,
-0.001556978328153491,
-0.022524956613779068,
0.1176689863204956,
-0.03287724032998085,
-0.055208031088113785,
0.01804315112531185,
-0.016803879290819168,
-0.16243954002857208,
0.03905623033642769,
0.019236700609326363,
-0.02507712133228779,
0.017771372571587563,
0.017520342022180557,
-0.29019641876220703,
-0.0000883359243744053,
0.010502388700842857,
0.011580260470509529,
-0.053922224789857864,
0.025673789903521538,
-0.08987056463956833,
0.08597039431333542,
0.10384389013051987,
-0.029986325651407242,
0.035623908042907715,
-0.02829662337899208,
0.015736036002635956,
0.06704258918762207,
0.06726593524217606,
-0.03483406826853752,
-0.17101967334747314,
-0.04515431076288223,
-0.12272550165653229,
0.029654858633875847,
-0.07586382329463959,
-0.01377846859395504,
-0.04978162422776222,
0.016969697549939156,
-0.09059896320104599,
0.09111160039901733,
0.06632839888334274,
-0.01881067454814911,
0.01679540053009987,
0.02719970978796482,
-0.027414701879024506,
-0.04435274377465248,
-0.12733061611652374,
-0.12281148135662079
] |
null | null |
transformers
|
# vgg11
Implementation of VGG proposed in [Very Deep Convolutional Networks For
Large-Scale Image Recognition](https://arxiv.org/pdf/1409.1556.pdf)
``` python
VGG.vgg11()
VGG.vgg13()
VGG.vgg16()
VGG.vgg19()
VGG.vgg11_bn()
VGG.vgg13_bn()
VGG.vgg16_bn()
VGG.vgg19_bn()
```
Please be aware that the [bn]{.title-ref} models use BatchNorm, but
they are very old and people back then didn\'t know that the bias is
superfluous in a conv followed by a batchnorm.
Examples:
``` python
# change activation
VGG.vgg11(activation = nn.SELU)
# change number of classes (default is 1000 )
VGG.vgg11(n_classes=100)
# pass a different block
from nn.models.classification.senet import SENetBasicBlock
VGG.vgg11(block=SENetBasicBlock)
# store the features tensor after every block
```
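To make the remark about the superfluous bias concrete, here is a small, library-independent sketch in plain PyTorch (not part of the glasses API): the per-channel bias added by a convolution is removed again by the BatchNorm mean subtraction, so with identical weights the two pipelines produce the same output.
``` python
import torch
from torch import nn

torch.manual_seed(0)
x = torch.randn(8, 3, 32, 32)

conv_bias = nn.Conv2d(3, 16, kernel_size=3, padding=1, bias=True)
conv_nobias = nn.Conv2d(3, 16, kernel_size=3, padding=1, bias=False)
with torch.no_grad():
    conv_nobias.weight.copy_(conv_bias.weight)  # same weights, only the bias differs

bn = nn.BatchNorm2d(16)
bn.train()  # batch statistics: the per-channel mean (and thus the bias) is subtracted

with torch.no_grad():
    y_bias = bn(conv_bias(x))
    y_nobias = bn(conv_nobias(x))

print(torch.allclose(y_bias, y_nobias, atol=1e-5))  # True: the bias had no effect
```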
|
{}
| null |
glasses/vgg11
|
[
"transformers",
"pytorch",
"arxiv:1409.1556",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1409.1556"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us
|
# vgg11
Implementation of VGG proposed in Very Deep Convolutional Networks For
Large-Scale Image Recognition
Please be aware that the [bn]{.title-ref} models use BatchNorm, but
they are very old and people back then didn\'t know that the bias is
superfluous in a conv followed by a batchnorm.
Examples:
|
[
"# vgg11\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n",
"# vgg11\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
29,
84
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n# vgg11\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
-0.07242374122142792,
-0.051995743066072464,
-0.0020701121538877487,
0.03037923388183117,
0.014471183530986309,
0.055894412100315094,
-0.007795250508934259,
0.08236072212457657,
-0.02928370051085949,
0.016165338456630707,
0.189015194773674,
0.03912725672125816,
0.023759445175528526,
0.058047208935022354,
-0.04890989512205124,
-0.24301792681217194,
0.03725117817521095,
0.1285940259695053,
0.010763063095510006,
0.054072603583335876,
0.0701848641037941,
-0.14111526310443878,
0.08110889047384262,
0.009036285802721977,
-0.23582115769386292,
-0.046762123703956604,
0.008548121899366379,
-0.04108993709087372,
0.07011770457029343,
0.06341157853603363,
0.031773556023836136,
0.019243527203798294,
0.011183305643498898,
-0.05092970281839371,
0.03300771117210388,
-0.0020965661387890577,
-0.01162309292703867,
0.07573717832565308,
0.050864920020103455,
0.08281856030225754,
0.04626160487532616,
-0.045495301485061646,
-0.07180067896842957,
-0.013727382756769657,
-0.06532251089811325,
-0.13568422198295593,
0.009512419812381268,
0.09843257814645767,
-0.03475496917963028,
0.0023442204110324383,
0.010056493803858757,
0.14985008537769318,
-0.0729101300239563,
0.03071102499961853,
0.17124366760253906,
-0.26358816027641296,
-0.016553329303860664,
0.20679710805416107,
-0.032985083758831024,
-0.055512744933366776,
-0.04341288283467293,
0.13850675523281097,
0.06883585453033447,
-0.0011590830981731415,
0.014141554944217205,
0.006692953407764435,
-0.06104186177253723,
0.007532164920121431,
-0.12772484123706818,
-0.0954325795173645,
0.10202853381633759,
0.04073220491409302,
0.022039050236344337,
-0.1437889039516449,
-0.13871853053569794,
-0.0661822184920311,
-0.08050178736448288,
-0.038186609745025635,
-0.05206238850951195,
0.010269297286868095,
-0.03793451562523842,
-0.07549820095300674,
-0.06014899164438248,
-0.0669664666056633,
-0.11891444027423859,
-0.017739107832312584,
0.058900706470012665,
0.07450668513774872,
-0.11590474843978882,
0.16594372689723969,
-0.13982433080673218,
-0.06596572697162628,
0.019743744283914566,
-0.17141900956630707,
-0.037927012890577316,
0.04901818186044693,
0.00756053626537323,
0.08856982737779617,
0.048470962792634964,
0.0515822134912014,
0.06654592603445053,
-0.06146664172410965,
-0.032704729586839676,
0.06648624688386917,
0.004381558392196894,
0.0618625245988369,
-0.20817887783050537,
0.05498679354786873,
0.10032246261835098,
-0.11725030839443207,
0.028614763170480728,
-0.004729405511170626,
-0.06716947257518768,
-0.20373697578907013,
0.10160113871097565,
0.04669244587421417,
-0.055005088448524475,
0.1467490792274475,
0.0024757266510277987,
-0.09559779614210129,
0.03581704944372177,
0.024405106902122498,
-0.031235482543706894,
0.0049136304296553135,
-0.049433980137109756,
0.09472740441560745,
0.01588073931634426,
-0.012446370907127857,
-0.05114506930112839,
-0.027785904705524445,
-0.08164971321821213,
-0.0956227034330368,
-0.021419553086161613,
-0.12341193109750748,
0.003821672173216939,
-0.13370032608509064,
0.0454808734357357,
-0.1790473759174347,
-0.11259131878614426,
0.10229754447937012,
-0.035384401679039,
-0.07290365546941757,
0.027097221463918686,
-0.022143034264445305,
-0.07834448665380478,
-0.03091404214501381,
0.006435733754187822,
0.2101605087518692,
-0.05596732720732689,
0.01996864750981331,
-0.062145598232746124,
0.10260309278964996,
-0.04772127419710159,
0.020107798278331757,
-0.04805052652955055,
0.020225634798407555,
-0.030361678451299667,
0.04208463430404663,
-0.06134063005447388,
0.025088725611567497,
-0.05944203957915306,
-0.11049585789442062,
-0.05223357304930687,
-0.02079620584845543,
-0.020732220262289047,
0.10658176243305206,
-0.1204601600766182,
-0.03720085695385933,
0.1275140941143036,
-0.11346201598644257,
-0.10927532613277435,
0.15787450969219208,
-0.05162489041686058,
-0.07993309944868088,
0.06150775030255318,
0.06317970901727676,
0.046577274799346924,
-0.10082690417766571,
-0.001159838167950511,
0.021893421187996864,
-0.1478533297777176,
0.024272317066788673,
-0.033124905079603195,
0.17172154784202576,
-0.02316553331911564,
0.006810582242906094,
-0.0414581261575222,
0.08771733939647675,
-0.1264198124408722,
-0.06714896857738495,
-0.031097369268536568,
-0.04763011261820793,
0.050120286643505096,
0.01700516976416111,
0.07162252813577652,
0.026621581986546516,
-0.038837313652038574,
-0.19637849926948547,
0.07535972446203232,
-0.07155633717775345,
0.028648531064391136,
-0.10039918124675751,
0.16553747653961182,
-0.16083091497421265,
0.03635053709149361,
-0.086056187748909,
-0.014753527007997036,
0.08634752035140991,
0.10621895641088486,
-0.04431463032960892,
0.22935473918914795,
0.025514710694551468,
-0.02474081702530384,
-0.017931638285517693,
0.005303896497935057,
0.11336645483970642,
-0.01360916718840599,
-0.025663983076810837,
-0.07714954018592834,
-0.0009972283151000738,
-0.05223541706800461,
0.03825411573052406,
-0.1492581069469452,
-0.059492822736501694,
0.16260851919651031,
0.09267859905958176,
-0.009054599329829216,
0.05339865759015083,
0.07660101354122162,
-0.05530974641442299,
-0.020751509815454483,
-0.06472154706716537,
0.07657275348901749,
-0.04047869145870209,
-0.08319374918937683,
0.1084795743227005,
-0.020666155964136124,
0.10794685781002045,
0.14785613119602203,
-0.20411579310894012,
-0.0031096013262867928,
-0.08528294414281845,
-0.03764544427394867,
-0.003373831743374467,
0.007268223911523819,
0.01840963400900364,
0.09814872592687607,
-0.06184763088822365,
0.07972173392772675,
-0.07847502827644348,
-0.008309426717460155,
0.08988048136234283,
0.0222210381180048,
-0.05418863892555237,
0.010370185598731041,
-0.022013120353221893,
-0.16077739000320435,
0.03926724195480347,
-0.01836780458688736,
-0.015142163261771202,
0.11414528638124466,
0.0013673497596755624,
-0.056207429617643356,
0.030397489666938782,
-0.0074433558620512486,
-0.00042529322672635317,
0.10839729011058807,
-0.2641158699989319,
-0.039235204458236694,
0.10104548931121826,
-0.04618842899799347,
0.03891066089272499,
-0.08482379466295242,
-0.01398525107651949,
-0.004754635039716959,
-0.05077134817838669,
0.0869026705622673,
0.10909651219844818,
0.018503108993172646,
0.0968610942363739,
-0.023785408586263657,
-0.04572252929210663,
0.03289826586842537,
-0.059589508920907974,
-0.034700725227594376,
0.1259288787841797,
-0.020495284348726273,
-0.22344106435775757,
-0.007121275644749403,
-0.1280134916305542,
-0.09926499426364899,
0.027906395494937897,
-0.04285133257508278,
-0.05931432172656059,
-0.05541260167956352,
0.005650271661579609,
0.036122553050518036,
0.03804943338036537,
0.13519227504730225,
-0.01453513652086258,
-0.018012087792158127,
-0.008666956797242165,
-0.07394520938396454,
-0.02876863442361355,
-0.162337988615036,
-0.07957042753696442,
0.11261358112096786,
-0.06124712526798248,
0.10448170453310013,
0.13274064660072327,
-0.033868562430143356,
0.06341160088777542,
-0.021413762122392654,
0.22573716938495636,
-0.03654579073190689,
-0.006983226165175438,
0.12248019874095917,
0.0010679964907467365,
-0.04357750341296196,
0.039757147431373596,
-0.0034431261010468006,
-0.12609244883060455,
-0.02608206309378147,
-0.023328591138124466,
-0.08576534688472748,
-0.0739629939198494,
-0.14001117646694183,
-0.05903467908501625,
-0.07514379918575287,
0.10208814591169357,
0.05686574801802635,
0.06277810037136078,
0.1207723543047905,
-0.07485604286193848,
0.08195865154266357,
0.009722908958792686,
0.07596872001886368,
0.20608992874622345,
-0.030377652496099472,
0.05840229243040085,
-0.002984562423080206,
-0.06958874315023422,
0.07232320308685303,
0.005543147679418325,
0.20333760976791382,
-0.08730467408895493,
-0.005509349517524242,
0.11536703258752823,
0.10970404744148254,
0.15578831732273102,
0.05763896554708481,
-0.10580061376094818,
-0.05665118247270584,
-0.0627797544002533,
-0.06942791491746902,
0.03871903568506241,
0.045421432703733444,
0.045806415379047394,
-0.018898980692029,
-0.043052930384874344,
-0.03292740508913994,
0.05942704901099205,
0.05626046657562256,
0.13573725521564484,
-0.32073697447776794,
-0.015043044462800026,
0.03289806842803955,
0.0616907998919487,
-0.09898065030574799,
0.06935703754425049,
0.1364137977361679,
-0.02166081964969635,
0.08330333232879639,
-0.09417679905891418,
0.0782201737165451,
-0.18041031062602997,
0.08053495734930038,
-0.07031797617673874,
0.01416836865246296,
0.016828909516334534,
0.07696439325809479,
-0.22777822613716125,
0.17627453804016113,
0.0005070228362455964,
-0.0192719679325819,
-0.11451490223407745,
-0.044406794011592865,
0.0005207924987189472,
0.08649635314941406,
0.18510298430919647,
-0.006482304539531469,
0.16065342724323273,
0.05701812356710434,
-0.032872989773750305,
0.04544300585985184,
0.002506000455468893,
0.07564211636781693,
-0.027364457026124,
0.016756828874349594,
-0.03248539939522743,
0.0336124412715435,
0.10182499885559082,
-0.1275635063648224,
-0.045494262129068375,
0.010902803391218185,
0.09515585750341415,
-0.001787392538972199,
-0.015392757952213287,
-0.014265242964029312,
-0.09566639363765717,
0.14869925379753113,
-0.06427641957998276,
-0.042732276022434235,
-0.05832817777991295,
0.2123604118824005,
0.06772954016923904,
-0.022010378539562225,
0.11345092207193375,
-0.04164571315050125,
0.13631436228752136,
-0.03115086257457733,
-0.13518764078617096,
0.08280070871114731,
-0.07613187283277512,
-0.14085815846920013,
0.01618492417037487,
0.05804147943854332,
0.018092695623636246,
-0.014676141552627087,
0.0612126886844635,
-0.010213094763457775,
-0.05834520235657692,
-0.08537979423999786,
0.07041715830564499,
0.21791093051433563,
0.02600608952343464,
0.03720488399267197,
0.02106400765478611,
-0.06693989038467407,
-0.038993533700704575,
0.09325557202100754,
0.2139364331960678,
0.19023367762565613,
-0.11439737677574158,
0.025517312809824944,
0.10678166896104813,
-0.0169996228069067,
-0.24664044380187988,
0.00933153834193945,
-0.05468671768903732,
0.04030753672122955,
-0.1650274395942688,
-0.007393892854452133,
0.046652112156152725,
0.023407211527228355,
0.011057191528379917,
0.06790324300527573,
-0.24300889670848846,
-0.09003505855798721,
0.2021598517894745,
0.03492818772792816,
0.2856009304523468,
0.005433788523077965,
-0.007070551626384258,
-0.030010225251317024,
-0.08277010172605515,
0.12819793820381165,
0.14700362086296082,
0.1126389354467392,
-0.07903492450714111,
0.012973230332136154,
0.03237465023994446,
-0.06205090507864952,
0.08659645169973373,
0.06225722283124924,
0.13679486513137817,
-0.05835702270269394,
-0.1347428560256958,
0.10735709965229034,
0.015028329566121101,
0.02310342900454998,
0.1998075544834137,
0.06030069291591644,
-0.011840383522212505,
-0.04056882858276367,
-0.11461315304040909,
0.08533251285552979,
0.05631903558969498,
-0.1117786318063736,
-0.1530759483575821,
0.015531918965280056,
0.039365824311971664,
0.031274762004613876,
0.10218748450279236,
-0.011150049977004528,
-0.011966532096266747,
0.03630025312304497,
0.043964095413684845,
-0.130690336227417,
-0.007556765805929899,
-0.056075409054756165,
0.017307832837104797,
0.16591134667396545,
-0.07389623671770096,
0.04101799055933952,
0.11435789614915848,
0.051736097782850266,
0.01932057924568653,
0.09331648796796799,
-0.08041205257177353,
0.029948949813842773,
0.04101739451289177,
-0.22079189121723175,
-0.02682609111070633,
-0.03327541425824165,
-0.12009648978710175,
0.17714916169643402,
0.04008003696799278,
0.09721766412258148,
-0.08740293234586716,
-0.008616091683506966,
0.04603065922856331,
0.002977797295898199,
-0.019332680851221085,
0.10432912409305573,
0.06857751309871674,
0.02161145582795143,
-0.16091099381446838,
0.10462695360183716,
-0.061813320964574814,
-0.09876624494791031,
-0.03206029534339905,
-0.09669248759746552,
-0.13714562356472015,
-0.07101250439882278,
-0.13872703909873962,
0.026778750121593475,
-0.05797211825847626,
-0.007210523821413517,
0.04436277225613594,
-0.09747747331857681,
0.08641085773706436,
0.07382182776927948,
0.0901627391576767,
0.1270589381456375,
-0.07732129096984863,
-0.02712918259203434,
-0.08334662020206451,
0.04476751759648323,
0.008446713909506798,
0.025079801678657532,
-0.22106419503688812,
-0.005946115590631962,
0.005608249455690384,
0.10453946143388748,
-0.11570996791124344,
-0.0375235453248024,
-0.06114141643047333,
0.03718293085694313,
-0.18860693275928497,
0.0249255932867527,
-0.04805378615856171,
-0.020092401653528214,
-0.029813522472977638,
-0.061713092029094696,
-0.06363841891288757,
0.02094094455242157,
-0.048210322856903076,
0.06415805220603943,
0.01049993745982647,
-0.006137640681117773,
-0.024417975917458534,
-0.05174138769507408,
-0.0302341990172863,
-0.010391101241111755,
0.18925423920154572,
0.08191818743944168,
-0.089573435485363,
0.013640016317367554,
-0.20833103358745575,
-0.15735135972499847,
0.1907903105020523,
0.030369648709893227,
0.059396758675575256,
-0.00033497155527584255,
0.08001412451267242,
-0.0051686144433915615,
-0.09001734107732773,
0.021634146571159363,
0.10625813901424408,
-0.040652796626091,
-0.0064203087240457535,
-0.055492158979177475,
0.09350594133138657,
-0.061716657131910324,
0.029692772775888443,
0.11705353856086731,
0.1293054223060608,
-0.018760455772280693,
-0.016102219000458717,
-0.019604645669460297,
-0.040706682950258255,
-0.009493966586887836,
-0.05042477324604988,
-0.1477942168712616,
0.06819267570972443,
0.03251254931092262,
0.0432819165289402,
-0.036891039460897446,
0.3060649037361145,
-0.03421727195382118,
-0.1323079615831375,
-0.02229645475745201,
0.15401144325733185,
-0.04374758526682854,
0.0569310300052166,
0.21905553340911865,
0.04417065158486366,
0.027878412976861,
0.011673766188323498,
0.11018875986337662,
0.156285360455513,
0.18595260381698608,
0.06448360532522202,
-0.026416311040520668,
0.028903720900416374,
0.055820364505052567,
0.08802986145019531,
-0.04468150436878204,
-0.030691714957356453,
0.11105471849441528,
-0.015844149515032768,
0.14171893894672394,
-0.06314460933208466,
-0.07666324824094772,
0.15832866728305817,
-0.0328083261847496,
-0.024414220824837685,
0.07035689800977707,
-0.06184689700603485,
-0.06560396403074265,
-0.3137845993041992,
-0.06564616411924362,
-0.09078629314899445,
0.04545881226658821,
-0.09498894214630127,
-0.0585310123860836,
0.10247653722763062,
0.10199961066246033,
-0.07972846180200577,
0.11516982316970825,
0.03149612620472908,
-0.06627387553453445,
0.14873887598514557,
0.04059123992919922,
-0.11263515800237656,
-0.01234475988894701,
-0.03588670492172241,
0.03172707185149193,
0.037636347115039825,
-0.06485991925001144,
0.01013823039829731,
0.07937533408403397,
0.006745419930666685,
-0.011145489290356636,
-0.03298649936914444,
-0.0863366425037384,
0.009698424488306046,
-0.013376276940107346,
0.043188996613025665,
0.035843271762132645,
-0.06311643123626709,
-0.006144015584141016,
0.15793472528457642,
-0.018941545858979225,
-0.03888615965843201,
-0.04058576375246048,
0.07519656419754028,
-0.07844286412000656,
0.08331044763326645,
0.013849414885044098,
0.02298707515001297,
-0.01265877764672041,
0.2939732074737549,
0.30369633436203003,
-0.07891657948493958,
0.03102247789502144,
0.03727787733078003,
0.010706215165555477,
-0.049204062670469284,
0.1074795201420784,
0.02018892392516136,
0.19497041404247284,
-0.019362768158316612,
-0.03557594493031502,
-0.04366940259933472,
-0.00294656865298748,
-0.02751994878053665,
0.0666118636727333,
0.09441806375980377,
-0.03525007516145706,
-0.07925564050674438,
0.046061743050813675,
-0.12029556185007095,
-0.005642615258693695,
-0.017818834632635117,
-0.09791024774312973,
-0.07434578984975815,
-0.021596908569335938,
-0.11096543818712234,
0.02598736621439457,
0.054284192621707916,
-0.13799116015434265,
0.031173743307590485,
-0.036141734570264816,
0.05750779062509537,
-0.13498690724372864,
-0.01120428554713726,
0.05864863470196724,
0.13941922783851624,
0.1401350051164627,
-0.03185936436057091,
0.047074440866708755,
0.05251162499189377,
-0.020896196365356445,
-0.09090092778205872,
-0.014655626378953457,
-0.009531490504741669,
-0.08393649756908417,
-0.043178606778383255,
0.08647944033145905,
0.008760981261730194,
-0.009353501722216606,
0.05315711721777916,
-0.12135462462902069,
-0.006753644905984402,
-0.06972174346446991,
-0.02230856567621231,
-0.06988496333360672,
-0.00015304792032111436,
-0.022677110508084297,
0.0879514068365097,
0.018239669501781464,
-0.037648141384124756,
-0.047527655959129333,
-0.07638333737850189,
0.037636883556842804,
0.017059296369552612,
0.057653822004795074,
0.0392446331679821,
-0.0986512079834938,
-0.004057436715811491,
0.025268230587244034,
0.010749300010502338,
0.04575299099087715,
-0.026681222021579742,
-0.04186777397990227,
-0.015072247013449669,
0.009444521740078926,
0.08609717339277267,
0.04370187968015671,
-0.00768446596339345,
-0.00941881351172924,
-0.0006657906342297792,
0.009803582914173603,
0.012777270749211311,
-0.13940927386283875,
-0.08655759692192078
] |
null | null |
transformers
|
# vgg11_bn
Implementation of VGG proposed in [Very Deep Convolutional Networks For
Large-Scale Image Recognition](https://arxiv.org/pdf/1409.1556.pdf)
``` python
VGG.vgg11()
VGG.vgg13()
VGG.vgg16()
VGG.vgg19()
VGG.vgg11_bn()
VGG.vgg13_bn()
VGG.vgg16_bn()
VGG.vgg19_bn()
```
Please be aware that the *bn* models use BatchNorm, but they are very old and
people back then did not know that the bias is superfluous in a conv followed
by a batchnorm.
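For reference, the modern convention is simply to build the block with `bias=False` and let BatchNorm's own affine shift (beta) play that role. A generic PyTorch sketch of such a block (not the glasses implementation):
``` python
import torch.nn as nn

def conv_bn_relu(in_channels: int, out_channels: int, **kwargs) -> nn.Sequential:
    """Conv -> BatchNorm -> ReLU with the redundant conv bias removed."""
    return nn.Sequential(
        nn.Conv2d(in_channels, out_channels, bias=False, **kwargs),  # bias handled by BN's beta
        nn.BatchNorm2d(out_channels),
        nn.ReLU(inplace=True),
    )

block = conv_bn_relu(3, 64, kernel_size=3, padding=1)
```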
Examples:
``` python
import torch.nn as nn  # for nn.SELU; VGG itself is assumed to be imported from the glasses library

# change activation
VGG.vgg11(activation=nn.SELU)
# change number of classes (default is 1000)
VGG.vgg11(n_classes=100)
# pass a different block
from nn.models.classification.senet import SENetBasicBlock
VGG.vgg11(block=SENetBasicBlock)
# store the features tensor after every block
```
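For completeness, a minimal end-to-end usage sketch. It assumes `VGG` has been imported from the glasses library as above and that the model accepts the usual 224x224 ImageNet-sized input; neither detail is spelled out in this card:
``` python
import torch

# build one of the bn variants with a custom number of classes
model = VGG.vgg11_bn(n_classes=10).eval()

with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))

print(logits.shape)  # expected: torch.Size([1, 10])
```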
|
{}
| null |
glasses/vgg11_bn
|
[
"transformers",
"pytorch",
"arxiv:1409.1556",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1409.1556"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us
|
# vgg11_bn
Implementation of VGG proposed in Very Deep Convolutional Networks For
Large-Scale Image Recognition
Please be aware that the *bn* models use BatchNorm, but they are very old and
people back then did not know that the bias is superfluous in a conv followed
by a batchnorm.
Examples:
|
[
"# vgg11_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n",
"# vgg11_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
29,
86
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n# vgg11_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
-0.06671226769685745,
-0.04257591813802719,
-0.002209985628724098,
0.03104591742157936,
0.011152108199894428,
0.05128471925854683,
-0.010773885063827038,
0.08307888358831406,
-0.0334492065012455,
0.01931970752775669,
0.18727269768714905,
0.038395799696445465,
0.028008054941892624,
0.06798691302537918,
-0.055939681828022,
-0.24226123094558716,
0.04100643843412399,
0.1339745670557022,
0.013219432905316353,
0.054280105978250504,
0.06868264824151993,
-0.14327585697174072,
0.0754779800772667,
0.003913865424692631,
-0.24402163922786713,
-0.04792400076985359,
0.01053804624825716,
-0.04163900390267372,
0.061293814331293106,
0.07662691175937653,
0.0331382155418396,
0.02503139153122902,
0.010447120293974876,
-0.053021982312202454,
0.03309952840209007,
-0.0011153302621096373,
-0.013169677928090096,
0.08315623551607132,
0.048002708703279495,
0.08747512847185135,
0.04631964862346649,
-0.04238833487033844,
-0.07192131876945496,
-0.015423271805047989,
-0.06248144060373306,
-0.12843011319637299,
0.01415067259222269,
0.10219836235046387,
-0.04232462868094444,
0.004905268084257841,
0.005401026923209429,
0.14432111382484436,
-0.07731365412473679,
0.03285570070147514,
0.16565273702144623,
-0.2588576674461365,
-0.0167669877409935,
0.1963481456041336,
-0.038554199039936066,
-0.05645156651735306,
-0.043142177164554596,
0.12324881553649902,
0.07244222611188889,
-0.004625716246664524,
0.018482476472854614,
0.005373685155063868,
-0.0582038089632988,
0.004548158962279558,
-0.1251324862241745,
-0.08688399940729141,
0.10142362117767334,
0.04301011934876442,
0.01788032427430153,
-0.1391681730747223,
-0.1398666501045227,
-0.07841552048921585,
-0.08087349683046341,
-0.03281151130795479,
-0.050029363483190536,
0.013081520795822144,
-0.04755450412631035,
-0.06885547190904617,
-0.05493619665503502,
-0.054852358996868134,
-0.12825199961662292,
-0.020274296402931213,
0.055832017213106155,
0.07141965627670288,
-0.11114601790904999,
0.15973879396915436,
-0.13896895945072174,
-0.07006248831748962,
0.023810353130102158,
-0.1735348105430603,
-0.0386853851377964,
0.05033264681696892,
0.0025187553837895393,
0.09158098697662354,
0.048293426632881165,
0.051870979368686676,
0.06302820891141891,
-0.062056999653577805,
-0.04989057406783104,
0.07180584967136383,
0.004131177440285683,
0.05719098448753357,
-0.21682621538639069,
0.052312131971120834,
0.10336671024560928,
-0.10921835899353027,
0.03436296060681343,
-0.001915074186399579,
-0.06354513019323349,
-0.1949862837791443,
0.10772910714149475,
0.03343157097697258,
-0.05069567263126373,
0.14719067513942719,
-0.007172795012593269,
-0.09336641430854797,
0.03580242022871971,
0.020214088261127472,
-0.03599222004413605,
0.008961305022239685,
-0.05774090811610222,
0.09828163683414459,
0.019187666475772858,
-0.010370809584856033,
-0.047994695603847504,
-0.016126442700624466,
-0.08550489693880081,
-0.09384763985872269,
-0.023313963785767555,
-0.1278146207332611,
0.007805879693478346,
-0.12063008546829224,
0.048124123364686966,
-0.182392880320549,
-0.10319117456674576,
0.10545344650745392,
-0.03186051920056343,
-0.08286784589290619,
0.028765182942152023,
-0.017937446013092995,
-0.08169630169868469,
-0.03485354036092758,
-0.002181513700634241,
0.2025739997625351,
-0.06069057434797287,
0.025058921426534653,
-0.05550198629498482,
0.10789656639099121,
-0.062029894441366196,
0.016790851950645447,
-0.055043257772922516,
0.01600266806781292,
-0.039161328226327896,
0.046843912452459335,
-0.05897221714258194,
0.01944148726761341,
-0.0593847893178463,
-0.11364024132490158,
-0.06375830620527267,
-0.018789883702993393,
-0.018156247213482857,
0.10824260860681534,
-0.12077763676643372,
-0.040146250277757645,
0.13398756086826324,
-0.10826142877340317,
-0.1026059091091156,
0.16136784851551056,
-0.05438670143485069,
-0.078839510679245,
0.05759977176785469,
0.059069812297821045,
0.036355700343847275,
-0.08574000746011734,
-0.004840698558837175,
0.020653238520026207,
-0.1353570520877838,
0.027637021616101265,
-0.029763687402009964,
0.17581331729888916,
-0.026089800521731377,
0.007215505000203848,
-0.018154747784137726,
0.08556707948446274,
-0.12391061335802078,
-0.07121458649635315,
-0.02715633064508438,
-0.04690784588456154,
0.047937240451574326,
0.024279402568936348,
0.07154174894094467,
0.02159017324447632,
-0.042509108781814575,
-0.19361723959445953,
0.0745713859796524,
-0.07585547119379044,
0.033028773963451385,
-0.10324718058109283,
0.15314149856567383,
-0.171730637550354,
0.0348488911986351,
-0.09625273942947388,
-0.013696679845452309,
0.08229954540729523,
0.09652430564165115,
-0.03524424508213997,
0.23046615719795227,
0.020762275904417038,
-0.0235153641551733,
-0.02563856914639473,
0.007122252136468887,
0.12083598226308823,
-0.012913577258586884,
-0.027636321261525154,
-0.08635241538286209,
0.0035512647591531277,
-0.055082742124795914,
0.05413991957902908,
-0.1518312394618988,
-0.06448861211538315,
0.16233596205711365,
0.09584648907184601,
-0.003513919422402978,
0.05414927378296852,
0.07608569413423538,
-0.04689846560359001,
-0.026052789762616158,
-0.05700858682394028,
0.07716908305883408,
-0.04094779118895531,
-0.08334865421056747,
0.09776940941810608,
-0.020029742270708084,
0.10907687991857529,
0.15437503159046173,
-0.19497232139110565,
-0.006404637359082699,
-0.08160242438316345,
-0.0411623939871788,
-0.005168039817363024,
0.012450735084712505,
0.013921326026320457,
0.09367674589157104,
-0.058086562901735306,
0.07767275720834732,
-0.07990989089012146,
-0.007191366050392389,
0.09208829700946808,
0.01931934431195259,
-0.05258312076330185,
0.014040365815162659,
-0.014247705228626728,
-0.17193429172039032,
0.047911446541547775,
-0.0262258630245924,
-0.02670128084719181,
0.11586965620517731,
0.00007845423533581197,
-0.055454641580581665,
0.03155643492937088,
-0.0027824239805340767,
-0.0008312980644404888,
0.10997223854064941,
-0.25142067670822144,
-0.034535378217697144,
0.10295703262090683,
-0.05220814049243927,
0.040261123329401016,
-0.08250468969345093,
-0.01643470488488674,
-0.009970963932573795,
-0.04823640361428261,
0.09187542647123337,
0.10019419342279434,
0.023028606548905373,
0.10310190171003342,
-0.019099799916148186,
-0.0544210784137249,
0.03468315675854683,
-0.06332828104496002,
-0.03276040405035019,
0.12443429231643677,
-0.02037878707051277,
-0.22670917212963104,
-0.010329759679734707,
-0.12267027795314789,
-0.10686878859996796,
0.027698049321770668,
-0.045323602855205536,
-0.053724680095911026,
-0.057877395302057266,
-0.001824894337914884,
0.025548599660396576,
0.06046121194958687,
0.13259479403495789,
-0.013020436279475689,
-0.015731988474726677,
0.0003405411262065172,
-0.07408925145864487,
-0.027079904451966286,
-0.1585162878036499,
-0.06949497014284134,
0.11511550843715668,
-0.06701728701591492,
0.10657902806997299,
0.13462597131729126,
-0.033136263489723206,
0.055377159267663956,
-0.01955120824277401,
0.22408327460289001,
-0.03871285542845726,
-0.007504812907427549,
0.1149955540895462,
-0.00277430796995759,
-0.04633240029215813,
0.055758893489837646,
-0.0024486323818564415,
-0.12367261201143265,
-0.017607882618904114,
-0.02405916526913643,
-0.0814751386642456,
-0.06740208715200424,
-0.1436948925256729,
-0.06348532438278198,
-0.07058858126401901,
0.10393006354570389,
0.053803861141204834,
0.050376057624816895,
0.12162493169307709,
-0.07099897414445877,
0.07756780087947845,
0.01396020594984293,
0.07864218950271606,
0.2001616358757019,
-0.027483241632580757,
0.071937195956707,
-0.003433525562286377,
-0.07744637131690979,
0.07785987108945847,
0.0036561593879014254,
0.1957891434431076,
-0.0944264754652977,
-0.010925919748842716,
0.1104651540517807,
0.11343732476234436,
0.15325374901294708,
0.06317868828773499,
-0.1025017574429512,
-0.053842589259147644,
-0.0682816356420517,
-0.07366474717855453,
0.027997972443699837,
0.04680997505784035,
0.04256686195731163,
-0.013725138269364834,
-0.03582058474421501,
-0.04817815497517586,
0.06116481497883797,
0.05095639079809189,
0.14710451662540436,
-0.32375502586364746,
-0.01590830460190773,
0.03784898668527603,
0.06724050641059875,
-0.09171736240386963,
0.07280699163675308,
0.13746511936187744,
-0.016305940225720406,
0.07731778174638748,
-0.0959644690155983,
0.07880806177854538,
-0.17951835691928864,
0.08094010502099991,
-0.07259444892406464,
0.01725406013429165,
0.01758768782019615,
0.07133503258228302,
-0.23835109174251556,
0.16911207139492035,
0.0012479160213842988,
-0.021392380818724632,
-0.11262451112270355,
-0.04633157327771187,
-0.0014393709134310484,
0.06859473139047623,
0.1804300993680954,
-0.012623609974980354,
0.15127645432949066,
0.04905962198972702,
-0.033480312675237656,
0.049539174884557724,
0.004743674769997597,
0.08141372352838516,
-0.019363099709153175,
0.01627309061586857,
-0.03036346472799778,
0.04056968167424202,
0.1155431941151619,
-0.12342916429042816,
-0.05468396842479706,
0.008962482213973999,
0.0938323587179184,
-0.00970400683581829,
-0.010999361053109169,
-0.017631053924560547,
-0.08289269357919693,
0.1518341600894928,
-0.0651254802942276,
-0.04386472702026367,
-0.056731563061475754,
0.21814477443695068,
0.06912100315093994,
-0.02362009324133396,
0.1058485209941864,
-0.04150914028286934,
0.13592970371246338,
-0.02693130634725094,
-0.14725086092948914,
0.08979278802871704,
-0.0733928382396698,
-0.14290232956409454,
0.015125725418329239,
0.05344567820429802,
0.01903669349849224,
-0.009752935729920864,
0.05808134749531746,
-0.009121548384428024,
-0.05391140654683113,
-0.08365028351545334,
0.05879496410489082,
0.20787523686885834,
0.0426800437271595,
0.04558611288666725,
0.009252921678125858,
-0.07181859761476517,
-0.04049725458025932,
0.09530876576900482,
0.2104986608028412,
0.19243204593658447,
-0.11198991537094116,
0.014740685001015663,
0.1156618520617485,
-0.009848332963883877,
-0.24811823666095734,
0.017556507140398026,
-0.053451057523489,
0.036332953721284866,
-0.16758914291858673,
-0.007926496677100658,
0.049072861671447754,
0.03012709692120552,
0.010720927268266678,
0.06746311485767365,
-0.23276782035827637,
-0.09003718197345734,
0.20104138553142548,
0.0378900021314621,
0.29255691170692444,
-0.00434926338493824,
0.00023318652529269457,
-0.03992844000458717,
-0.09614408016204834,
0.13915641605854034,
0.14201276004314423,
0.11487174779176712,
-0.06997323036193848,
0.008355415426194668,
0.03367175534367561,
-0.0658058300614357,
0.08421595394611359,
0.05632702633738518,
0.13695843517780304,
-0.05729701370000839,
-0.1257450431585312,
0.09603458642959595,
0.023890037089586258,
0.02456710673868656,
0.197825625538826,
0.056157186627388,
-0.0029006320983171463,
-0.041976429522037506,
-0.1180458515882492,
0.08093495666980743,
0.056412454694509506,
-0.11376442015171051,
-0.15975329279899597,
0.02806425467133522,
0.04994115233421326,
0.03173891082406044,
0.11223968863487244,
-0.015816234052181244,
-0.008949514478445053,
0.06473932415246964,
0.043101415038108826,
-0.12260926514863968,
0.00003855093018501066,
-0.0478915274143219,
0.013018770143389702,
0.16068409383296967,
-0.07625658065080643,
0.04487127438187599,
0.1196865439414978,
0.0439654104411602,
0.027549516409635544,
0.09363047778606415,
-0.07795388996601105,
0.031118890270590782,
0.04857863485813141,
-0.2205805778503418,
-0.030117562040686607,
-0.026701902970671654,
-0.1143084466457367,
0.17103835940361023,
0.031141381710767746,
0.09909985214471817,
-0.09087353199720383,
-0.005343601107597351,
0.04882163181900978,
0.010228712111711502,
-0.02568952925503254,
0.1092120036482811,
0.07101928442716599,
0.019064193591475487,
-0.16089054942131042,
0.10899622738361359,
-0.0579642727971077,
-0.10477715730667114,
-0.03314938396215439,
-0.10216228663921356,
-0.13338246941566467,
-0.06509534269571304,
-0.14007414877414703,
0.020101463422179222,
-0.05979097634553909,
-0.004608015064150095,
0.039635613560676575,
-0.09810823202133179,
0.07895441353321075,
0.07138384133577347,
0.09149431437253952,
0.12671509385108948,
-0.07154395431280136,
-0.026475269347429276,
-0.0908731147646904,
0.0477762371301651,
0.018035540357232094,
0.028601717203855515,
-0.20517891645431519,
-0.008689504116773605,
0.0008658569422550499,
0.0998258888721466,
-0.11887803673744202,
-0.02639715187251568,
-0.07176905125379562,
0.035773128271102905,
-0.19764447212219238,
0.026253558695316315,
-0.05157098174095154,
-0.02112397365272045,
-0.03161309286952019,
-0.061824411153793335,
-0.06439051032066345,
0.01332505140453577,
-0.04967832192778587,
0.06464166939258575,
0.014276986010372639,
-0.011457708664238453,
-0.02584722265601158,
-0.05131293088197708,
-0.026338152587413788,
-0.010712631978094578,
0.18385988473892212,
0.07970954477787018,
-0.09421750158071518,
0.016672831028699875,
-0.18760569393634796,
-0.15840210020542145,
0.19305990636348724,
0.031243063509464264,
0.062132012099027634,
-0.003568680491298437,
0.08134310692548752,
-0.0014341986970975995,
-0.08648546040058136,
0.02248586155474186,
0.11605896055698395,
-0.04444373771548271,
-0.001603697775863111,
-0.06323643028736115,
0.09124796092510223,
-0.06625024974346161,
0.031946372240781784,
0.11344743520021439,
0.1341562569141388,
-0.012598819099366665,
-0.02097618766129017,
-0.02700512297451496,
-0.03896445035934448,
-0.012169239111244678,
-0.0476670041680336,
-0.15388686954975128,
0.08284958451986313,
0.03586924448609352,
0.043150197714567184,
-0.036483198404312134,
0.30271098017692566,
-0.038182344287633896,
-0.1349017322063446,
-0.02040836587548256,
0.13099436461925507,
-0.042902279645204544,
0.06097203865647316,
0.22157803177833557,
0.04669235646724701,
0.020453903824090958,
0.004801352508366108,
0.11210814863443375,
0.15736956894397736,
0.19771580398082733,
0.07210298627614975,
-0.010328847914934158,
0.02263786457479,
0.05871551111340523,
0.08374630659818649,
-0.04222254827618599,
-0.014477754011750221,
0.12470651417970657,
-0.01783115789294243,
0.13757316768169403,
-0.05984734743833542,
-0.06564030051231384,
0.16039055585861206,
-0.03514346480369568,
-0.019482839852571487,
0.06321397423744202,
-0.06422211974859238,
-0.06712745875120163,
-0.30973750352859497,
-0.06930948793888092,
-0.0837133601307869,
0.04306716099381447,
-0.10335639864206314,
-0.06399383395910263,
0.11595457047224045,
0.10339266061782837,
-0.08517305552959442,
0.11591091752052307,
0.02628939412534237,
-0.06685591489076614,
0.15243573486804962,
0.04525550827383995,
-0.11479561775922775,
-0.007489575073122978,
-0.038005005568265915,
0.030405990779399872,
0.047050900757312775,
-0.0583309531211853,
0.015236783772706985,
0.0786675363779068,
0.007481671404093504,
-0.008275522850453854,
-0.03674212843179703,
-0.08737700432538986,
0.014793039299547672,
-0.004175290931016207,
0.05299382284283638,
0.03157241269946098,
-0.06091536208987236,
-0.0059429737739264965,
0.16405387222766876,
-0.02810860611498356,
-0.03487637639045715,
-0.04959568381309509,
0.08489644527435303,
-0.07971721887588501,
0.08609160780906677,
0.011885033920407295,
0.013084099628031254,
-0.02181520313024521,
0.2950965464115143,
0.30513668060302734,
-0.09446188062429428,
0.032236501574516296,
0.041690900921821594,
0.012082001194357872,
-0.051131874322891235,
0.1099165603518486,
0.02554938569664955,
0.20298711955547333,
-0.018108075484633446,
-0.03224192559719086,
-0.039370328187942505,
0.00033285023528151214,
-0.025297095999121666,
0.0655948594212532,
0.09542243927717209,
-0.035739466547966,
-0.09374639391899109,
0.03916458785533905,
-0.12977980077266693,
-0.018579255789518356,
-0.022999698296189308,
-0.10486336052417755,
-0.08199120312929153,
-0.024616146460175514,
-0.11571544408798218,
0.023285340517759323,
0.054822634905576706,
-0.14114485681056976,
0.03407776728272438,
-0.04425080120563507,
0.05484022572636604,
-0.1239151582121849,
-0.0027589802630245686,
0.05152979865670204,
0.1473265290260315,
0.13015475869178772,
-0.029215626418590546,
0.04786044731736183,
0.05704121291637421,
-0.022939937189221382,
-0.08877935260534286,
-0.006390891503542662,
-0.007068788167089224,
-0.088588185608387,
-0.044952861964702606,
0.09066318720579147,
0.00558328116312623,
-0.002367852721363306,
0.06542950123548508,
-0.12202136963605881,
-0.012057955376803875,
-0.08222788572311401,
-0.022575730457901955,
-0.07591059803962708,
-0.0020740211475640535,
-0.02883051335811615,
0.08950857818126678,
0.02644491195678711,
-0.03795720636844635,
-0.046601518988609314,
-0.07351314276456833,
0.040315963327884674,
0.02005120739340782,
0.06560323387384415,
0.03861295431852341,
-0.10092194378376007,
-0.002821374451741576,
0.026102175936102867,
0.009845511987805367,
0.023627838119864464,
-0.02013516053557396,
-0.04062527418136597,
-0.0142546147108078,
0.007165275979787111,
0.09404977411031723,
0.03604322299361229,
-0.007487105671316385,
-0.014484181068837643,
-0.018073109909892082,
0.0035616036038845778,
0.010272789746522903,
-0.1382446438074112,
-0.09157375991344452
] |
null | null |
transformers
|
# vgg13_bn
Implementation of VGG proposed in [Very Deep Convolutional Networks For
Large-Scale Image Recognition](https://arxiv.org/pdf/1409.1556.pdf)
``` python
VGG.vgg11()
VGG.vgg13()
VGG.vgg16()
VGG.vgg19()
VGG.vgg11_bn()
VGG.vgg13_bn()
VGG.vgg16_bn()
VGG.vgg19_bn()
```
Please be aware that the *bn* models use BatchNorm, but they are very old and
people back then did not know that the bias is superfluous in a conv followed
by a batchnorm.
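Another way to see the redundancy: at inference time a Conv + BatchNorm pair is commonly fused into a single convolution, and any bias the conv carried is simply folded into the shift contributed by BatchNorm. A generic PyTorch sketch of that fusion (`fuse_conv_bn` is a name made up for this example, not glasses code):
``` python
import torch
import torch.nn as nn

@torch.no_grad()
def fuse_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Fold BatchNorm running statistics into the preceding convolution (inference only)."""
    fused = nn.Conv2d(conv.in_channels, conv.out_channels, conv.kernel_size,
                      stride=conv.stride, padding=conv.padding,
                      dilation=conv.dilation, groups=conv.groups, bias=True)
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
    conv_bias = conv.bias if conv.bias is not None else torch.zeros_like(bn.running_mean)
    # Whatever bias the conv had ends up inside this single shift term anyway.
    fused.bias.copy_(bn.bias + scale * (conv_bias - bn.running_mean))
    return fused
```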
Examples:
``` python
import torch.nn as nn  # for nn.SELU; VGG itself is assumed to be imported from the glasses library

# change activation
VGG.vgg11(activation=nn.SELU)
# change number of classes (default is 1000)
VGG.vgg11(n_classes=100)
# pass a different block
from nn.models.classification.senet import SENetBasicBlock
VGG.vgg11(block=SENetBasicBlock)
# store the features tensor after every block
```
|
{}
| null |
glasses/vgg13_bn
|
[
"transformers",
"pytorch",
"arxiv:1409.1556",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1409.1556"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us
|
# vgg13_bn
Implementation of VGG proposed in Very Deep Convolutional Networks For
Large-Scale Image Recognition
Please be aware that the *bn* models use BatchNorm, but they are very old and
people back then did not know that the bias is superfluous in a conv followed
by a batchnorm.
Examples:
|
[
"# vgg13_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n",
"# vgg13_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
29,
86
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n# vgg13_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
-0.06475048512220383,
-0.039802584797143936,
-0.0023066324647516012,
0.03094535320997238,
0.012478829361498356,
0.05105108395218849,
-0.011551514267921448,
0.08346470445394516,
-0.03602958098053932,
0.018178168684244156,
0.18572449684143066,
0.03881339728832245,
0.02816106751561165,
0.06492423266172409,
-0.05377376824617386,
-0.2428695261478424,
0.04239175468683243,
0.13234055042266846,
0.014463607221841812,
0.056565966457128525,
0.06972011923789978,
-0.14198100566864014,
0.07519340515136719,
0.0026604807935655117,
-0.24377559125423431,
-0.04843262583017349,
0.012877154164016247,
-0.041134484112262726,
0.0614258348941803,
0.07672403007745743,
0.033263400197029114,
0.025912635028362274,
0.008431079797446728,
-0.05337705463171005,
0.03369387611746788,
-0.0019151826854795218,
-0.013335954397916794,
0.08296763151884079,
0.047423385083675385,
0.08921542763710022,
0.04692258685827255,
-0.04210308566689491,
-0.07037641853094101,
-0.014712962321937084,
-0.064165398478508,
-0.11946055293083191,
0.013700645416975021,
0.10074427723884583,
-0.04093467444181442,
0.0064729745499789715,
0.006056458223611116,
0.14602357149124146,
-0.07847360521554947,
0.03415048494935036,
0.16794587671756744,
-0.25389963388442993,
-0.018845858052372932,
0.20759578049182892,
-0.03528529405593872,
-0.05399177223443985,
-0.04098207876086235,
0.12529873847961426,
0.07021990418434143,
-0.002478700364008546,
0.02160295657813549,
0.005447625648230314,
-0.05338865518569946,
0.005755584686994553,
-0.1265471875667572,
-0.08880119025707245,
0.09606382250785828,
0.04328623786568642,
0.017946679145097733,
-0.13985079526901245,
-0.1398114562034607,
-0.07881695032119751,
-0.08345218747854233,
-0.034489888697862625,
-0.05003802105784416,
0.01244692225009203,
-0.052255477756261826,
-0.06939546018838882,
-0.053661152720451355,
-0.0549655519425869,
-0.12842361629009247,
-0.02240000292658806,
0.0547042042016983,
0.06956946849822998,
-0.11018934845924377,
0.1581777185201645,
-0.13965754210948944,
-0.06629850715398788,
0.021931996569037437,
-0.173984095454216,
-0.04066750407218933,
0.04866249859333038,
0.0008343648514710367,
0.08922027796506882,
0.04940485581755638,
0.05180906131863594,
0.05437096953392029,
-0.06148970127105713,
-0.04569423198699951,
0.07118936628103256,
0.003273267997428775,
0.05485307052731514,
-0.2190071940422058,
0.055793263018131256,
0.10263792425394058,
-0.11092310398817062,
0.03571978956460953,
-0.0034771044738590717,
-0.06339030712842941,
-0.1954057365655899,
0.10822942852973938,
0.03333356976509094,
-0.04931768774986267,
0.14759637415409088,
-0.006937472149729729,
-0.09581304341554642,
0.0399513840675354,
0.021318722516298294,
-0.037104737013578415,
0.007233615033328533,
-0.05428745225071907,
0.09699346125125885,
0.017767421901226044,
-0.008423461578786373,
-0.047386251389980316,
-0.015415837056934834,
-0.08443933725357056,
-0.09552183002233505,
-0.022274363785982132,
-0.12646935880184174,
0.008967184461653233,
-0.1158219575881958,
0.04780179634690285,
-0.18280519545078278,
-0.10425304621458054,
0.10694056749343872,
-0.03136955946683884,
-0.08386562764644623,
0.024963470175862312,
-0.015178252011537552,
-0.08098044991493225,
-0.034849002957344055,
-0.0015447018668055534,
0.1971726268529892,
-0.0607844777405262,
0.024180825799703598,
-0.05505988746881485,
0.10640968382358551,
-0.0607326366007328,
0.014974645338952541,
-0.054009001702070236,
0.017954090610146523,
-0.043633002787828445,
0.04753889888525009,
-0.05750799551606178,
0.02108708582818508,
-0.05885172262787819,
-0.11553153395652771,
-0.06513134390115738,
-0.018055303022265434,
-0.01925806514918804,
0.1090485006570816,
-0.11923924088478088,
-0.04118441790342331,
0.13317914307117462,
-0.10852982103824615,
-0.10442106425762177,
0.16092471778392792,
-0.053541094064712524,
-0.08092031627893448,
0.05639396235346794,
0.058014482259750366,
0.0376502200961113,
-0.08652037382125854,
-0.0005757255130447447,
0.022064846009016037,
-0.1359739452600479,
0.02585669606924057,
-0.029898231849074364,
0.17566344141960144,
-0.028984731063246727,
0.007686179131269455,
-0.019526546820998192,
0.08726036548614502,
-0.12391339987516403,
-0.0723874568939209,
-0.027051914483308792,
-0.04595262557268143,
0.04901084676384926,
0.023289833217859268,
0.0720345750451088,
0.022377653047442436,
-0.039887357503175735,
-0.18976885080337524,
0.07479696720838547,
-0.07552235573530197,
0.034430552273988724,
-0.10218755155801773,
0.15341909229755402,
-0.17078803479671478,
0.03445987403392792,
-0.09622740000486374,
-0.017277156934142113,
0.08180267363786697,
0.08984339237213135,
-0.03905322775244713,
0.2249123603105545,
0.021725108847022057,
-0.024074029177427292,
-0.027717718854546547,
0.010381147265434265,
0.12045399844646454,
-0.012276503257453442,
-0.026868458837270737,
-0.08493546396493912,
0.0032392553985118866,
-0.05347743630409241,
0.05237962678074837,
-0.1499367505311966,
-0.06331799924373627,
0.16111165285110474,
0.09547742456197739,
-0.00567195750772953,
0.05136311799287796,
0.0774049162864685,
-0.047272928059101105,
-0.026031922549009323,
-0.05603455379605293,
0.07760892063379288,
-0.039627742022275925,
-0.08298342674970627,
0.09651757776737213,
-0.01867612823843956,
0.1102863997220993,
0.15389783680438995,
-0.19519709050655365,
-0.008077569305896759,
-0.083369679749012,
-0.041192714124917984,
-0.005098224617540836,
0.012687845155596733,
0.014441810548305511,
0.08871937543153763,
-0.05726160481572151,
0.07707379758358002,
-0.0800648182630539,
-0.007362411357462406,
0.0938967689871788,
0.019406598061323166,
-0.05417582392692566,
0.011972960084676743,
-0.008982229046523571,
-0.16754063963890076,
0.04821678623557091,
-0.02744865231215954,
-0.026820441707968712,
0.11449062079191208,
0.0006824558950029314,
-0.05343828722834587,
0.03265160322189331,
-0.0026081257965415716,
-0.00078637182014063,
0.10714279115200043,
-0.25177428126335144,
-0.03267432376742363,
0.101727195084095,
-0.05040217190980911,
0.04036305844783783,
-0.07911202311515808,
-0.019153153523802757,
-0.007421532180160284,
-0.05035356059670448,
0.09237278252840042,
0.10027771443128586,
0.021350426599383354,
0.1021074652671814,
-0.016417643055319786,
-0.055345624685287476,
0.03719359636306763,
-0.06382638216018677,
-0.03489332273602486,
0.12532390654087067,
-0.018543099984526634,
-0.23297083377838135,
-0.011442857794463634,
-0.12642258405685425,
-0.11025374382734299,
0.02740572765469551,
-0.046099524945020676,
-0.05330483242869377,
-0.05593409016728401,
-0.002817635191604495,
0.022643115371465683,
0.059819210320711136,
0.13287340104579926,
-0.013325747102499008,
-0.017002178356051445,
0.004835366737097502,
-0.07261636108160019,
-0.026257408782839775,
-0.15825416147708893,
-0.06885415315628052,
0.11487643420696259,
-0.06938980519771576,
0.1099274531006813,
0.1358739286661148,
-0.03259348124265671,
0.05443377420306206,
-0.019574685022234917,
0.231483593583107,
-0.039449531584978104,
-0.009921938180923462,
0.11385989934206009,
0.00016180933744180948,
-0.0471329391002655,
0.0562901645898819,
-0.0014909275341778994,
-0.12297207862138748,
-0.017645198851823807,
-0.023203684017062187,
-0.0831347107887268,
-0.06613349914550781,
-0.14442196488380432,
-0.06395819038152695,
-0.07024212181568146,
0.10042357444763184,
0.05543515086174011,
0.055150315165519714,
0.1200893297791481,
-0.06973063945770264,
0.07467308640480042,
0.016122758388519287,
0.08019014447927475,
0.20083829760551453,
-0.02589774876832962,
0.07258359342813492,
-0.0015421409625560045,
-0.0751977413892746,
0.0793793797492981,
0.003497786819934845,
0.18988215923309326,
-0.09229069948196411,
-0.012054996564984322,
0.11143196374177933,
0.11478782445192337,
0.1548030525445938,
0.05906480550765991,
-0.10139940679073334,
-0.05559461563825607,
-0.06785988807678223,
-0.07388492673635483,
0.028594352304935455,
0.044563375413417816,
0.03825487941503525,
-0.014894229359924793,
-0.037427354604005814,
-0.049724217504262924,
0.062391191720962524,
0.055130306631326675,
0.14469702541828156,
-0.3239385783672333,
-0.017926916480064392,
0.03591997176408768,
0.06787274032831192,
-0.08924386650323868,
0.07298269122838974,
0.13447368144989014,
-0.01830025017261505,
0.07739817351102829,
-0.09753827005624771,
0.07913379371166229,
-0.17876651883125305,
0.08317428082227707,
-0.06799876689910889,
0.01808863878250122,
0.01718132197856903,
0.07088510692119598,
-0.23890312016010284,
0.16982752084732056,
0.00006304092676145956,
-0.023964913561940193,
-0.10810483247041702,
-0.047651201486587524,
-0.003320915624499321,
0.07282129675149918,
0.18195189535617828,
-0.010240517556667328,
0.15261685848236084,
0.04822088032960892,
-0.032086197286844254,
0.04975206404924393,
0.0042793601751327515,
0.08113754540681839,
-0.02124205231666565,
0.01674601621925831,
-0.03125739470124245,
0.04155353456735611,
0.11563051491975784,
-0.12163588404655457,
-0.05411755293607712,
0.010496542789041996,
0.09243575483560562,
-0.014256144873797894,
-0.010664080269634724,
-0.01758544333279133,
-0.08643246442079544,
0.15994833409786224,
-0.06433220207691193,
-0.043542295694351196,
-0.05770537257194519,
0.21489973366260529,
0.06749220937490463,
-0.02382080815732479,
0.10766472667455673,
-0.042362332344055176,
0.13806366920471191,
-0.026400607079267502,
-0.14970865845680237,
0.09112618863582611,
-0.07481037080287933,
-0.1403290331363678,
0.016523847356438637,
0.05155779421329498,
0.01934918761253357,
-0.009612265974283218,
0.057284243404865265,
-0.009938957169651985,
-0.05045228451490402,
-0.08445131033658981,
0.05773889645934105,
0.21170972287654877,
0.04092243313789368,
0.04301050305366516,
0.01300701405853033,
-0.06665434688329697,
-0.04010344669222832,
0.09546877443790436,
0.2102181315422058,
0.19333842396736145,
-0.11310125887393951,
0.017283212393522263,
0.11744015663862228,
-0.009416124783456326,
-0.24691596627235413,
0.01740669086575508,
-0.055152054876089096,
0.03406234830617905,
-0.16437655687332153,
-0.007083948235958815,
0.04579583555459976,
0.032233864068984985,
0.00978141650557518,
0.054874468594789505,
-0.23670092225074768,
-0.09018303453922272,
0.1963234841823578,
0.040778521448373795,
0.29196178913116455,
-0.0036146859638392925,
-0.001699870335869491,
-0.03808607906103134,
-0.0973598062992096,
0.13681769371032715,
0.14399848878383636,
0.11284623295068741,
-0.06965707987546921,
0.007089295890182257,
0.03374244645237923,
-0.06711127609014511,
0.08579816669225693,
0.054706595838069916,
0.1346544325351715,
-0.05874773859977722,
-0.12041670083999634,
0.09163043648004532,
0.023149624466896057,
0.022143134847283363,
0.20025469362735748,
0.0579935759305954,
0.0006566969095729291,
-0.043448060750961304,
-0.11980192363262177,
0.07955311983823776,
0.056009624153375626,
-0.11500703543424606,
-0.16059811413288116,
0.0304307471960783,
0.052092038094997406,
0.03181404247879982,
0.1134520173072815,
-0.016995251178741455,
-0.005055888555943966,
0.06669678539037704,
0.04457941651344299,
-0.1248546615242958,
-0.0032590848859399557,
-0.047630202025175095,
0.01196033600717783,
0.15836374461650848,
-0.07346588373184204,
0.045150674879550934,
0.11872734129428864,
0.04278937354683876,
0.026453088968992233,
0.09312835335731506,
-0.07842613756656647,
0.028845522552728653,
0.04951168969273567,
-0.22233812510967255,
-0.029946651309728622,
-0.028077619150280952,
-0.1166563332080841,
0.16848507523536682,
0.03136889636516571,
0.09983839094638824,
-0.09005235135555267,
-0.004142816178500652,
0.050811685621738434,
0.011767726391553879,
-0.025371907278895378,
0.10741780698299408,
0.06980251520872116,
0.018766600638628006,
-0.16073381900787354,
0.10594356060028076,
-0.05760950967669487,
-0.10559582710266113,
-0.034630585461854935,
-0.10075990855693817,
-0.13509343564510345,
-0.06520146876573563,
-0.14171810448169708,
0.011399054899811745,
-0.06737492978572845,
-0.00321971345692873,
0.039553090929985046,
-0.09733401983976364,
0.07899165898561478,
0.07001348584890366,
0.09290000051259995,
0.1272715926170349,
-0.06970275938510895,
-0.026548605412244797,
-0.09086649119853973,
0.04493962973356247,
0.017029862850904465,
0.02944036014378071,
-0.20737622678279877,
-0.011185486800968647,
0.00211714138276875,
0.10126332193613052,
-0.11930497735738754,
-0.026920394971966743,
-0.07331257313489914,
0.03652884438633919,
-0.20015601813793182,
0.023624787107110023,
-0.054961420595645905,
-0.02118428237736225,
-0.03295646607875824,
-0.06071485951542854,
-0.06553923338651657,
0.012092093005776405,
-0.049493208527565,
0.06350904703140259,
0.013155033811926842,
-0.012251567095518112,
-0.023808814585208893,
-0.048885345458984375,
-0.02489037998020649,
-0.010744614526629448,
0.1844734400510788,
0.07849881052970886,
-0.09340155869722366,
0.01713641546666622,
-0.19015470147132874,
-0.1569899320602417,
0.19283361732959747,
0.030026912689208984,
0.062126532196998596,
-0.002393395872786641,
0.08066272735595703,
0.0008180539589375257,
-0.08810564875602722,
0.024690860882401466,
0.11140844970941544,
-0.045901354402303696,
0.0012841923162341118,
-0.062441833317279816,
0.0909251794219017,
-0.06604351848363876,
0.030631180852651596,
0.11438459157943726,
0.1357528418302536,
-0.015605021268129349,
-0.02256500907242298,
-0.026210669428110123,
-0.03833802044391632,
-0.012684961780905724,
-0.04833478853106499,
-0.15220648050308228,
0.07971490919589996,
0.03685057535767555,
0.043472010642290115,
-0.03580126166343689,
0.3057071566581726,
-0.036740925163030624,
-0.13199257850646973,
-0.01926862634718418,
0.12760299444198608,
-0.03432747721672058,
0.06000862643122673,
0.22075732052326202,
0.04851530119776726,
0.01939268782734871,
0.010570884682238102,
0.11470328271389008,
0.15585801005363464,
0.1956358253955841,
0.07186723500490189,
-0.008989031426608562,
0.01790582202374935,
0.05746036022901535,
0.08406860381364822,
-0.0389196053147316,
-0.011781609617173672,
0.12388893216848373,
-0.020590154454112053,
0.13799338042736053,
-0.06041094660758972,
-0.07031776756048203,
0.16381922364234924,
-0.03758139908313751,
-0.01850602775812149,
0.06260758638381958,
-0.06540646404027939,
-0.06939484924077988,
-0.30991700291633606,
-0.06942940503358841,
-0.08556016534566879,
0.04253857582807541,
-0.10355675965547562,
-0.06699316948652267,
0.11773009598255157,
0.1039159744977951,
-0.08486747741699219,
0.11418946832418442,
0.032590098679065704,
-0.06428353488445282,
0.1501205861568451,
0.04471953213214874,
-0.11297257989645004,
-0.0067775375209748745,
-0.03934360668063164,
0.030891984701156616,
0.05007172003388405,
-0.058513302356004715,
0.014436803758144379,
0.0812935158610344,
0.008143989369273186,
-0.008221752010285854,
-0.03776220977306366,
-0.08693533390760422,
0.01530708558857441,
-0.005636405665427446,
0.05269663780927658,
0.03156154230237007,
-0.06161423400044441,
-0.0058387224562466145,
0.1682453751564026,
-0.028051037341356277,
-0.03144820034503937,
-0.05145617201924324,
0.09156107157468796,
-0.08008507639169693,
0.08653871715068817,
0.011399207636713982,
0.012865548953413963,
-0.0247101578861475,
0.299393892288208,
0.30776816606521606,
-0.09584233909845352,
0.03293117508292198,
0.041806433349847794,
0.012882137671113014,
-0.048444148153066635,
0.11198551952838898,
0.02457832731306553,
0.20403873920440674,
-0.0166087094694376,
-0.03208063542842865,
-0.03902795910835266,
0.00025260119582526386,
-0.02457991987466812,
0.06474759429693222,
0.09747586399316788,
-0.0342278853058815,
-0.09449241310358047,
0.04200596362352371,
-0.13149940967559814,
-0.016633229330182076,
-0.027515729889273643,
-0.1029684767127037,
-0.08033709228038788,
-0.023183338344097137,
-0.12040051817893982,
0.022941986098885536,
0.05459833890199661,
-0.14063608646392822,
0.03397612273693085,
-0.046845391392707825,
0.05564587190747261,
-0.12406352162361145,
-0.004370670299977064,
0.05517961457371712,
0.1574656218290329,
0.13168317079544067,
-0.02836151048541069,
0.04697651043534279,
0.05658789351582527,
-0.022861074656248093,
-0.08917700499296188,
-0.006649676710367203,
-0.005760332103818655,
-0.0891399085521698,
-0.044846802949905396,
0.09207858890295029,
0.005831468850374222,
-0.0025107599794864655,
0.06598135828971863,
-0.12138950824737549,
-0.013166735880076885,
-0.08830345422029495,
-0.021726150065660477,
-0.07540002465248108,
-0.0025363105814903975,
-0.02796756476163864,
0.08806527405977249,
0.028292415663599968,
-0.03764640912413597,
-0.04544804245233536,
-0.0730363205075264,
0.04107208922505379,
0.01796872727572918,
0.060826994478702545,
0.038493454456329346,
-0.10257402062416077,
-0.001982158748432994,
0.02026728168129921,
0.008460050448775291,
0.021410267800092697,
-0.02188093587756157,
-0.03730887174606323,
-0.011599048040807247,
0.00789980310946703,
0.09593398869037628,
0.03344680368900299,
-0.009511841461062431,
-0.01438810769468546,
-0.0069109550677239895,
0.0023234363179653883,
0.010762888006865978,
-0.1381533294916153,
-0.0915379673242569
] |
null | null |
transformers
|
# vgg19_bn
Implementation of VGG proposed in [Very Deep Convolutional Networks For
Large-Scale Image Recognition](https://arxiv.org/pdf/1409.1556.pdf)
``` python
VGG.vgg11()
VGG.vgg13()
VGG.vgg16()
VGG.vgg19()
VGG.vgg11_bn()
VGG.vgg13_bn()
VGG.vgg16_bn()
VGG.vgg19_bn()
```
Please be aware that the *bn* models use BatchNorm, but they are very old and
people back then did not know that the bias is superfluous in a conv followed
by a batchnorm.
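The redundancy also shows up in the gradients: in training mode BatchNorm is invariant to a constant per-channel shift, so a conv bias placed in front of it receives essentially zero gradient and never learns anything. A small sketch in plain PyTorch, unrelated to the glasses internals:
``` python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, kernel_size=3, padding=1, bias=True)
bn = nn.BatchNorm2d(8)  # modules default to training mode
x = torch.randn(4, 3, 16, 16)

loss = bn(conv(x)).sum()
loss.backward()

# The bias is cancelled by BatchNorm's mean subtraction, so its gradient is ~0.
print(conv.bias.grad.abs().max())  # tensor close to 0
```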
Examples:
``` python
import torch.nn as nn  # for nn.SELU; VGG itself is assumed to be imported from the glasses library

# change activation
VGG.vgg11(activation=nn.SELU)
# change number of classes (default is 1000)
VGG.vgg11(n_classes=100)
# pass a different block
from nn.models.classification.senet import SENetBasicBlock
VGG.vgg11(block=SENetBasicBlock)
# store the features tensor after every block
```
|
{}
| null |
glasses/vgg19_bn
|
[
"transformers",
"pytorch",
"arxiv:1409.1556",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1409.1556"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us
|
# vgg19_bn
Implementation of VGG proposed in Very Deep Convolutional Networks For
Large-Scale Image Recognition
Please be aware that the *bn* models use BatchNorm, but they are very old and
people back then did not know that the bias is superfluous in a conv followed
by a batchnorm.
Examples:
|
[
"# vgg19_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n",
"# vgg19_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
29,
86
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1409.1556 #endpoints_compatible #region-us \n# vgg19_bn\nImplementation of VGG proposed in Very Deep Convolutional Networks For\nLarge-Scale Image Recognition\n\n \n\n Please be aware that the [bn]{.title-ref} models uses BatchNorm but\n they are very old and people back then don\\'t know the bias is\n superfluous in a conv followed by a batchnorm.\n\n Examples:"
] |
[
-0.06699889898300171,
-0.04339371994137764,
-0.002331333002075553,
0.02883179858326912,
0.009498326107859612,
0.05005422607064247,
-0.011239401064813137,
0.08377246558666229,
-0.03345214203000069,
0.02169434353709221,
0.18783965706825256,
0.038818709552288055,
0.028740694746375084,
0.06772572547197342,
-0.05537557229399681,
-0.2393626868724823,
0.040014464408159256,
0.13525733351707458,
0.010775054804980755,
0.05595269054174423,
0.06999342888593674,
-0.14465777575969696,
0.07583281397819519,
0.00268584955483675,
-0.24797554314136505,
-0.04862349480390549,
0.0124293752014637,
-0.04247654974460602,
0.062111154198646545,
0.07781637459993362,
0.03393508866429329,
0.027179639786481857,
0.008380480110645294,
-0.048961885273456573,
0.03415711224079132,
0.00040324320434592664,
-0.013198529370129108,
0.08374616503715515,
0.04720187932252884,
0.08585882931947708,
0.04938983544707298,
-0.042452868074178696,
-0.07133854180574417,
-0.014610923826694489,
-0.06426183134317398,
-0.12634146213531494,
0.014101164415478706,
0.09890340268611908,
-0.044065073132514954,
0.0056449840776622295,
0.004223728086799383,
0.14589835703372955,
-0.07862305641174316,
0.03411763906478882,
0.16239236295223236,
-0.2577348053455353,
-0.018282366916537285,
0.1985229253768921,
-0.037531908601522446,
-0.05373399332165718,
-0.04188286140561104,
0.1266951709985733,
0.0711197480559349,
-0.004315374884754419,
0.023188576102256775,
0.004519959911704063,
-0.053294338285923004,
0.0036858683452010155,
-0.12430400401353836,
-0.08685902506113052,
0.10166221857070923,
0.042851753532886505,
0.01634763926267624,
-0.14052045345306396,
-0.13756753504276276,
-0.07179659605026245,
-0.0835178941488266,
-0.03470265492796898,
-0.05067398026585579,
0.013268149457871914,
-0.051263585686683655,
-0.06894227862358093,
-0.05400807410478592,
-0.05434427410364151,
-0.1273207664489746,
-0.020723002031445503,
0.05412709340453148,
0.07044483721256256,
-0.11220891773700714,
0.15580885112285614,
-0.13681551814079285,
-0.07085337489843369,
0.024856548756361008,
-0.17326894402503967,
-0.038983359932899475,
0.05063098669052124,
0.0011956606758758426,
0.08422429114580154,
0.04883860424160957,
0.049074046313762665,
0.06095559522509575,
-0.06155189126729965,
-0.05319571867585182,
0.07195454835891724,
0.0037578188348561525,
0.05689352750778198,
-0.2142358124256134,
0.05167426913976669,
0.1048666313290596,
-0.1100701093673706,
0.03586150333285332,
-0.003076142631471157,
-0.0644291564822197,
-0.19338074326515198,
0.10587257146835327,
0.03369913995265961,
-0.04964791610836983,
0.1453339010477066,
-0.006975908298045397,
-0.0942690446972847,
0.035109687596559525,
0.022177794948220253,
-0.03594309836626053,
0.00845869816839695,
-0.05510158836841583,
0.09584543853998184,
0.016897527500987053,
-0.010745309293270111,
-0.04758273810148239,
-0.013848969712853432,
-0.08391273021697998,
-0.09632477909326553,
-0.02205994911491871,
-0.12686331570148468,
0.007657536305487156,
-0.12048665434122086,
0.04806997627019882,
-0.1834229677915573,
-0.10098669677972794,
0.10735641419887543,
-0.033793769776821136,
-0.0856454148888588,
0.027026323601603508,
-0.015225360170006752,
-0.07934334129095078,
-0.03442433476448059,
-0.001848467974923551,
0.1998852640390396,
-0.06083709001541138,
0.026240931823849678,
-0.05527614802122116,
0.10846901684999466,
-0.059695880860090256,
0.017219573259353638,
-0.054215945303440094,
0.01647392101585865,
-0.0474759042263031,
0.04773644357919693,
-0.05654025822877884,
0.018456827849149704,
-0.05813664570450783,
-0.11381173878908157,
-0.06565435230731964,
-0.01717340387403965,
-0.018820639699697495,
0.10897601395845413,
-0.12548381090164185,
-0.04317063093185425,
0.13535837829113007,
-0.1094643697142601,
-0.10338234156370163,
0.16240929067134857,
-0.054890137165784836,
-0.07870016992092133,
0.05706587806344032,
0.06188344210386276,
0.030264168977737427,
-0.08359245210886002,
-0.002443675184622407,
0.02019861899316311,
-0.13677901029586792,
0.02817465364933014,
-0.031672339886426926,
0.1755400002002716,
-0.0290481299161911,
0.0065818047150969505,
-0.017042163759469986,
0.08684440702199936,
-0.12339116632938385,
-0.07108049094676971,
-0.026815000921487808,
-0.04622099921107292,
0.04836409538984299,
0.024325698614120483,
0.07225601375102997,
0.01887565106153488,
-0.04048573598265648,
-0.19805072247982025,
0.07566485553979874,
-0.07484129816293716,
0.03285679221153259,
-0.10425224900245667,
0.1542615443468094,
-0.1698426455259323,
0.03619149327278137,
-0.09743605554103851,
-0.013906128704547882,
0.0834941640496254,
0.09393078833818436,
-0.03418228402733803,
0.22590063512325287,
0.019826559349894524,
-0.023801706731319427,
-0.025616610422730446,
0.008759001269936562,
0.12054243683815002,
-0.013122924603521824,
-0.02738143503665924,
-0.09004544466733932,
0.005201481748372316,
-0.05316959694027901,
0.05403149873018265,
-0.15255232155323029,
-0.06517784297466278,
0.16245542466640472,
0.09846536815166473,
-0.004274860955774784,
0.05351470038294792,
0.0756785124540329,
-0.04607997462153435,
-0.02544691413640976,
-0.05565975606441498,
0.0766897201538086,
-0.03994346782565117,
-0.08368197828531265,
0.09772416949272156,
-0.017453787848353386,
0.11017131805419922,
0.15632085502147675,
-0.2000964730978012,
-0.008540323935449123,
-0.07630370557308197,
-0.03881552815437317,
-0.004764965269714594,
0.013316549360752106,
0.01687542162835598,
0.08949802815914154,
-0.05672670528292656,
0.08023755997419357,
-0.08136489242315292,
-0.007321447599679232,
0.0946391150355339,
0.01944025233387947,
-0.05344122275710106,
0.014163804240524769,
-0.011640289798378944,
-0.17343415319919586,
0.04809348285198212,
-0.0226272139698267,
-0.023524396121501923,
0.11608699709177017,
0.0004920618957839906,
-0.053927965462207794,
0.03006746433675289,
-0.0042349910363554955,
-0.000484616553876549,
0.11218084394931793,
-0.2511864900588989,
-0.03566429018974304,
0.10253313928842545,
-0.05435095727443695,
0.03912518918514252,
-0.07941586524248123,
-0.017246639356017113,
-0.008251954801380634,
-0.049366939812898636,
0.0956050455570221,
0.1005646288394928,
0.022621866315603256,
0.10393926501274109,
-0.017016058787703514,
-0.056024618446826935,
0.03615517169237137,
-0.06346150487661362,
-0.03327153995633125,
0.12354596704244614,
-0.021463369950652122,
-0.2358955442905426,
-0.00934568326920271,
-0.1205746978521347,
-0.1105889156460762,
0.027983229607343674,
-0.04736064374446869,
-0.055889859795570374,
-0.05743870139122009,
-0.0012332607293501496,
0.022647874429821968,
0.058839261531829834,
0.13263548910617828,
-0.015210812911391258,
-0.015157248824834824,
0.003019419964402914,
-0.07276348769664764,
-0.027404919266700745,
-0.15937796235084534,
-0.06450977176427841,
0.11610699445009232,
-0.06501074880361557,
0.11025814712047577,
0.13357026875019073,
-0.03288177400827408,
0.05444813892245293,
-0.018584102392196655,
0.22686225175857544,
-0.04228775203227997,
-0.009407165460288525,
0.11554289609193802,
-0.0018149202223867178,
-0.048322394490242004,
0.053524747490882874,
-0.001221095328219235,
-0.12483934313058853,
-0.01896178163588047,
-0.02324647642672062,
-0.08169377595186234,
-0.06549602746963501,
-0.14594268798828125,
-0.06365979462862015,
-0.0729229524731636,
0.10219350457191467,
0.05348160117864609,
0.0562717504799366,
0.11970406025648117,
-0.06991112977266312,
0.07473466545343399,
0.011529705487191677,
0.08015531301498413,
0.20279328525066376,
-0.02883428893983364,
0.07481811940670013,
-0.0032351533882319927,
-0.07760927826166153,
0.07956361025571823,
-0.001883858349174261,
0.1917598694562912,
-0.09083133190870285,
-0.007745965383946896,
0.11094219237565994,
0.11794829368591309,
0.15580934286117554,
0.05883035436272621,
-0.10203812271356583,
-0.055812738835811615,
-0.06907524913549423,
-0.07433298230171204,
0.02512647584080696,
0.04597098380327225,
0.041131313890218735,
-0.01468720193952322,
-0.037570349872112274,
-0.0510537214577198,
0.06235423684120178,
0.05560838431119919,
0.14267848432064056,
-0.3217885494232178,
-0.017994070425629616,
0.0368124321103096,
0.06826672703027725,
-0.09137426316738129,
0.0716416984796524,
0.13720792531967163,
-0.017566757276654243,
0.0765606239438057,
-0.09581983834505081,
0.07945529371500015,
-0.17752477526664734,
0.08351342380046844,
-0.07497572153806686,
0.018315095454454422,
0.01645839400589466,
0.07060399651527405,
-0.2407732903957367,
0.16988307237625122,
0.0020132565405219793,
-0.023365052416920662,
-0.11160953342914581,
-0.04784107208251953,
-0.0034705528523772955,
0.07215911149978638,
0.1808333843946457,
-0.011703843250870705,
0.1460440456867218,
0.04581775143742561,
-0.029736988246440887,
0.04990703612565994,
0.0062843384221196175,
0.08109886199235916,
-0.020383698865771294,
0.018291594460606575,
-0.031036850064992905,
0.04126591607928276,
0.11359111219644547,
-0.12070990353822708,
-0.053750552237033844,
0.009853477589786053,
0.09258497506380081,
-0.012414987199008465,
-0.01147847156971693,
-0.016456399112939835,
-0.08072418719530106,
0.156271830201149,
-0.06525497138500214,
-0.04215727746486664,
-0.0565846785902977,
0.21563443541526794,
0.06687074899673462,
-0.02336782217025757,
0.10477200895547867,
-0.04104376956820488,
0.13960005342960358,
-0.027685005217790604,
-0.1488802582025528,
0.08997626602649689,
-0.07548747956752777,
-0.14333945512771606,
0.01497205812484026,
0.054268963634967804,
0.020896723493933678,
-0.010228361003100872,
0.05874219536781311,
-0.00908742006868124,
-0.05247275531291962,
-0.08300582319498062,
0.06033496558666229,
0.20488984882831573,
0.044721491634845734,
0.04357883334159851,
0.014128774404525757,
-0.06613686680793762,
-0.040896303951740265,
0.09431411325931549,
0.2107713669538498,
0.1922711431980133,
-0.11357996612787247,
0.018483193591237068,
0.11937572062015533,
-0.009065764024853706,
-0.2456858605146408,
0.018882660195231438,
-0.056974396109580994,
0.035147774964571,
-0.16853156685829163,
-0.010983889922499657,
0.048142071813344955,
0.029320932924747467,
0.011455195024609566,
0.05934059992432594,
-0.23285892605781555,
-0.09058753401041031,
0.20143309235572815,
0.037700552493333817,
0.2937033772468567,
-0.004248039331287146,
-0.0011161498259752989,
-0.03955213725566864,
-0.08870367705821991,
0.14034400880336761,
0.14159375429153442,
0.11171315610408783,
-0.06987452507019043,
0.0067124259658157825,
0.034925494343042374,
-0.06427111476659775,
0.08457010239362717,
0.05327490344643593,
0.13640333712100983,
-0.057722460478544235,
-0.11986258625984192,
0.09359937906265259,
0.025747979059815407,
0.02129809558391571,
0.19602513313293457,
0.0555124506354332,
-0.0021241307258605957,
-0.04212693125009537,
-0.11865958571434021,
0.08144663274288177,
0.05612345412373543,
-0.11429423093795776,
-0.16027258336544037,
0.0294945165514946,
0.04872749373316765,
0.031909018754959106,
0.11159060895442963,
-0.01662278361618519,
-0.005794001277536154,
0.06664512306451797,
0.044739361852407455,
-0.127844899892807,
-0.004002873785793781,
-0.045485496520996094,
0.012437302619218826,
0.16123716533184052,
-0.07288282364606857,
0.04392338544130325,
0.11748884618282318,
0.043565213680267334,
0.028609199449419975,
0.09180258214473724,
-0.07972274720668793,
0.02872185967862606,
0.04833744093775749,
-0.22320255637168884,
-0.029260996729135513,
-0.02689862810075283,
-0.11658291518688202,
0.17349058389663696,
0.027373529970645905,
0.10013487935066223,
-0.08929227292537689,
-0.004611840937286615,
0.0483860969543457,
0.010789069347083569,
-0.025602495297789574,
0.10656614601612091,
0.07027135789394379,
0.019322926178574562,
-0.1617390513420105,
0.10640376061201096,
-0.05451609194278717,
-0.10619834065437317,
-0.03127641975879669,
-0.10042960941791534,
-0.13394072651863098,
-0.0658375471830368,
-0.1425989419221878,
0.022292466834187508,
-0.06238067150115967,
-0.005339078139513731,
0.039155952632427216,
-0.10032875090837479,
0.07873362302780151,
0.07242023199796677,
0.09181969612836838,
0.12613517045974731,
-0.07248549163341522,
-0.026825526729226112,
-0.092234306037426,
0.046968862414360046,
0.018407262861728668,
0.028714893385767937,
-0.20358522236347198,
-0.006397643592208624,
0.0036798231303691864,
0.10222648084163666,
-0.11941015720367432,
-0.026645498350262642,
-0.07275596261024475,
0.036484550684690475,
-0.20224173367023468,
0.025244997814297676,
-0.05190708115696907,
-0.02257426083087921,
-0.03314865380525589,
-0.06107199937105179,
-0.0636773556470871,
0.010806470178067684,
-0.048975422978401184,
0.06489203870296478,
0.016579551622271538,
-0.009767410345375538,
-0.026179885491728783,
-0.050626471638679504,
-0.025062639266252518,
-0.010697019286453724,
0.18375231325626373,
0.08111532032489777,
-0.09152340888977051,
0.016728606075048447,
-0.18170981109142303,
-0.15772464871406555,
0.1911320984363556,
0.03245088458061218,
0.062349580228328705,
-0.003052517306059599,
0.08160777390003204,
-0.0013417727313935757,
-0.09015023708343506,
0.024648791179060936,
0.11615381389856339,
-0.04588936269283295,
0.002661414910107851,
-0.059757158160209656,
0.0906260758638382,
-0.06608881801366806,
0.03353109583258629,
0.11314160376787186,
0.13571099936962128,
-0.015492650680243969,
-0.022544268518686295,
-0.026852121576666832,
-0.039437659084796906,
-0.013009889051318169,
-0.049569807946681976,
-0.1541811227798462,
0.08182220906019211,
0.03462080657482147,
0.043840792030096054,
-0.0379314199090004,
0.30232325196266174,
-0.03500954434275627,
-0.13257579505443573,
-0.019135765731334686,
0.12786483764648438,
-0.0407489538192749,
0.06086345389485359,
0.21880659461021423,
0.04748757556080818,
0.02018093504011631,
0.006800795905292034,
0.1136607676744461,
0.15711776912212372,
0.19674599170684814,
0.07165101915597916,
-0.006820590700954199,
0.023871656507253647,
0.05561400204896927,
0.08576468378305435,
-0.04278567433357239,
-0.014084018766880035,
0.12841704487800598,
-0.019618572667241096,
0.13836313784122467,
-0.0589866004884243,
-0.06577873229980469,
0.16011159121990204,
-0.036372534930706024,
-0.017041366547346115,
0.06282718479633331,
-0.06465468555688858,
-0.06620752066373825,
-0.3075419068336487,
-0.06944993883371353,
-0.08305606245994568,
0.04186125844717026,
-0.10405083000659943,
-0.06463052332401276,
0.11093798279762268,
0.10305359214544296,
-0.08804876357316971,
0.11379142850637436,
0.0298820361495018,
-0.06520301848649979,
0.15341505408287048,
0.04651226848363876,
-0.11597341299057007,
-0.00764275761321187,
-0.03735004737973213,
0.029236136004328728,
0.0505845807492733,
-0.059472404420375824,
0.015708092600107193,
0.08070237189531326,
0.0073173027485609055,
-0.007657606154680252,
-0.037775300443172455,
-0.08558754622936249,
0.014037609100341797,
-0.0015255087055265903,
0.052916739135980606,
0.03263265639543533,
-0.06156408414244652,
-0.006807822268456221,
0.1663178652524948,
-0.02591836266219616,
-0.030070019885897636,
-0.04810183867812157,
0.08867104351520538,
-0.08245667815208435,
0.08504866808652878,
0.011488172225654125,
0.014948549680411816,
-0.02444199100136757,
0.2983493506908417,
0.3084549605846405,
-0.09701506048440933,
0.032972194254398346,
0.03985755890607834,
0.012280411086976528,
-0.05054135620594025,
0.11166886985301971,
0.02589818835258484,
0.205607607960701,
-0.017619607970118523,
-0.02932395413517952,
-0.041711073368787766,
-0.0009294543415307999,
-0.02667420171201229,
0.06781909614801407,
0.09808412939310074,
-0.03532833978533745,
-0.09332113713026047,
0.03984733670949936,
-0.1290520280599594,
-0.021894700825214386,
-0.02288922294974327,
-0.10746206343173981,
-0.08180881291627884,
-0.02399003691971302,
-0.12183584272861481,
0.021277012303471565,
0.05373145639896393,
-0.14146898686885834,
0.03276275470852852,
-0.04538761451840401,
0.05435332655906677,
-0.12109731137752533,
-0.004410903435200453,
0.055097416043281555,
0.14618970453739166,
0.12839697301387787,
-0.030521037057042122,
0.04703034460544586,
0.05659982189536095,
-0.020933877676725388,
-0.08801217377185822,
-0.007141385693103075,
-0.007127393037080765,
-0.0870153158903122,
-0.04429374262690544,
0.09395462274551392,
0.006897284649312496,
-0.001300387317314744,
0.06893806904554367,
-0.12455474585294724,
-0.013568058609962463,
-0.09182919561862946,
-0.024889642372727394,
-0.07409346848726273,
-0.0014442281099036336,
-0.029058534651994705,
0.08788127452135086,
0.028239119797945023,
-0.03818695619702339,
-0.047969408333301544,
-0.07311713695526123,
0.04058960825204849,
0.02094167098402977,
0.06590849161148071,
0.03824065625667572,
-0.1016160100698471,
-0.0012490319786593318,
0.026532521471381187,
0.00829207431524992,
0.019234901294112206,
-0.02118629775941372,
-0.037690941244363785,
-0.011874678544700146,
0.01047132071107626,
0.09498051553964615,
0.03384483978152275,
-0.008897360414266586,
-0.015550892800092697,
-0.011566080152988434,
0.0026810041163116693,
0.00986646767705679,
-0.13610480725765228,
-0.09014695882797241
] |
null | null |
transformers
|
# vit_base_patch16_224
Implementation of Vision Transformer (ViT) proposed in [An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale](https://arxiv.org/pdf/2010.11929.pdf)
The following image from the authors shows the architecture.

``` python
ViT.vit_small_patch16_224()
ViT.vit_base_patch16_224()
ViT.vit_base_patch16_384()
ViT.vit_base_patch32_384()
ViT.vit_huge_patch16_224()
ViT.vit_huge_patch32_384()
ViT.vit_large_patch16_224()
ViT.vit_large_patch16_384()
ViT.vit_large_patch32_384()
```
Examples:
``` python
import torch
from torch import nn
# ViT and ViTTokens come from the glasses library; the exact import path depends on the installed version.

# change the activation function
ViT.vit_base_patch16_224(activation=nn.SELU)
# change the number of classes (default is 1000)
ViT.vit_base_patch16_224(n_classes=100)
# pass a different block, default is TransformerEncoderBlock
ViT.vit_base_patch16_224(block=MyCoolTransformerBlock)
# get features
model = ViT.vit_base_patch16_224()
# first call .features: this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 197, 768]), torch.Size([1, 197, 768]), ...]
# to change the tokens, subclass ViTTokens
class MyTokens(ViTTokens):
    def __init__(self, emb_size: int):
        super().__init__(emb_size)
        self.my_new_token = nn.Parameter(torch.randn(1, 1, emb_size))

ViT(tokens=MyTokens)
```
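A common follow-up to the feature-hook example above is to use the class token of the last encoder block as a global image embedding. The sketch below assumes the same `model.encoder.features` hook pattern shown above, that the features are returned in block order, and that the class token is the first of the 197 tokens; the last two points are standard ViT conventions but are not confirmed by this card.
``` python
import torch

model = ViT.vit_base_patch16_224()   # ViT comes from the glasses library
model.eval()

model.encoder.features               # enable the forward hooks, as in the example above
with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))

last = model.encoder.features[-1]    # assumption: last entry is the last encoder block, shape [1, 197, 768]
cls_embedding = last[:, 0]           # assumption: the class token is the first token in the sequence
print(cls_embedding.shape)           # torch.Size([1, 768])
```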
|
{}
| null |
glasses/vit_base_patch16_224
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# vit_base_patch16_224
Implementation of Vision Transformer (ViT) proposed in An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale
The following image from the authors shows the architecture.
!image
Examples:
|
[
"# vit_base_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# vit_base_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
29,
63
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# vit_base_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
-0.0865512266755104,
-0.11963406205177307,
-0.005585163831710815,
0.07755785435438156,
0.1628875881433487,
0.014639626257121563,
0.04754586145281792,
0.015792714431881905,
-0.06016505882143974,
0.015386509709060192,
0.12802523374557495,
0.17432452738285065,
-0.010545630939304829,
-0.004989436827600002,
0.0042374394834041595,
-0.30981969833374023,
-0.029308637604117393,
0.1675168126821518,
-0.1265561729669571,
0.08566603809595108,
0.08352851867675781,
-0.1436561793088913,
0.1040310338139534,
0.037031132727861404,
-0.2828957438468933,
0.037736326456069946,
0.018715353682637215,
-0.08662837743759155,
0.1736387461423874,
0.057774435728788376,
0.10736828297376633,
-0.0020750181283801794,
-0.017332032322883606,
0.025729944929480553,
0.00951483752578497,
0.035358451306819916,
-0.04745930805802345,
0.05888277664780617,
0.07263518124818802,
0.1186579093337059,
0.04750456288456917,
-0.014937471598386765,
-0.023187048733234406,
-0.02506321482360363,
-0.11245042830705643,
-0.14178837835788727,
0.07319536060094833,
0.09087222814559937,
-0.01937873102724552,
0.023226529359817505,
0.02833709307014942,
0.14979195594787598,
0.004266437143087387,
0.1222553625702858,
0.1405172348022461,
-0.14189322292804718,
-0.11375715583562851,
-0.06709837913513184,
-0.009072634391486645,
0.07457076758146286,
-0.049612849950790405,
0.08273176848888397,
0.057603683322668076,
0.05360596254467964,
0.010093438439071178,
-0.03555015102028847,
-0.09334754198789597,
0.03410955145955086,
-0.14373934268951416,
-0.05249546095728874,
0.169727623462677,
0.026208022609353065,
0.0310536976903677,
-0.008724481798708439,
-0.16615746915340424,
-0.005279111210256815,
-0.06360743194818497,
-0.04445092752575874,
-0.018744831904768944,
-0.07884534448385239,
-0.13978876173496246,
-0.03059426136314869,
-0.13549810647964478,
-0.03261387348175049,
-0.10369929671287537,
0.1188318058848381,
0.02797693759202957,
0.08618137985467911,
-0.12116757035255432,
0.05777360871434212,
-0.04055928811430931,
-0.11063030362129211,
0.0368322879076004,
-0.1156642809510231,
0.010039753280580044,
0.02725675143301487,
0.01596125215291977,
-0.11423861235380173,
-0.042454689741134644,
-0.04837578535079956,
0.1194353923201561,
-0.0190789345651865,
-0.02166789211332798,
0.07995736598968506,
-0.08144627511501312,
0.24082623422145844,
-0.12013298273086548,
0.12083324044942856,
-0.06012234091758728,
-0.049926888197660446,
-0.0009539071470499039,
0.007848596200346947,
-0.10703778266906738,
-0.0781983807682991,
0.0014638180145993829,
-0.007334367837756872,
-0.03182487562298775,
0.09545937180519104,
0.047352906316518784,
-0.04016833379864693,
0.06648225337266922,
0.011481702327728271,
0.025189826264977455,
-0.014300440438091755,
0.0621708519756794,
0.16735446453094482,
0.08267147094011307,
-0.10302633047103882,
-0.07954918593168259,
0.03882785141468048,
-0.09157689660787582,
0.011353177018463612,
-0.027944667264819145,
-0.11059900373220444,
0.03279375657439232,
-0.021767310798168182,
0.0748458206653595,
-0.27565714716911316,
0.08868517726659775,
0.1035834550857544,
0.036832116544246674,
-0.0007496491889469326,
0.040740180760622025,
0.02217964641749859,
-0.13149124383926392,
0.0019311830401420593,
0.02049854025244713,
0.0011447025462985039,
0.03833063319325447,
0.10611924529075623,
0.09610620886087418,
0.10757731646299362,
-0.09659973531961441,
0.016676882281899452,
-0.08961974829435349,
-0.01176836621016264,
-0.1448945254087448,
-0.025826213881373405,
0.026605524122714996,
0.13128513097763062,
-0.03964586183428764,
-0.13605548441410065,
-0.016291419044137,
0.02815917320549488,
0.03551110997796059,
0.13207823038101196,
-0.07323919981718063,
-0.06968065351247787,
0.020586978644132614,
-0.15241049230098724,
-0.19795449078083038,
0.11313460022211075,
0.04668409004807472,
0.028751788660883904,
0.08129055052995682,
0.11846175044775009,
0.06725414842367172,
-0.1553696095943451,
-0.01657184027135372,
0.016717053949832916,
-0.13627104461193085,
-0.08149192482233047,
0.0004933432792313397,
0.19606037437915802,
-0.059690967202186584,
0.0005243849009275436,
-0.09515645354986191,
0.06378202140331268,
-0.13146959245204926,
-0.044754598289728165,
-0.07570337504148483,
-0.005224451422691345,
0.05088891461491585,
0.0680762529373169,
0.04380635917186737,
-0.04323996230959892,
-0.02782941423356533,
0.14715392887592316,
0.04335109889507294,
-0.05522496998310089,
0.029624203220009804,
-0.09202394634485245,
0.034867461770772934,
-0.16506808996200562,
0.045446380972862244,
-0.07841549068689346,
0.07312136888504028,
0.0038295413833111525,
0.026600142940878868,
0.010905194096267223,
0.14286141097545624,
0.0870765969157219,
0.028465881943702698,
-0.023647010326385498,
-0.057223111391067505,
-0.011948514729738235,
0.021790826693177223,
-0.030457014217972755,
-0.14145252108573914,
-0.014107747934758663,
-0.014295000582933426,
-0.14804723858833313,
-0.1079649105668068,
-0.04034671187400818,
0.13640159368515015,
0.0985991582274437,
-0.0048408289439976215,
0.045508384704589844,
-0.06131180003285408,
0.025417065247893333,
-0.00597389554604888,
-0.03563570976257324,
0.04367603734135628,
-0.028265267610549927,
0.057950783520936966,
0.14701777696609497,
-0.020842596888542175,
0.23590393364429474,
0.18169502913951874,
-0.36937442421913147,
-0.0005247760564088821,
0.09291285276412964,
-0.0022706117015331984,
0.02434314787387848,
0.02501792274415493,
-0.09305595606565475,
0.03517097607254982,
-0.06348317116498947,
0.1192331314086914,
-0.02883685566484928,
-0.009538362734019756,
0.10071256011724472,
0.006991034373641014,
-0.09662840515375137,
0.09958279132843018,
0.09191593527793884,
-0.23192991316318512,
0.07385553419589996,
0.17247183620929718,
-0.0521995835006237,
-0.000547973730135709,
0.029490211978554726,
-0.03955154865980148,
0.013236608356237411,
0.011094181798398495,
0.021351486444473267,
0.19965429604053497,
-0.3019413948059082,
-0.10585557669401169,
0.020029110834002495,
-0.08954864740371704,
0.00998604018241167,
-0.15262345969676971,
0.03500375524163246,
0.044467221945524216,
0.08229760825634003,
-0.03618619218468666,
0.037084225565195084,
-0.013586719520390034,
0.08992411941289902,
-0.014935196377336979,
-0.07683748006820679,
0.027103811502456665,
0.01523579191416502,
-0.03460260108113289,
0.14578945934772491,
-0.032970231026411057,
-0.3903217315673828,
-0.13361625373363495,
-0.07926733046770096,
0.05723874270915985,
0.02886442095041275,
0.05680593475699425,
-0.16133542358875275,
-0.07174216210842133,
0.07736466079950333,
0.1224096417427063,
-0.0536315031349659,
0.07943746447563171,
0.0021375976502895355,
0.02938428521156311,
0.013763684779405594,
-0.05983838438987732,
-0.008363328874111176,
-0.07941952347755432,
0.03450043126940727,
0.05035318806767464,
-0.023896606639027596,
0.16054745018482208,
0.07045134902000427,
-0.07328590005636215,
0.05434417724609375,
-0.015385176055133343,
0.18853914737701416,
-0.1250857710838318,
-0.005968042183667421,
0.20560549199581146,
0.005718360189348459,
0.008920201100409031,
0.11543694883584976,
0.05739555135369301,
-0.06807731091976166,
-0.018741702660918236,
-0.09945327043533325,
-0.06573089212179184,
-0.05290261283516884,
-0.016593588516116142,
-0.09915488213300705,
0.020995311439037323,
0.14304092526435852,
0.0370878130197525,
0.020173130556941032,
0.08192387968301773,
0.008586354553699493,
0.07798557728528976,
-0.015631049871444702,
0.12458840757608414,
0.35686519742012024,
-0.06705600768327713,
0.058593884110450745,
-0.05633668974041939,
-0.13365721702575684,
0.026323875412344933,
-0.045899465680122375,
0.15833063423633575,
-0.0019648971501737833,
-0.01647619530558586,
0.03914817422628403,
0.011849291622638702,
0.10239625722169876,
0.1356629878282547,
-0.03531400486826897,
-0.037550825625658035,
-0.014293302781879902,
-0.06545861810445786,
0.02011430263519287,
0.03562057390809059,
-0.009534506127238274,
0.015592356212437153,
-0.060950975865125656,
-0.09825137257575989,
0.07094094157218933,
0.025206314399838448,
0.14077192544937134,
-0.31804919242858887,
-0.04814401641488075,
-0.034834738820791245,
-0.05805264413356781,
-0.1341630220413208,
0.019780507311224937,
0.10039069503545761,
0.0022765249013900757,
-0.002875675680115819,
-0.1142016053199768,
0.08159848302602768,
-0.04743747040629387,
0.08656276017427444,
0.025102103129029274,
-0.04654378071427345,
0.03638439252972603,
0.03201509267091751,
-0.1968912035226822,
0.20562182366847992,
-0.027103601023554802,
-0.02036663331091404,
-0.0071510751731693745,
-0.06230548396706581,
-0.025365009903907776,
0.24049769341945648,
0.044268589466810226,
0.008802169002592564,
0.2526678144931793,
-0.09018368273973465,
0.09287186712026596,
0.041872356086969376,
0.0941595733165741,
0.06658021360635757,
-0.02256842702627182,
-0.018865706399083138,
-0.051651597023010254,
0.021244382485747337,
0.04161995276808739,
0.025295989587903023,
-0.07984425127506256,
0.02420821227133274,
-0.04797444865107536,
0.12658090889453888,
-0.014179225079715252,
-0.06284205615520477,
-0.017701447010040283,
0.016059646382927895,
0.018666543066501617,
0.017911763861775398,
-0.07336882501840591,
0.12710930407047272,
0.10411018133163452,
-0.07516094297170639,
0.15200109779834747,
-0.07363846153020859,
0.054071877151727676,
-0.012287824414670467,
-0.11421868950128555,
0.0737617239356041,
-0.10101234167814255,
0.008589384146034718,
0.009725364856421947,
0.048048704862594604,
-0.06173728406429291,
-0.036648940294981,
0.010401072911918163,
0.030163690447807312,
-0.1046823039650917,
-0.014222372323274612,
0.014859087765216827,
-0.08500391989946365,
-0.09533684700727463,
-0.021384358406066895,
0.0927453264594078,
0.006677646189928055,
-0.06574540585279465,
0.05331050977110863,
0.05413667857646942,
0.01990833692252636,
-0.06482091546058655,
-0.07075868546962738,
0.11238754540681839,
0.023106733337044716,
-0.2632106840610504,
0.008727994747459888,
-0.0033704128582030535,
-0.011982566677033901,
-0.09201271086931229,
-0.0014865888515487313,
0.13790251314640045,
-0.08870366960763931,
0.00291324220597744,
0.052081406116485596,
-0.15582425892353058,
-0.06668809801340103,
0.09863737225532532,
0.16781282424926758,
0.19400052726268768,
-0.10197769850492477,
0.01538455206900835,
0.007302889600396156,
-0.1279958039522171,
0.04516080021858215,
0.18285496532917023,
0.07481245696544647,
-0.023834219202399254,
0.008261655457317829,
0.019623497501015663,
-0.05074397101998329,
0.055981725454330444,
-0.08129551261663437,
0.006191486492753029,
-0.06179133057594299,
-0.13130225241184235,
0.08741707354784012,
0.03701013699173927,
-0.019494932144880295,
0.1895236372947693,
0.08216658234596252,
0.02631107158958912,
-0.006332125049084425,
-0.10733357816934586,
0.037955351173877716,
0.01279736403375864,
-0.03582966700196266,
-0.05896512046456337,
-0.03661045804619789,
-0.05647370219230652,
-0.013242232613265514,
0.20380006730556488,
0.0258068535476923,
-0.05299237743020058,
-0.013896249234676361,
0.03689327463507652,
-0.14838068187236786,
-0.1952512413263321,
0.00022861485194880515,
-0.046670008450746536,
0.1821841448545456,
-0.15761320292949677,
0.047089651226997375,
0.10332828760147095,
0.07175082713365555,
-0.014437705278396606,
0.11489695310592651,
-0.00788824912160635,
-0.0015684590907767415,
0.12505877017974854,
-0.10553291440010071,
-0.07694723457098007,
-0.11594385653734207,
0.09393592923879623,
0.1096002459526062,
0.18905417621135712,
0.13157588243484497,
-0.12623025476932526,
-0.003094198415055871,
0.0015930653316900134,
-0.0829746350646019,
-0.040064990520477295,
0.08159182965755463,
0.09308081120252609,
0.04645519331097603,
-0.09607317298650742,
-0.034224845468997955,
0.0006705572013743222,
-0.2712674140930176,
-0.09816762059926987,
0.08237843960523605,
-0.06594288349151611,
-0.060154665261507034,
0.05304987356066704,
-0.09563387185335159,
-0.16823947429656982,
-0.03688744083046913,
-0.057223375886678696,
-0.11927040666341782,
0.09713008999824524,
0.24802543222904205,
0.08631128817796707,
0.0495629757642746,
-0.08841699361801147,
0.09158435463905334,
-0.07526376843452454,
0.05734074115753174,
-0.056158047169446945,
0.13607430458068848,
-0.23367814719676971,
-0.04080221801996231,
0.020731709897518158,
0.1567852646112442,
-0.07416278123855591,
-0.0027262859512120485,
-0.11965028196573257,
0.044337254017591476,
-0.049448732286691666,
0.0011833878234028816,
0.0061243269592523575,
-0.040869154036045074,
-0.006918671075254679,
-0.14847634732723236,
-0.07733827084302902,
0.03778684511780739,
-0.06029719486832619,
0.002733719302341342,
0.043478623032569885,
0.002601365791633725,
-0.024527883157134056,
-0.011623322032392025,
0.04913649335503578,
-0.07478072494268417,
0.06957220286130905,
0.0465446412563324,
-0.09524810314178467,
0.10311716794967651,
-0.058268215507268906,
-0.16128402948379517,
0.14033986628055573,
0.05822603404521942,
0.02539105713367462,
0.062226030975580215,
0.10184583812952042,
0.00481073884293437,
0.04290550574660301,
-0.014531251974403858,
0.10981477051973343,
-0.04906778410077095,
-0.00244286865927279,
-0.06359482556581497,
-0.05701753497123718,
-0.040481578558683395,
0.06227920949459076,
0.02590242587029934,
0.04158462584018707,
0.01904827542603016,
-0.021377189084887505,
0.005589830223470926,
-0.09126418828964233,
0.01952001266181469,
-0.031324468553066254,
-0.17178846895694733,
-0.014336004853248596,
-0.0676756203174591,
0.027562400326132774,
-0.030643180012702942,
0.06025378778576851,
0.010275793261826038,
0.023593492805957794,
0.04340748488903046,
0.18925321102142334,
-0.13561972975730896,
0.049679357558488846,
0.18689368665218353,
0.03041676990687847,
0.02066098153591156,
-0.036023691296577454,
0.08309945464134216,
0.08913391083478928,
0.10532168298959732,
-0.07692158967256546,
0.09435378760099411,
-0.10999289900064468,
0.08048996329307556,
0.12410151958465576,
0.11144527792930603,
-0.14824967086315155,
-0.1449112892150879,
-0.039266061037778854,
0.08138454705476761,
-0.058238495141267776,
0.03959687054157257,
0.20079825818538666,
0.07309737056493759,
0.04881620407104492,
-0.00991613045334816,
-0.04031236842274666,
0.030649125576019287,
-0.1096365675330162,
-0.0741196796298027,
-0.16688276827335358,
0.0413590669631958,
-0.03983164206147194,
0.007515369448810816,
0.045878682285547256,
0.051270436495542526,
-0.049672093242406845,
0.21052968502044678,
0.19109995663166046,
-0.08998314291238785,
0.1088838204741478,
-0.0032592937350273132,
0.007237256970256567,
0.04796505346894264,
0.0762820914387703,
-0.03779762610793114,
0.04127353057265282,
-0.053602010011672974,
0.004898035898804665,
-0.02103767730295658,
-0.01793014071881771,
0.01192194502800703,
-0.07348611950874329,
-0.030386870726943016,
0.011784314177930355,
-0.029424527660012245,
-0.013268324546515942,
-0.041811753064394,
-0.029232608154416084,
-0.00627682963386178,
0.18872737884521484,
-0.0757070928812027,
-0.12902875244617462,
-0.04167109355330467,
0.1800389289855957,
-0.04225505515933037,
0.07515785843133926,
-0.03917669877409935,
-0.03750251606106758,
-0.09451636672019958,
0.32280561327934265,
0.16689689457416534,
-0.004794361535459757,
-0.01859363541007042,
0.07823163270950317,
0.021807944402098656,
-0.0016316752880811691,
0.1260562390089035,
0.07493024319410324,
0.20483696460723877,
-0.07844679057598114,
-0.01639673113822937,
-0.09268101304769516,
0.057015310972929,
0.006536492612212896,
-0.026507647708058357,
0.14425577223300934,
-0.02651367336511612,
-0.10610321909189224,
0.10189417749643326,
0.05009336769580841,
-0.011822284199297428,
0.07756441086530685,
-0.1688603162765503,
-0.016156794503331184,
-0.05157199129462242,
0.17592298984527588,
-0.0070473686791956425,
0.04891301691532135,
-0.060162950307130814,
-0.043441783636808395,
0.0061578708700835705,
0.039304282516241074,
-0.18470512330532074,
-0.022011421620845795,
0.06049402430653572,
-0.2013952136039734,
0.01650405116379261,
-0.08725589513778687,
-0.0006266962736845016,
0.07321681827306747,
0.02260851114988327,
-0.019135504961013794,
-0.0375540517270565,
-0.02886962890625,
-0.14937655627727509,
-0.00026523927226662636,
0.10203296691179276,
0.021182650700211525,
-0.017957914620637894,
-0.036744456738233566,
-0.19373172521591187,
-0.02069091610610485,
0.013536269776523113,
0.01390161830931902,
-0.036066487431526184,
0.01107962429523468,
-0.03741006180644035,
0.0259200781583786,
0.06484300643205643,
-0.01119320373982191,
-0.04756451025605202,
-0.09208375215530396,
-0.04256587103009224,
0.10877037793397903,
-0.0021873824298381805,
0.0390862412750721,
-0.08575441688299179,
-0.090939961373806,
-0.13117708265781403,
-0.00023881687957327813,
-0.03477345034480095,
-0.08483544737100601,
-0.01896015554666519,
-0.01943679340183735,
-0.05255847051739693,
0.04045439511537552,
0.09918605536222458,
-0.029970968142151833,
0.0035643961746245623,
-0.0295222420245409,
-0.007092474494129419,
0.12505340576171875,
-0.09030341356992722,
-0.05248270556330681
] |
null | null |
transformers
|
# vit_base_patch16_384
Implementation of Vision Transformer (ViT) proposed in [An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale](https://arxiv.org/pdf/2010.11929.pdf)
The following image from the authors shows the architecture.

``` python
ViT.vit_small_patch16_224()
ViT.vit_base_patch16_224()
ViT.vit_base_patch16_384()
ViT.vit_base_patch32_384()
ViT.vit_huge_patch16_224()
ViT.vit_huge_patch32_384()
ViT.vit_large_patch16_224()
ViT.vit_large_patch16_384()
ViT.vit_large_patch32_384()
```
Examples:
``` python
import torch
from torch import nn
# ViT and ViTTokens come from the glasses library; the exact import path depends on the installed version.

# change the activation function
ViT.vit_base_patch16_224(activation=nn.SELU)
# change the number of classes (default is 1000)
ViT.vit_base_patch16_224(n_classes=100)
# pass a different block, default is TransformerEncoderBlock
ViT.vit_base_patch16_224(block=MyCoolTransformerBlock)
# get features
model = ViT.vit_base_patch16_224()
# first call .features: this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 197, 768]), torch.Size([1, 197, 768]), ...]
# to change the tokens, subclass ViTTokens
class MyTokens(ViTTokens):
    def __init__(self, emb_size: int):
        super().__init__(emb_size)
        self.my_new_token = nn.Parameter(torch.randn(1, 1, emb_size))

ViT(tokens=MyTokens)
```
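Since the example above sticks to the 224 model, here is a hedged sketch for the model this card actually describes. It assumes the same `encoder.features` hook pattern and that the 384 variant expects 384x384 inputs; neither is spelled out in this card.
``` python
import torch

model = ViT.vit_base_patch16_384()   # the model this card describes
model.eval()

model.encoder.features               # enable the forward hooks (same pattern as above)
with torch.no_grad():
    model(torch.randn(1, 3, 384, 384))   # assumption: the 384 variant expects 384x384 inputs

print([f.shape for f in model.encoder.features])
# expected, if the token arithmetic above holds: torch.Size([1, 577, 768]) per block
```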
|
{}
| null |
glasses/vit_base_patch16_384
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# vit_base_patch16_384
Implementation of Vision Transformer (ViT) proposed in An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale
The following image from the authors shows the architecture.
!image
Examples:
|
[
"# vit_base_patch16_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# vit_base_patch16_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
29,
62
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# vit_base_patch16_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
-0.08170054852962494,
-0.1267194002866745,
-0.005548109300434589,
0.08406737446784973,
0.1522759050130844,
0.018372509628534317,
0.04906836897134781,
0.019201574847102165,
-0.05949369817972183,
0.018078656867146492,
0.12393990904092789,
0.16962139308452606,
-0.006644026376307011,
-0.010209368541836739,
0.0027544680051505566,
-0.31642118096351624,
-0.03476104140281677,
0.16421380639076233,
-0.11451398581266403,
0.08466704189777374,
0.08767242729663849,
-0.13797423243522644,
0.1074213832616806,
0.037458740174770355,
-0.2756783664226532,
0.037118393927812576,
0.019348254427313805,
-0.08595983684062958,
0.17415253818035126,
0.05413509160280228,
0.10464566946029663,
-0.0007025640807114542,
-0.022256217896938324,
0.026998329907655716,
0.009801739826798439,
0.03455685079097748,
-0.04793772101402283,
0.05626395344734192,
0.06992415338754654,
0.12367448955774307,
0.07053636014461517,
-0.02085486426949501,
-0.023104365915060043,
-0.021351279690861702,
-0.11310839653015137,
-0.12824107706546783,
0.06799191236495972,
0.08977601677179337,
-0.017423229292035103,
0.018524188548326492,
0.0240341629832983,
0.1516137421131134,
0.001578336930833757,
0.1231226846575737,
0.13903634250164032,
-0.15564164519309998,
-0.1142585277557373,
-0.04915294796228409,
-0.013061079196631908,
0.08142872154712677,
-0.0493781603872776,
0.08924612402915955,
0.06458570808172226,
0.05133776366710663,
0.0025800378061830997,
-0.036733075976371765,
-0.08828338980674744,
0.02868751809000969,
-0.14358288049697876,
-0.05223126709461212,
0.1728803962469101,
0.032173652201890945,
0.028085079044103622,
-0.017434891313314438,
-0.1681196242570877,
-0.005694412160664797,
-0.06079839542508125,
-0.04617905244231224,
-0.01913057453930378,
-0.07632419466972351,
-0.1312820315361023,
-0.03528750315308571,
-0.14038489758968353,
-0.028739066794514656,
-0.10537688434123993,
0.09748202562332153,
0.027487656101584435,
0.08388727158308029,
-0.11559578776359558,
0.06331970542669296,
-0.05387992039322853,
-0.11370878666639328,
0.029116390272974968,
-0.12054397910833359,
0.01409556157886982,
0.021856317296624184,
0.01928595080971718,
-0.12343554198741913,
-0.0468171127140522,
-0.055590249598026276,
0.12780655920505524,
-0.01589307002723217,
-0.03179676830768585,
0.08133441209793091,
-0.08215783536434174,
0.2380831241607666,
-0.11579793691635132,
0.11232656985521317,
-0.057864345610141754,
-0.052697181701660156,
0.00488639809191227,
0.007400521542876959,
-0.10823438316583633,
-0.08057630807161331,
-0.005203209351748228,
-0.007440291810780764,
-0.03376496955752373,
0.09584944695234299,
0.04733498394489288,
-0.03695053979754448,
0.06812532246112823,
0.015963295474648476,
0.021213356405496597,
-0.017163321375846863,
0.06055348366498947,
0.17143115401268005,
0.08895830065011978,
-0.1010841354727745,
-0.07756432145833969,
0.03431371599435806,
-0.09134373068809509,
0.01094971876591444,
-0.023992493748664856,
-0.1085536926984787,
0.03319163993000984,
-0.020725294947624207,
0.0749492421746254,
-0.27987924218177795,
0.08855631202459335,
0.10598606616258621,
0.03307950869202614,
-0.0064547364600002766,
0.04160285368561745,
0.021023575216531754,
-0.13002930581569672,
0.0021144640631973743,
0.024748312309384346,
0.002874556230381131,
0.03682743012905121,
0.10199014842510223,
0.08937544375658035,
0.1063520610332489,
-0.08861971646547318,
0.012770409695804119,
-0.08474093675613403,
-0.01335929799824953,
-0.15083883702754974,
-0.018980324268341064,
0.02863421104848385,
0.13560491800308228,
-0.042057085782289505,
-0.1427963376045227,
-0.01905164308845997,
0.024640927091240883,
0.029949117451906204,
0.1410588175058365,
-0.08294088393449783,
-0.06341125071048737,
0.021791644394397736,
-0.1459175944328308,
-0.19842469692230225,
0.11977482587099075,
0.042064424604177475,
0.03891652077436447,
0.08237277716398239,
0.12078577280044556,
0.07426360994577408,
-0.162376269698143,
-0.01579340174794197,
0.026029439643025398,
-0.1311749815940857,
-0.07544711977243423,
-0.0031562328804284334,
0.18869048357009888,
-0.0551561638712883,
-0.003902031807228923,
-0.0837021917104721,
0.0554375946521759,
-0.12701629102230072,
-0.042725980281829834,
-0.07538986206054688,
-0.005434600170701742,
0.05123045668005943,
0.06446796655654907,
0.04590623080730438,
-0.04781360551714897,
-0.031316451728343964,
0.12989281117916107,
0.04300835356116295,
-0.049031008034944534,
0.033486172556877136,
-0.08818908780813217,
0.033286966383457184,
-0.16284975409507751,
0.041716575622558594,
-0.07503461092710495,
0.07336314767599106,
0.00872568879276514,
0.023442961275577545,
0.014940578490495682,
0.15796631574630737,
0.08911430835723877,
0.028510194271802902,
-0.030490752309560776,
-0.05041947215795517,
-0.011463731527328491,
0.02093685418367386,
-0.028686661273241043,
-0.13822446763515472,
-0.010854212567210197,
-0.011903983540832996,
-0.14689235389232635,
-0.10225141793489456,
-0.04055524989962578,
0.14614617824554443,
0.09852159768342972,
-0.003802200313657522,
0.042535826563835144,
-0.05468617379665375,
0.01938779093325138,
-0.008804443292319775,
-0.040627650916576385,
0.037937287241220474,
-0.026744667440652847,
0.05265352502465248,
0.13771522045135498,
-0.01544562354683876,
0.21578460931777954,
0.1802423745393753,
-0.36201950907707214,
-0.00817095022648573,
0.09496379643678665,
-0.0004150616587139666,
0.0255891066044569,
0.02461107447743416,
-0.09224524348974228,
0.03731251880526543,
-0.06517963111400604,
0.11692196875810623,
-0.029405873268842697,
-0.009601895697414875,
0.10153600573539734,
0.009891526773571968,
-0.09809857606887817,
0.09728424996137619,
0.08704172074794769,
-0.23664554953575134,
0.06886201351881027,
0.16644690930843353,
-0.046787329018116,
-0.004720371216535568,
0.03178934007883072,
-0.04017830267548561,
0.013992523774504662,
0.016975877806544304,
0.02595970220863819,
0.19075413048267365,
-0.30978038907051086,
-0.10221435874700546,
0.020934460684657097,
-0.08154111355543137,
0.007201838307082653,
-0.1510920524597168,
0.034619081765413284,
0.0469118133187294,
0.08139421045780182,
-0.03978220745921135,
0.04293489083647728,
-0.015020925551652908,
0.0927731990814209,
-0.0133305499330163,
-0.07277409732341766,
0.025475777685642242,
0.013584147207438946,
-0.03327731415629387,
0.14813430607318878,
-0.030183566734194756,
-0.378915011882782,
-0.1368064284324646,
-0.08889199048280716,
0.05505965277552605,
0.028635507449507713,
0.05881278216838837,
-0.16393980383872986,
-0.06894390285015106,
0.0722259134054184,
0.12682786583900452,
-0.05118655785918236,
0.0832829549908638,
-0.000768908066675067,
0.027742797508835793,
0.00951340701431036,
-0.06284061074256897,
-0.009979154914617538,
-0.08135239779949188,
0.03124961443245411,
0.04530616104602814,
-0.02923310175538063,
0.15861481428146362,
0.07591624557971954,
-0.07061495631933212,
0.05304008349776268,
-0.018714526668190956,
0.19247475266456604,
-0.12603779137134552,
0.0033074950333684683,
0.20444218814373016,
0.008865410462021828,
0.008957931771874428,
0.12102697044610977,
0.056544285267591476,
-0.07313785701990128,
-0.013970404863357544,
-0.09857052564620972,
-0.06818043440580368,
-0.06019454076886177,
-0.01226338092237711,
-0.10517142713069916,
0.02300993725657463,
0.1409502476453781,
0.03666268289089203,
0.021724626421928406,
0.08124648779630661,
0.008852074854075909,
0.08747109770774841,
-0.008802606724202633,
0.1254989206790924,
0.3509695827960968,
-0.07056792080402374,
0.053728681057691574,
-0.06176326051354408,
-0.1271413415670395,
0.03062979318201542,
-0.05241939797997475,
0.1538362354040146,
-0.009541514329612255,
-0.012423846870660782,
0.04221397638320923,
0.014811909757554531,
0.09921146929264069,
0.13257235288619995,
-0.03529473766684532,
-0.03866015002131462,
-0.013498580083251,
-0.06553283333778381,
0.023849546909332275,
0.03909236565232277,
-0.005327877122908831,
0.019729262217879295,
-0.06038626283407211,
-0.09312442690134048,
0.06840717792510986,
0.02768615260720253,
0.15338249504566193,
-0.3107846677303314,
-0.04609968885779381,
-0.028162701055407524,
-0.05186537280678749,
-0.138665109872818,
0.02321316860616207,
0.09395474195480347,
0.0037965665105730295,
0.009760755114257336,
-0.11108631640672684,
0.07901667058467865,
-0.05114399641752243,
0.082942433655262,
0.022805919870734215,
-0.051017094403505325,
0.03077368065714836,
0.0285838283598423,
-0.2054302841424942,
0.20722271502017975,
-0.027527011930942535,
-0.020973410457372665,
-0.010311026126146317,
-0.058951202780008316,
-0.024645861238241196,
0.24490153789520264,
0.049158137291669846,
0.008127374574542046,
0.2502955496311188,
-0.08035772293806076,
0.0922771543264389,
0.034158457070589066,
0.09489496052265167,
0.065786212682724,
-0.027760598808526993,
-0.015450761653482914,
-0.04782459884881973,
0.020013226196169853,
0.049171287566423416,
0.02419457770884037,
-0.08014074712991714,
0.02477959543466568,
-0.049406249076128006,
0.11919594556093216,
-0.013528888113796711,
-0.06264596432447433,
-0.017812101170420647,
0.030397025868296623,
0.012513130903244019,
0.016528308391571045,
-0.06942230463027954,
0.12661674618721008,
0.1043054461479187,
-0.07588978111743927,
0.1535670906305313,
-0.0755017027258873,
0.05442513898015022,
-0.01532331295311451,
-0.11056119948625565,
0.07193510979413986,
-0.09725458920001984,
0.010608932003378868,
0.010641070082783699,
0.05316195636987686,
-0.0673186182975769,
-0.03132083639502525,
0.010706363245844841,
0.026189904659986496,
-0.10089221596717834,
-0.013114990666508675,
0.017011750489473343,
-0.08687654137611389,
-0.09923608601093292,
-0.015789348632097244,
0.09797128289937973,
0.002697035437449813,
-0.06776492297649384,
0.05174895003437996,
0.05168911814689636,
0.0313829705119133,
-0.06362446397542953,
-0.07454933226108551,
0.10505034774541855,
0.02306387387216091,
-0.2540896236896515,
0.017348503693938255,
-0.007765074726194143,
-0.011699825525283813,
-0.08834011852741241,
0.008487884886562824,
0.12817825376987457,
-0.09374898672103882,
0.00394513551145792,
0.0501287616789341,
-0.1614934653043747,
-0.07299347221851349,
0.09573729336261749,
0.16229182481765747,
0.1984434574842453,
-0.1012340560555458,
0.01427371334284544,
0.012020600028336048,
-0.14171403646469116,
0.02765481360256672,
0.17991888523101807,
0.07570034265518188,
-0.018705150112509727,
0.015350187197327614,
0.019789831712841988,
-0.05137931928038597,
0.06219249591231346,
-0.07645643502473831,
0.0036431581247597933,
-0.06685829907655716,
-0.12852559983730316,
0.08198472857475281,
0.035192523151636124,
-0.019473576918244362,
0.19593757390975952,
0.07799189537763596,
0.028618140146136284,
-0.003994110506027937,
-0.10560183972120285,
0.034494347870349884,
0.0075278934091329575,
-0.04180166870355606,
-0.06676465272903442,
-0.03455449640750885,
-0.054039936512708664,
-0.016514930874109268,
0.20484024286270142,
0.0236330758780241,
-0.048272229731082916,
-0.013933788053691387,
0.0378694124519825,
-0.14906218647956848,
-0.19611535966396332,
0.0005159855936653912,
-0.04585888609290123,
0.17807909846305847,
-0.16959206759929657,
0.0446903333067894,
0.09976805001497269,
0.0691310465335846,
-0.010775626637041569,
0.11277667433023453,
-0.015232856385409832,
0.008993140421807766,
0.1266239583492279,
-0.09973372519016266,
-0.0817885473370552,
-0.11497240513563156,
0.08172274380922318,
0.10420848429203033,
0.18126221001148224,
0.13403303921222687,
-0.13046124577522278,
0.002714060479775071,
0.0009367059683427215,
-0.0824294313788414,
-0.04564732685685158,
0.07847122102975845,
0.09188123792409897,
0.04537556320428848,
-0.0941312313079834,
-0.033317647874355316,
0.000872203556355089,
-0.26951470971107483,
-0.1005001962184906,
0.08055587857961655,
-0.06798364967107773,
-0.05894367769360542,
0.055821944028139114,
-0.10575217008590698,
-0.16080962121486664,
-0.02782433107495308,
-0.0538305789232254,
-0.12175031006336212,
0.09298326820135117,
0.23802851140499115,
0.08631403744220734,
0.05347499996423721,
-0.09417522698640823,
0.0896589457988739,
-0.07575687021017075,
0.051989585161209106,
-0.05461524799466133,
0.13526487350463867,
-0.23524215817451477,
-0.03930239379405975,
0.020257286727428436,
0.15500684082508087,
-0.07344608008861542,
0.0008925840957090259,
-0.11233504116535187,
0.04330301284790039,
-0.047757674008607864,
0.0029001752845942974,
0.006750789005309343,
-0.038874197751283646,
-0.009240766987204552,
-0.1482819765806198,
-0.08055900782346725,
0.03977233171463013,
-0.05757047235965729,
0.004160759504884481,
0.04360249638557434,
0.008139703422784805,
-0.020803192630410194,
-0.011614455841481686,
0.049575164914131165,
-0.07613622397184372,
0.07024050503969193,
0.03373408690094948,
-0.10215716809034348,
0.11056245863437653,
-0.061019547283649445,
-0.15455113351345062,
0.14690062403678894,
0.05811607837677002,
0.024362782016396523,
0.07273366302251816,
0.10091929137706757,
-0.0002545184106566012,
0.04054797440767288,
-0.014509143307805061,
0.09899193793535233,
-0.04681380093097687,
-0.0077391318045556545,
-0.0633026733994484,
-0.05441254749894142,
-0.04005574434995651,
0.0641314685344696,
0.02382410131394863,
0.03727901726961136,
0.01656210608780384,
-0.021852169185876846,
0.006546492222696543,
-0.09166830778121948,
0.014617356471717358,
-0.030920149758458138,
-0.17033472657203674,
-0.008764060214161873,
-0.07462967187166214,
0.030104471370577812,
-0.028182094916701317,
0.06014356017112732,
0.009707498364150524,
0.021659351885318756,
0.045701779425144196,
0.19338160753250122,
-0.1305190622806549,
0.04657316207885742,
0.18629708886146545,
0.028605682775378227,
0.01623416878283024,
-0.03912713751196861,
0.0845697671175003,
0.09005231410264969,
0.10553744435310364,
-0.06435675173997879,
0.0920850932598114,
-0.11151143163442612,
0.08096154779195786,
0.11466829478740692,
0.11342470347881317,
-0.14776362478733063,
-0.1431904435157776,
-0.03630616143345833,
0.08476708084344864,
-0.06002085655927658,
0.019385477527976036,
0.19877244532108307,
0.06701754033565521,
0.04742811247706413,
-0.00596894696354866,
-0.040002111345529556,
0.023032961413264275,
-0.10489174723625183,
-0.07463234663009644,
-0.16339486837387085,
0.04485562443733215,
-0.03599654883146286,
0.011688504368066788,
0.04258085414767265,
0.05541544407606125,
-0.051639702171087265,
0.21450355648994446,
0.17768269777297974,
-0.09160952270030975,
0.11504901945590973,
-0.001737660146318376,
0.007304520346224308,
0.04435737431049347,
0.07319839298725128,
-0.039456557482481,
0.03310583159327507,
-0.05607840418815613,
0.0003705608251038939,
-0.022310899570584297,
-0.020990075543522835,
0.009090053848922253,
-0.07150210440158844,
-0.030252059921622276,
0.011771979741752148,
-0.029540201649069786,
-0.011314652860164642,
-0.03805723041296005,
-0.03156980499625206,
-0.007497879676520824,
0.18522098660469055,
-0.07334384322166443,
-0.1280219852924347,
-0.046253446489572525,
0.1796732097864151,
-0.04008076339960098,
0.07475841045379639,
-0.03755338862538338,
-0.038040872663259506,
-0.09138092398643494,
0.32296934723854065,
0.16931544244289398,
-0.0012969647068530321,
-0.019167093560099602,
0.08041433244943619,
0.023127835243940353,
0.0009153112187050283,
0.13013294339179993,
0.07421572506427765,
0.19963040947914124,
-0.07483792304992676,
-0.0077658602967858315,
-0.08928410708904266,
0.053305190056562424,
0.005589186679571867,
-0.023048365488648415,
0.145967036485672,
-0.024775611236691475,
-0.10186434537172318,
0.10736146569252014,
0.05436987057328224,
-0.013292749412357807,
0.08462157100439072,
-0.17380854487419128,
-0.017723385244607925,
-0.048745159059762955,
0.1768840253353119,
-0.000461419636849314,
0.05011220648884773,
-0.05980493128299713,
-0.04773898050189018,
0.010267466306686401,
0.040366280823946,
-0.18240724503993988,
-0.025331152603030205,
0.05858362093567848,
-0.19957901537418365,
0.018381236121058464,
-0.08635364472866058,
-0.005604221485555172,
0.07570090144872665,
0.02059720642864704,
-0.014971363358199596,
-0.037486813962459564,
-0.023193353787064552,
-0.14454522728919983,
0.0037710946053266525,
0.10342874377965927,
0.017019890248775482,
-0.021123379468917847,
-0.034336552023887634,
-0.19603438675403595,
-0.01784302294254303,
0.006956105586141348,
0.012677265331149101,
-0.03996654972434044,
0.016185212880373,
-0.03876912221312523,
0.02739311009645462,
0.07005469501018524,
-0.01246192678809166,
-0.045033443719148636,
-0.08931431919336319,
-0.04599737003445625,
0.10886498540639877,
0.0062482417561113834,
0.040710821747779846,
-0.08122934401035309,
-0.09322502464056015,
-0.13285329937934875,
0.0022453467827290297,
-0.042435530573129654,
-0.08913646638393402,
-0.018828459084033966,
-0.015293077565729618,
-0.062490519136190414,
0.04501169174909592,
0.10015180706977844,
-0.03167208284139633,
0.005261406302452087,
-0.034954749047756195,
-0.010762232355773449,
0.12514449656009674,
-0.0917164534330368,
-0.05198075622320175
] |
null | null |
transformers
|
# vit_huge_patch16_224
Implementation of Vision Transformer (ViT) proposed in [An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale](https://arxiv.org/pdf/2010.11929.pdf)
The following image from the authors shows the architecture.

``` python
ViT.vit_small_patch16_224()
ViT.vit_base_patch16_224()
ViT.vit_base_patch16_384()
ViT.vit_base_patch32_384()
ViT.vit_huge_patch16_224()
ViT.vit_huge_patch32_384()
ViT.vit_large_patch16_224()
ViT.vit_large_patch16_384()
ViT.vit_large_patch32_384()
```
Examples:
``` python
# torch imports used below; ViT and ViTTokens come from the glasses package (exact import path omitted here)
import torch
from torch import nn

# change activation
ViT.vit_base_patch16_224(activation=nn.SELU)
# change number of classes (default is 1000)
ViT.vit_base_patch16_224(n_classes=100)
# pass a different block, default is TransformerEncoderBlock
ViT.vit_base_patch16_224(block=MyCoolTransformerBlock)
# get features
model = ViT.vit_base_patch16_224()
# first access .features: this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 197, 768]), torch.Size([1, 197, 768]), ...]
# to change the tokens, subclass ViTTokens
class MyTokens(ViTTokens):
    def __init__(self, emb_size: int):
        super().__init__(emb_size)
        self.my_new_token = nn.Parameter(torch.randn(1, 1, emb_size))
ViT(tokens=MyTokens)
```
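For this card's variant specifically, here is a minimal inference sketch. It is a sketch only: it assumes `ViT` is importable from the glasses package as noted above, that the default 1000-class head is kept, and that the weights are randomly initialized rather than pretrained.
``` python
import torch

# ViT-Huge with 16x16 patches at a 224x224 input resolution
model = ViT.vit_huge_patch16_224()
model.eval()

# dummy batch: one 224x224 RGB image
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)               # [1, 1000] with the default classification head
    probs = logits.softmax(dim=-1)  # class probabilities
print(probs.shape)
```
At this patch size and resolution the encoder processes (224 / 16)^2 + 1 = 197 tokens per image (196 patches plus the class token), which is the sequence length of 197 visible in the feature shapes above.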
|
{}
| null |
glasses/vit_huge_patch16_224
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# vit_huge_patch16_224
Implementation of Vision Transformer (ViT) proposed in An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale
The following image from the authors shows the architecture.
!image
Examples:
|
[
"# vit_huge_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# vit_huge_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
29,
64
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# vit_huge_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
-0.07955396175384521,
-0.12608379125595093,
-0.006179355550557375,
0.08355163037776947,
0.16097573935985565,
0.010560628958046436,
0.047906551510095596,
0.010279656387865543,
-0.05338799208402634,
0.01987811177968979,
0.13631419837474823,
0.17251065373420715,
-0.013467931188642979,
-0.006117361132055521,
0.005620485637336969,
-0.31455105543136597,
-0.03564884141087532,
0.16201376914978027,
-0.15067049860954285,
0.08697585761547089,
0.08107032626867294,
-0.15111008286476135,
0.10651998221874237,
0.034883640706539154,
-0.2811044454574585,
0.027975313365459442,
0.017112400382757187,
-0.08389671891927719,
0.16687478125095367,
0.05853681638836861,
0.10012619942426682,
-0.0033625278156250715,
-0.02721220627427101,
0.030375182628631592,
0.009341707453131676,
0.02867334894835949,
-0.0357276126742363,
0.055984172970056534,
0.08559256792068481,
0.13643722236156464,
0.04855216294527054,
-0.031654562801122665,
-0.024378687143325806,
-0.021529408171772957,
-0.10671956837177277,
-0.1425074338912964,
0.0652194544672966,
0.08310740441083908,
-0.042322032153606415,
0.023916466161608696,
0.029281262308359146,
0.16340327262878418,
-0.0032187392935156822,
0.11892296373844147,
0.14061903953552246,
-0.13252466917037964,
-0.11133793741464615,
-0.06145115941762924,
-0.0024399557150900364,
0.0744982361793518,
-0.05028882622718811,
0.08569105714559555,
0.06109471246600151,
0.05516355112195015,
0.018309371545910835,
-0.04280936345458031,
-0.09711915254592896,
0.03193691000342369,
-0.14484867453575134,
-0.05625137314200401,
0.16022957861423492,
0.025714226067066193,
0.030905023217201233,
-0.024125801399350166,
-0.1614036113023758,
0.0009444552706554532,
-0.05577393248677254,
-0.043975379317998886,
-0.008042078465223312,
-0.08890560269355774,
-0.1412419080734253,
-0.03341096639633179,
-0.14120395481586456,
-0.026628319174051285,
-0.11968354880809784,
0.12342219054698944,
0.02942562662065029,
0.08734969794750214,
-0.1331489533185959,
0.045613765716552734,
-0.030528640374541283,
-0.10493103414773941,
0.03530639782547951,
-0.1156710609793663,
0.00998042244464159,
0.02876114286482334,
0.017548711970448494,
-0.11714225262403488,
-0.03685225546360016,
-0.06256311386823654,
0.10382994264364243,
-0.024329762905836105,
-0.018313724547624588,
0.08466774970293045,
-0.07941120862960815,
0.24561156332492828,
-0.12644171714782715,
0.13040852546691895,
-0.0676848441362381,
-0.049399785697460175,
-0.001025065896101296,
0.006989007815718651,
-0.10172053426504135,
-0.08505554497241974,
-0.014753314666450024,
-0.00012030061043333262,
-0.03302284702658653,
0.0965532511472702,
0.047564707696437836,
-0.04173431545495987,
0.0494299978017807,
0.01453319564461708,
0.032146044075489044,
-0.013450372032821178,
0.06239162012934685,
0.18766741454601288,
0.0678209438920021,
-0.1122647300362587,
-0.07749910652637482,
0.050369374454021454,
-0.09195680916309357,
0.014357076026499271,
-0.023302925750613213,
-0.1060403361916542,
0.03925151005387306,
-0.012947329320013523,
0.07140401005744934,
-0.27496138215065,
0.10432077944278717,
0.10694781690835953,
0.025737619027495384,
0.003509087022393942,
0.0393291711807251,
0.03348444774746895,
-0.13356606662273407,
-0.007184829097241163,
0.02137945406138897,
0.016653884202241898,
0.03931235522031784,
0.11128818243741989,
0.09294191002845764,
0.11203363537788391,
-0.100608229637146,
0.011975898407399654,
-0.09014502912759781,
-0.010065893642604351,
-0.13025692105293274,
-0.03049742616713047,
0.040322545915842056,
0.13255062699317932,
-0.03506304696202278,
-0.14537474513053894,
-0.01333069521933794,
0.03045075014233589,
0.028453504666686058,
0.1259838044643402,
-0.08150137215852737,
-0.06977352499961853,
0.034261129796504974,
-0.1438681185245514,
-0.20305855572223663,
0.12380657345056534,
0.05136305093765259,
0.020912783220410347,
0.08074255287647247,
0.11099554598331451,
0.06195914000272751,
-0.14846041798591614,
-0.024428382515907288,
0.009857600554823875,
-0.14126095175743103,
-0.0850621908903122,
0.009155442006886005,
0.19394254684448242,
-0.053768694400787354,
0.0012557327281683683,
-0.083162821829319,
0.06583429127931595,
-0.13034598529338837,
-0.03767040744423866,
-0.07507269829511642,
-0.0043278643861413,
0.03791404142975807,
0.06454932689666748,
0.04183466359972954,
-0.05156804621219635,
-0.013752814382314682,
0.12344381213188171,
0.0397663339972496,
-0.044171761721372604,
0.02947714738547802,
-0.09373751282691956,
0.030563298612833023,
-0.16312412917613983,
0.04894055053591728,
-0.07453187555074692,
0.07835721224546432,
0.005415724124759436,
0.007377767004072666,
0.013468093238770962,
0.15638981759548187,
0.08515314012765884,
0.023188289254903793,
-0.021932469680905342,
-0.044838808476924896,
-0.004485240206122398,
0.012235884554684162,
-0.029903411865234375,
-0.1255892515182495,
0.0018263726960867643,
-0.011176355183124542,
-0.1565721333026886,
-0.11485395580530167,
-0.04324359446763992,
0.14812149107456207,
0.08756188303232193,
-0.001262782490812242,
0.04346537962555885,
-0.05628828704357147,
0.020498961210250854,
-0.005098231136798859,
-0.03206482157111168,
0.03773920238018036,
-0.03149095177650452,
0.049474287778139114,
0.14683380722999573,
-0.02931460365653038,
0.25062185525894165,
0.18505650758743286,
-0.3660176992416382,
0.005729358177632093,
0.0812121331691742,
-0.005609369371086359,
0.030557353049516678,
0.023814894258975983,
-0.08849333971738815,
0.02510880120098591,
-0.062146544456481934,
0.12600672245025635,
-0.028863899409770966,
-0.014781489968299866,
0.10858302563428879,
0.012104125693440437,
-0.09500665962696075,
0.10663150250911713,
0.10156474262475967,
-0.22337056696414948,
0.07158008962869644,
0.1826847344636917,
-0.043997302651405334,
0.007757813204079866,
0.02895997278392315,
-0.04126335307955742,
0.010314402170479298,
0.008734675124287605,
0.02396063134074211,
0.18826822936534882,
-0.2882034182548523,
-0.09948446601629257,
0.029197117313742638,
-0.0872022956609726,
0.002362901344895363,
-0.146669402718544,
0.028010599315166473,
0.04645470157265663,
0.08527613431215286,
-0.043914198875427246,
0.045723412185907364,
-0.005271085537970066,
0.09437727183103561,
-0.019938277080655098,
-0.07441812753677368,
0.023788541555404663,
0.01348149124532938,
-0.02581392042338848,
0.14350250363349915,
-0.01635301671922207,
-0.405844509601593,
-0.13184955716133118,
-0.07706118375062943,
0.051921673119068146,
0.025986995548009872,
0.053459908813238144,
-0.16077680885791779,
-0.06837843358516693,
0.07834891229867935,
0.1455368846654892,
-0.05851886793971062,
0.08824566006660461,
-0.005333306733518839,
0.03345159813761711,
0.01527715940028429,
-0.057603418827056885,
-0.012391332536935806,
-0.07968710362911224,
0.022796060889959335,
0.05282629653811455,
-0.015982504934072495,
0.16441713273525238,
0.061239276081323624,
-0.07273582369089127,
0.05666643753647804,
-0.01407099049538374,
0.17499041557312012,
-0.13142098486423492,
-0.013948329724371433,
0.21012374758720398,
0.010693289339542389,
0.00891102571040392,
0.11159413307905197,
0.06235435977578163,
-0.07125835120677948,
-0.020084498450160027,
-0.09958720207214355,
-0.06928981840610504,
-0.05493229627609253,
-0.01957024820148945,
-0.09958882629871368,
0.04092114418745041,
0.14663274586200714,
0.034482020884752274,
0.007557959295809269,
0.08264268189668655,
0.00829155370593071,
0.0844094455242157,
-0.022443385794758797,
0.12224825471639633,
0.34978175163269043,
-0.06417907029390335,
0.0566900297999382,
-0.059997960925102234,
-0.14456087350845337,
0.03438127785921097,
-0.05664856731891632,
0.16077592968940735,
0.00006930022937012836,
-0.004416018724441528,
0.04271186143159866,
-0.012555794790387154,
0.09696336835622787,
0.1438443809747696,
-0.02785881794989109,
-0.04272063821554184,
-0.019828733056783676,
-0.06482349336147308,
0.027874700725078583,
0.029800277203321457,
0.010272383689880371,
0.02149094082415104,
-0.06809696555137634,
-0.09566112607717514,
0.07214871793985367,
0.020925352349877357,
0.13591192662715912,
-0.3153034448623657,
-0.0393102653324604,
-0.03771888092160225,
-0.06034646928310394,
-0.12529367208480835,
0.027311984449625015,
0.12452707439661026,
0.00048698531463742256,
0.004127867519855499,
-0.11316327750682831,
0.07790927588939667,
-0.049472931772470474,
0.09329758584499359,
0.01825413666665554,
-0.04435500502586365,
0.03674251213669777,
0.03531300276517868,
-0.2040579915046692,
0.20380663871765137,
-0.028426198288798332,
-0.029049379751086235,
-0.016227001324295998,
-0.06580033898353577,
-0.020863864570856094,
0.22530677914619446,
0.05060205236077309,
0.009099654853343964,
0.272896409034729,
-0.0878768339753151,
0.09376710653305054,
0.040805090218782425,
0.09752599895000458,
0.06521376222372055,
-0.02096664533019066,
-0.02532189153134823,
-0.047449540346860886,
0.014688098803162575,
0.05256427079439163,
0.049246031790971756,
-0.08303727954626083,
0.022105379030108452,
-0.035130683332681656,
0.09639333188533783,
-0.014615673571825027,
-0.059957560151815414,
-0.02755356766283512,
0.003943156450986862,
0.014843580313026905,
0.025985026732087135,
-0.06376627087593079,
0.1341727375984192,
0.11002957820892334,
-0.07986176013946533,
0.13633912801742554,
-0.07551448047161102,
0.053035952150821686,
-0.017252793535590172,
-0.09860704094171524,
0.07206359505653381,
-0.09713935106992722,
-0.0009258367936126888,
0.009720864705741405,
0.052982572466135025,
-0.07796906679868698,
-0.03269972279667854,
0.003963158465921879,
0.029267488047480583,
-0.10570845007896423,
-0.019656138494610786,
0.017077483236789703,
-0.08251839131116867,
-0.0927107110619545,
-0.013487393036484718,
0.10399044305086136,
0.014825845137238503,
-0.05767635628581047,
0.04688384011387825,
0.06202453747391701,
0.030048180371522903,
-0.06096198409795761,
-0.0742645338177681,
0.10553797334432602,
0.034989144653081894,
-0.26649901270866394,
0.01121255662292242,
-0.012041407637298107,
-0.017433540895581245,
-0.08195821195840836,
0.01183802355080843,
0.14136622846126556,
-0.07489430159330368,
0.00891997292637825,
0.0625074952840805,
-0.16646182537078857,
-0.06553252786397934,
0.07547743618488312,
0.15846146643161774,
0.2063213288784027,
-0.10516616702079773,
0.016237184405326843,
0.006940824445337057,
-0.11071790009737015,
0.06188841909170151,
0.18114595115184784,
0.07648883014917374,
-0.028548026457428932,
0.01010046899318695,
0.02469428814947605,
-0.05629151687026024,
0.058885447680950165,
-0.09344055503606796,
0.002377850702032447,
-0.06672567874193192,
-0.14426881074905396,
0.0846240371465683,
0.03787994757294655,
-0.017331410199403763,
0.16651533544063568,
0.07370076328516006,
0.03790559992194176,
-0.012535206973552704,
-0.1116604432463646,
0.04158993810415268,
0.011395975947380066,
-0.032384179532527924,
-0.0675906240940094,
-0.03954156115651131,
-0.0538051500916481,
-0.012160911224782467,
0.19135713577270508,
0.02998417243361473,
-0.07742448151111603,
-0.033106330782175064,
0.046611227095127106,
-0.1523084044456482,
-0.21195945143699646,
-0.006107176188379526,
-0.04136497527360916,
0.17980390787124634,
-0.15442241728305817,
0.04901190847158432,
0.10610048472881317,
0.07176442444324493,
-0.019773190841078758,
0.11880508810281754,
-0.00236140307970345,
0.0004822735209017992,
0.12349607795476913,
-0.0969318151473999,
-0.09894026070833206,
-0.10658272355794907,
0.11290004849433899,
0.10733730345964432,
0.182835191488266,
0.12713328003883362,
-0.11954003572463989,
-0.0019897387828677893,
0.004315047990530729,
-0.08020849525928497,
-0.03256123512983322,
0.0857849046587944,
0.0831046849489212,
0.05111169442534447,
-0.09962143003940582,
-0.03939978778362274,
0.006071725394576788,
-0.28378841280937195,
-0.11022055149078369,
0.09069038927555084,
-0.06038307398557663,
-0.0556563101708889,
0.05364450812339783,
-0.10994545370340347,
-0.15848921239376068,
-0.03485479578375816,
-0.046262387186288834,
-0.1234041228890419,
0.09605996310710907,
0.22785358130931854,
0.08738040924072266,
0.04717838019132614,
-0.08572277426719666,
0.08355806022882462,
-0.08240856975317001,
0.06008323282003403,
-0.06722860783338547,
0.13575641810894012,
-0.23511481285095215,
-0.02797449566423893,
0.014696087688207626,
0.15795539319515228,
-0.06980448961257935,
0.004128559958189726,
-0.1126323789358139,
0.040019284933805466,
-0.045332759618759155,
0.005153258331120014,
0.010968446731567383,
-0.04499691724777222,
-0.0086530651897192,
-0.14508147537708282,
-0.07367975264787674,
0.029780041426420212,
-0.06235995888710022,
-0.005622946657240391,
0.046048544347286224,
0.012462880462408066,
-0.01805216073989868,
-0.018832078203558922,
0.05085211247205734,
-0.07346624881029129,
0.07490388303995132,
0.035466764122247696,
-0.10064632445573807,
0.09596066921949387,
-0.048466529697179794,
-0.17119069397449493,
0.13947036862373352,
0.0646853968501091,
0.02935154177248478,
0.07188145071268082,
0.1037764698266983,
0.008798932656645775,
0.048841580748558044,
-0.007285766769200563,
0.09118540585041046,
-0.04928845167160034,
0.0030612030532211065,
-0.05925274267792702,
-0.06129559129476547,
-0.037986189126968384,
0.051993273198604584,
0.003918042406439781,
0.042838409543037415,
0.024190904572606087,
-0.01373154018074274,
0.01153172180056572,
-0.09329783916473389,
0.023255962878465652,
-0.0350654162466526,
-0.17538543045520782,
-0.008773475885391235,
-0.056286197155714035,
0.0327274315059185,
-0.03201696649193764,
0.06591285765171051,
0.009710068814456463,
0.015877975150942802,
0.043225210160017014,
0.19824549555778503,
-0.14004261791706085,
0.04566430673003197,
0.18135303258895874,
0.03177402541041374,
0.014470785856246948,
-0.035777341574430466,
0.07842817157506943,
0.09848733246326447,
0.11991537362337112,
-0.06097032502293587,
0.1039585992693901,
-0.11192744225263596,
0.07287023216485977,
0.12352778017520905,
0.11400522291660309,
-0.15315857529640198,
-0.14857831597328186,
-0.037260252982378006,
0.08297406882047653,
-0.05508068576455116,
0.04075615853071213,
0.18877319991588593,
0.07707077264785767,
0.04773992300033569,
-0.015104464255273342,
-0.030656883493065834,
0.04118405282497406,
-0.10581738501787186,
-0.0794559195637703,
-0.1437682807445526,
0.0388273261487484,
-0.03606976941227913,
0.004206150304526091,
0.05121256783604622,
0.05391639471054077,
-0.05524301528930664,
0.2119317650794983,
0.17506594955921173,
-0.08826007694005966,
0.11046160012483597,
-0.0020008711144328117,
0.006216592621058226,
0.054930903017520905,
0.0796552523970604,
-0.049529243260622025,
0.04303405061364174,
-0.05266900733113289,
0.007168890908360481,
-0.025478532537817955,
-0.032527897506952286,
0.0032800694461911917,
-0.0702894926071167,
-0.03445614501833916,
0.005858574528247118,
-0.02400391362607479,
-0.030090995132923126,
-0.04031932353973389,
-0.02847316302359104,
-0.010533883236348629,
0.17801442742347717,
-0.07340508699417114,
-0.13204742968082428,
-0.028933102265000343,
0.17311091721057892,
-0.04601876437664032,
0.07406216859817505,
-0.040146347135305405,
-0.034766290336847305,
-0.11061462759971619,
0.3298923373222351,
0.176719531416893,
-0.019876020029187202,
-0.019315725192427635,
0.07995782792568207,
0.023039525374770164,
-0.005707854870706797,
0.12838777899742126,
0.07492213696241379,
0.22578157484531403,
-0.0660848543047905,
-0.008429741486907005,
-0.08954323083162308,
0.06386633962392807,
0.01239035464823246,
-0.020376484841108322,
0.14278678596019745,
-0.016841722652316093,
-0.10507343709468842,
0.09777391701936722,
0.056886546313762665,
-0.008282875642180443,
0.07700201123952866,
-0.17840582132339478,
-0.015565961599349976,
-0.045291222631931305,
0.17167319357395172,
-0.00594269810244441,
0.055819619446992874,
-0.06307907402515411,
-0.04892586171627045,
-0.023567546159029007,
0.03591111674904823,
-0.19653619825839996,
-0.00881084892898798,
0.06485440582036972,
-0.1909898966550827,
0.03480745479464531,
-0.0818687155842781,
0.00920548103749752,
0.07414419203996658,
0.014450193382799625,
-0.020203249529004097,
-0.044778455048799515,
-0.02480573207139969,
-0.1401297152042389,
-0.00019314885139465332,
0.10490583628416061,
0.01803305558860302,
-0.039554789662361145,
-0.026655368506908417,
-0.19493848085403442,
-0.026744842529296875,
0.028538713231682777,
0.016286766156554222,
-0.03987088054418564,
0.015289044007658958,
-0.03569589555263519,
0.016504770144820213,
0.07393176108598709,
-0.010505986399948597,
-0.04707378149032593,
-0.08983311057090759,
-0.036742985248565674,
0.10922107845544815,
0.0035143205896019936,
0.032313041388988495,
-0.07623261958360672,
-0.09784857928752899,
-0.1464805006980896,
0.003289151471108198,
-0.043528489768505096,
-0.08042830228805542,
-0.01944754831492901,
-0.016256846487522125,
-0.05438585206866264,
0.034539710730314255,
0.09292043000459671,
-0.032553959637880325,
0.005435393191874027,
-0.022259410470724106,
-0.012784754857420921,
0.12011796236038208,
-0.08818571269512177,
-0.049038682132959366
] |
null | null |
transformers
|
# vit_huge_patch32_384
Implementation of Vision Transformer (ViT) proposed in [An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale](https://arxiv.org/pdf/2010.11929.pdf)
The following image from the authors shows the architecture.

``` python
ViT.vit_small_patch16_224()
ViT.vit_base_patch16_224()
ViT.vit_base_patch16_384()
ViT.vit_base_patch32_384()
ViT.vit_huge_patch16_224()
ViT.vit_huge_patch32_384()
ViT.vit_large_patch16_224()
ViT.vit_large_patch16_384()
ViT.vit_large_patch32_384()
```
Examples:
``` python
# torch imports used below; ViT and ViTTokens come from the glasses package (exact import path omitted here)
import torch
from torch import nn

# change activation
ViT.vit_base_patch16_224(activation=nn.SELU)
# change number of classes (default is 1000)
ViT.vit_base_patch16_224(n_classes=100)
# pass a different block, default is TransformerEncoderBlock
ViT.vit_base_patch16_224(block=MyCoolTransformerBlock)
# get features
model = ViT.vit_base_patch16_224()
# first access .features: this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 197, 768]), torch.Size([1, 197, 768]), ...]
# to change the tokens, subclass ViTTokens
class MyTokens(ViTTokens):
    def __init__(self, emb_size: int):
        super().__init__(emb_size)
        self.my_new_token = nn.Parameter(torch.randn(1, 1, emb_size))
ViT(tokens=MyTokens)
```
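For this card's variant specifically, here is a minimal inference sketch. It is a sketch only: it assumes `ViT` is importable from the glasses package as noted above, that the default 1000-class head is kept, and that the weights are randomly initialized rather than pretrained.
``` python
import torch

# ViT-Huge with 32x32 patches at a 384x384 input resolution
model = ViT.vit_huge_patch32_384()
model.eval()

# dummy batch: one 384x384 RGB image
x = torch.randn(1, 3, 384, 384)
with torch.no_grad():
    logits = model(x)               # [1, 1000] with the default classification head
    probs = logits.softmax(dim=-1)  # class probabilities
print(probs.shape)
```
At this patch size and resolution the encoder processes (384 / 32)^2 + 1 = 145 tokens per image (144 patches plus the class token); the 197-token shapes in the example above come from the base patch-16, 224-resolution model, not from this variant.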
|
{}
| null |
glasses/vit_huge_patch32_384
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# vit_huge_patch32_384
Implementation of Vision Transformer (ViT) proposed in An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale
The following image from the authors shows the architecture.
!image
Examples:
|
[
"# vit_huge_patch32_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# vit_huge_patch32_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
29,
63
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# vit_huge_patch32_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
-0.08256111294031143,
-0.12424663454294205,
-0.0056297145783901215,
0.08221103996038437,
0.16368825733661652,
0.009874277748167515,
0.0513765811920166,
0.01653192937374115,
-0.05624595656991005,
0.020242741331458092,
0.12786118686199188,
0.17118079960346222,
-0.010467668063938618,
-0.0072916545905172825,
0.0037366419564932585,
-0.31879734992980957,
-0.03721362724900246,
0.16487424075603485,
-0.14730559289455414,
0.08582797646522522,
0.08383343368768692,
-0.14866788685321808,
0.10718229413032532,
0.03447748348116875,
-0.2857658565044403,
0.031082456931471825,
0.01702834852039814,
-0.08901476114988327,
0.16613760590553284,
0.06147913634777069,
0.10218920558691025,
-0.005469390656799078,
-0.018414206802845,
0.03091641701757908,
0.008478237316012383,
0.032870739698410034,
-0.03588495030999184,
0.056113045662641525,
0.08618944883346558,
0.14229802787303925,
0.055778127163648605,
-0.038017917424440384,
-0.02815104089677334,
-0.021869322285056114,
-0.1013089120388031,
-0.1383550763130188,
0.07050206512212753,
0.08640232682228088,
-0.033848170191049576,
0.02700859308242798,
0.03207564726471901,
0.16113682091236115,
-0.00008128133049467579,
0.12220301479101181,
0.14286138117313385,
-0.13837307691574097,
-0.11081317812204361,
-0.052155788987874985,
-0.00007035242015263066,
0.07686500251293182,
-0.04839859902858734,
0.08254769444465637,
0.057695358991622925,
0.05703743174672127,
0.00933594349771738,
-0.04306073114275932,
-0.08748230338096619,
0.026553714647889137,
-0.14515246450901031,
-0.053930554538965225,
0.15705130994319916,
0.025216611102223396,
0.03116501122713089,
-0.028915001079440117,
-0.16607464849948883,
0.0102854548022151,
-0.058021992444992065,
-0.039547111839056015,
-0.014034629799425602,
-0.08674851059913635,
-0.14817799627780914,
-0.03518134728074074,
-0.1381288319826126,
-0.029000109061598778,
-0.11864328384399414,
0.12007822841405869,
0.030917486175894737,
0.08818450570106506,
-0.1375603973865509,
0.04514600709080696,
-0.030560776591300964,
-0.1019805446267128,
0.03995142877101898,
-0.11570090055465698,
0.010505832731723785,
0.025953399017453194,
0.01567806489765644,
-0.13035522401332855,
-0.03538457676768303,
-0.06686654686927795,
0.09934041649103165,
-0.02459201216697693,
-0.015504672192037106,
0.08512508869171143,
-0.0794503390789032,
0.2511201798915863,
-0.12691278755664825,
0.12262236326932907,
-0.0709865391254425,
-0.05181974172592163,
-0.0021490033250302076,
0.00929348822683096,
-0.10653013736009598,
-0.09174197912216187,
-0.015014861710369587,
-0.005191952455788851,
-0.031023109331727028,
0.09886311739683151,
0.05677640065550804,
-0.04380430281162262,
0.06226992607116699,
0.016207190230488777,
0.030510349199175835,
-0.014827236533164978,
0.06098450347781181,
0.1926063746213913,
0.0717499703168869,
-0.10943174362182617,
-0.07952740043401718,
0.04353538528084755,
-0.08925318717956543,
0.011391684412956238,
-0.02554231882095337,
-0.11104139685630798,
0.037037286907434464,
-0.013684425503015518,
0.07053873687982559,
-0.2761780917644501,
0.10646525770425797,
0.1090613529086113,
0.0237625390291214,
0.0013499691849574447,
0.0393652580678463,
0.030714601278305054,
-0.12815366685390472,
-0.003307310165837407,
0.021579258143901825,
0.01341172307729721,
0.043388694524765015,
0.1129116341471672,
0.08857283741235733,
0.11921434849500656,
-0.10899669677019119,
0.012591262347996235,
-0.08367136120796204,
-0.011892292648553848,
-0.12659792602062225,
-0.028195740655064583,
0.039058998227119446,
0.12267952412366867,
-0.04349425807595253,
-0.1466372013092041,
0.00017012532043736428,
0.03208016976714134,
0.028191320598125458,
0.1185164526104927,
-0.07394115626811981,
-0.06959005445241928,
0.04149485006928444,
-0.1432429999113083,
-0.19918017089366913,
0.12285705655813217,
0.0547846294939518,
0.026500605046749115,
0.08129850029945374,
0.10726785659790039,
0.0713699534535408,
-0.14176973700523376,
-0.023957395926117897,
0.020389927551150322,
-0.14137102663516998,
-0.09290266782045364,
0.004773580003529787,
0.19440798461437225,
-0.054999109357595444,
-0.0023771198466420174,
-0.08070071786642075,
0.07148481160402298,
-0.13112907111644745,
-0.037537939846515656,
-0.07552520185709,
-0.009307844564318657,
0.035601768642663956,
0.0627988874912262,
0.04712490737438202,
-0.048356298357248306,
-0.01773850992321968,
0.12754274904727936,
0.04195202887058258,
-0.04892479255795479,
0.030290767550468445,
-0.0916563868522644,
0.022012881934642792,
-0.1651369035243988,
0.047594666481018066,
-0.08209110051393509,
0.0687624141573906,
0.00703054154291749,
0.00751061737537384,
0.011660140939056873,
0.1607593297958374,
0.08852905035018921,
0.026100704446434975,
-0.023669408634305,
-0.04675617441534996,
-0.00025005266070365906,
0.01736871711909771,
-0.0374150313436985,
-0.1162453293800354,
-0.004639091435819864,
-0.010582181625068188,
-0.1670716255903244,
-0.09333240240812302,
-0.043578922748565674,
0.1421680599451065,
0.08748773485422134,
-0.0011082628043368459,
0.04610240459442139,
-0.05883508548140526,
0.026844948530197144,
-0.00714559992775321,
-0.031140880659222603,
0.04360603168606758,
-0.032800156623125076,
0.04963643476366997,
0.14454801380634308,
-0.025553477928042412,
0.23164808750152588,
0.18217749893665314,
-0.36116543412208557,
0.0024547160137444735,
0.08219436556100845,
-0.005614956375211477,
0.027244070544838905,
0.0192719753831625,
-0.09382916241884232,
0.02348492480814457,
-0.06513296812772751,
0.12660011649131775,
-0.027961470186710358,
-0.006093829870223999,
0.10544756799936295,
0.011499359272420406,
-0.09879449009895325,
0.10441136360168457,
0.10371831804513931,
-0.22719162702560425,
0.07260221242904663,
0.18376435339450836,
-0.04959849640727043,
0.007242144551128149,
0.029003186151385307,
-0.04549683630466461,
0.017893744632601738,
0.005472222343087196,
0.02341501973569393,
0.17748458683490753,
-0.27845731377601624,
-0.09629734605550766,
0.030578216537833214,
-0.0834202989935875,
0.003472505370154977,
-0.14952202141284943,
0.03138690069317818,
0.0435054711997509,
0.08610013872385025,
-0.04085557907819748,
0.04731757938861847,
-0.007098039612174034,
0.09580346941947937,
-0.01686122454702854,
-0.08365917950868607,
0.02569693885743618,
0.013530545867979527,
-0.023061478510499,
0.1384597271680832,
-0.019350768998265266,
-0.4044802188873291,
-0.1372615098953247,
-0.08419951051473618,
0.05181805416941643,
0.02667699009180069,
0.054789066314697266,
-0.1636359542608261,
-0.06935284286737442,
0.0747905746102333,
0.13539977371692657,
-0.05034441128373146,
0.09017136693000793,
0.0020983151625841856,
0.034864675253629684,
0.012543848715722561,
-0.054764796048402786,
-0.013445989228785038,
-0.07883429527282715,
0.030830278992652893,
0.05297686532139778,
-0.02298257313668728,
0.16374395787715912,
0.06788651645183563,
-0.07251935452222824,
0.0578470341861248,
-0.015356726944446564,
0.17469441890716553,
-0.12769317626953125,
-0.006345946807414293,
0.21667815744876862,
0.007831870578229427,
0.010295352898538113,
0.10473266243934631,
0.061926525086164474,
-0.06488285213708878,
-0.017432348802685738,
-0.10258457809686661,
-0.07226564735174179,
-0.04749812185764313,
-0.019749190658330917,
-0.10003557801246643,
0.04770048335194588,
0.15043878555297852,
0.03183269873261452,
0.007479330524802208,
0.08438379317522049,
0.015324662439525127,
0.08494678139686584,
-0.022893721237778664,
0.11882293224334717,
0.34989142417907715,
-0.06641936302185059,
0.05677196383476257,
-0.06226992607116699,
-0.14836440980434418,
0.03349917009472847,
-0.05095924809575081,
0.14782579243183136,
-0.0022035527508705854,
-0.004736723378300667,
0.03202648088335991,
-0.016413874924182892,
0.09536673873662949,
0.14459672570228577,
-0.03079761564731598,
-0.0385940819978714,
-0.022219544276595116,
-0.06507804244756699,
0.029019271954894066,
0.038115885108709335,
0.020207367837429047,
0.01637142337858677,
-0.0674683228135109,
-0.08704725652933121,
0.0676223561167717,
0.030518537387251854,
0.14101570844650269,
-0.3166668117046356,
-0.0427151620388031,
-0.03555064648389816,
-0.05510149523615837,
-0.12538059055805206,
0.03154532611370087,
0.12456765025854111,
0.001003576093353331,
-0.0005943775177001953,
-0.11435946822166443,
0.07709404826164246,
-0.04911595210433006,
0.0932391807436943,
0.027423657476902008,
-0.03727346286177635,
0.038037098944187164,
0.033723700791597366,
-0.20383328199386597,
0.18986539542675018,
-0.02960965968668461,
-0.03654201701283455,
-0.02472924441099167,
-0.06070332229137421,
-0.024006055667996407,
0.2200193852186203,
0.05219539999961853,
0.0063353790901601315,
0.26423677802085876,
-0.08848986029624939,
0.08709418773651123,
0.04029080644249916,
0.09955858439207077,
0.07343649119138718,
-0.017976930364966393,
-0.028779080137610435,
-0.052417218685150146,
0.021137654781341553,
0.04387630149722099,
0.054560910910367966,
-0.08331816643476486,
0.020731719210743904,
-0.0231346245855093,
0.09054260700941086,
-0.013749993406236172,
-0.06162102892994881,
-0.03109707124531269,
0.014341865666210651,
0.02746032364666462,
0.022547081112861633,
-0.0666971504688263,
0.1325433999300003,
0.10592299699783325,
-0.08091260492801666,
0.14006468653678894,
-0.0741065964102745,
0.05045347288250923,
-0.013587333261966705,
-0.1091308668255806,
0.07968082278966904,
-0.1009003296494484,
0.0024822137784212828,
0.009970121085643768,
0.048808157444000244,
-0.0778636783361435,
-0.03381796181201935,
0.002182823373004794,
0.0277409628033638,
-0.10423064231872559,
-0.017558740451931953,
0.017467746511101723,
-0.07461699843406677,
-0.09171856194734573,
-0.004759954288601875,
0.09621313959360123,
0.009756078012287617,
-0.05247404798865318,
0.04708912968635559,
0.07030956447124481,
0.02110453136265278,
-0.06063389778137207,
-0.07531534880399704,
0.10083786398172379,
0.04009230062365532,
-0.272225946187973,
0.007019245997071266,
-0.0078100175596773624,
-0.01726479083299637,
-0.0785377249121666,
0.01460836362093687,
0.13837891817092896,
-0.08175169676542282,
0.01198283489793539,
0.05838781222701073,
-0.1766785830259323,
-0.0641363188624382,
0.08311139047145844,
0.163648322224617,
0.20233087241649628,
-0.10414794832468033,
0.019768983125686646,
0.00723333889618516,
-0.11639554053544998,
0.05490778759121895,
0.19035601615905762,
0.0801510363817215,
-0.028074460104107857,
0.009826451539993286,
0.024183673784136772,
-0.059947237372398376,
0.05125562846660614,
-0.09009452909231186,
-0.0033089451026171446,
-0.06676686555147171,
-0.141715407371521,
0.07737074047327042,
0.03924008458852768,
-0.017035607248544693,
0.1719023585319519,
0.0760050043463707,
0.04264214262366295,
-0.012655266560614109,
-0.11683151125907898,
0.03519487753510475,
0.01503406185656786,
-0.03633139654994011,
-0.06724660843610764,
-0.03659239411354065,
-0.05320431664586067,
-0.015471034683287144,
0.1975013017654419,
0.028116069734096527,
-0.07401330024003983,
-0.02028495818376541,
0.04024862125515938,
-0.14093676209449768,
-0.21498548984527588,
-0.007494934368878603,
-0.04624541103839874,
0.17776787281036377,
-0.15148460865020752,
0.048631563782691956,
0.10711587220430374,
0.07081088423728943,
-0.018427619710564613,
0.11834710091352463,
0.0014535110676661134,
-0.0029276127461344004,
0.12164771556854248,
-0.09441924840211868,
-0.08609863370656967,
-0.11183609813451767,
0.1090647354722023,
0.10926267504692078,
0.17616838216781616,
0.12996673583984375,
-0.12099213153123856,
0.0019070807611569762,
0.0015865812310948968,
-0.07922010868787766,
-0.032593920826911926,
0.08300836384296417,
0.08740779012441635,
0.04668094217777252,
-0.09506469964981079,
-0.034794796258211136,
0.004828972276300192,
-0.28125226497650146,
-0.10387062281370163,
0.08831401914358139,
-0.06317398697137833,
-0.052506525069475174,
0.05483512952923775,
-0.11600196361541748,
-0.1743084043264389,
-0.033251840621232986,
-0.0490555576980114,
-0.12145326286554337,
0.09891501069068909,
0.21852236986160278,
0.08813989907503128,
0.04334549978375435,
-0.09291409701108932,
0.08716437220573425,
-0.08707467466592789,
0.060934025794267654,
-0.07228291779756546,
0.13695698976516724,
-0.22984273731708527,
-0.028052525594830513,
0.014054648578166962,
0.16208083927631378,
-0.07043947279453278,
0.006425726693123579,
-0.11765436083078384,
0.0418226383626461,
-0.05923589691519737,
-0.0004729079082608223,
0.0065067908726632595,
-0.045314863324165344,
-0.012617179192602634,
-0.14357982575893402,
-0.07023891061544418,
0.03422312065958977,
-0.05844749137759209,
-0.008984406478703022,
0.04295480623841286,
0.009591273963451385,
-0.018549630418419838,
-0.017138618975877762,
0.04543941095471382,
-0.07063775509595871,
0.07099121063947678,
0.030583536252379417,
-0.10097673535346985,
0.09272423386573792,
-0.04005742445588112,
-0.17209301888942719,
0.1414288729429245,
0.06453877687454224,
0.03295740485191345,
0.07100532203912735,
0.10424146056175232,
0.007969703525304794,
0.04401780292391777,
-0.008692387491464615,
0.0911317691206932,
-0.044795285910367966,
0.011248069815337658,
-0.06476487219333649,
-0.05587473884224892,
-0.04342512786388397,
0.055426329374313354,
0.016258662566542625,
0.04349551722407341,
0.02125760354101658,
-0.01752815581858158,
0.012143905274569988,
-0.08597945421934128,
0.023354867473244667,
-0.0366421602666378,
-0.17629313468933105,
-0.002120169810950756,
-0.06432470679283142,
0.03375432640314102,
-0.032879941165447235,
0.057685259729623795,
0.0098425829783082,
0.0184280164539814,
0.04570792242884636,
0.18556052446365356,
-0.13601736724376678,
0.045973192900419235,
0.19446460902690887,
0.027896888554096222,
0.014840814284980297,
-0.02874787151813507,
0.08182544261217117,
0.10328149050474167,
0.12317129969596863,
-0.05869853496551514,
0.09247798472642899,
-0.11746332794427872,
0.07658984512090683,
0.12473288923501968,
0.10995311290025711,
-0.16006772220134735,
-0.15226171910762787,
-0.03697417676448822,
0.08485236018896103,
-0.0588982068002224,
0.03693806752562523,
0.17879092693328857,
0.0767013430595398,
0.04254448413848877,
-0.014314317144453526,
-0.03224978595972061,
0.03897342085838318,
-0.09233186393976212,
-0.07926163822412491,
-0.1434106081724167,
0.03532109782099724,
-0.03699541464447975,
0.0049355714581906796,
0.04766437038779259,
0.049971356987953186,
-0.05427996441721916,
0.22029893100261688,
0.16181784868240356,
-0.09022734314203262,
0.10993912070989609,
-0.003715943545103073,
0.00728007173165679,
0.06184348836541176,
0.07440216094255447,
-0.04899470508098602,
0.047284845262765884,
-0.0514516644179821,
0.006421130150556564,
-0.02887227199971676,
-0.029473736882209778,
0.007888369262218475,
-0.07005123794078827,
-0.03703654184937477,
0.0065984223037958145,
-0.01876249723136425,
-0.030054016038775444,
-0.0369032546877861,
-0.027500679716467857,
-0.010983495973050594,
0.17092002928256989,
-0.07938190549612045,
-0.1312277764081955,
-0.028556525707244873,
0.1872655153274536,
-0.043199677020311356,
0.0734199807047844,
-0.03817671164870262,
-0.0394698791205883,
-0.11459159106016159,
0.3248864710330963,
0.17461521923542023,
-0.02907649427652359,
-0.019009895622730255,
0.07862148433923721,
0.023322822526097298,
-0.003290672553703189,
0.13744670152664185,
0.07855971902608871,
0.21697838604450226,
-0.06562201678752899,
-0.010713028721511364,
-0.08781713247299194,
0.0609576441347599,
0.00887282658368349,
-0.02411634661257267,
0.15001434087753296,
-0.021059511229395866,
-0.10428813844919205,
0.10183041542768478,
0.05306233838200569,
-0.0038594973739236593,
0.07599744200706482,
-0.17387233674526215,
-0.018154766410589218,
-0.03641364350914955,
0.17537827789783478,
-0.005848724395036697,
0.05592062696814537,
-0.06729765981435776,
-0.04017512872815132,
-0.016189366579055786,
0.03527052327990532,
-0.19300512969493866,
0.0023640731815248728,
0.061114758253097534,
-0.1999429315328598,
0.024136831983923912,
-0.07952146977186203,
0.0104002570733428,
0.07351260632276535,
0.014627222903072834,
-0.017005203291773796,
-0.037264078855514526,
-0.028258295729756355,
-0.1473342925310135,
-0.0007825910579413176,
0.10205131769180298,
0.009763508103787899,
-0.03121921978890896,
-0.030114946886897087,
-0.2004416137933731,
-0.0262447502464056,
0.03394359350204468,
0.019851479679346085,
-0.0411764420568943,
0.01018823403865099,
-0.03781993314623833,
0.02701849676668644,
0.07427841424942017,
-0.009548968635499477,
-0.0488070510327816,
-0.08598709851503372,
-0.03646814823150635,
0.10644664615392685,
0.003441767068579793,
0.022733302786946297,
-0.07615324854850769,
-0.09741831570863724,
-0.15843141078948975,
0.010935087688267231,
-0.03900708630681038,
-0.08025866746902466,
-0.023631049320101738,
-0.016792934387922287,
-0.058593571186065674,
0.035775378346443176,
0.09218757599592209,
-0.03654836490750313,
0.006759442389011383,
-0.01572415791451931,
-0.013021968305110931,
0.12031052261590958,
-0.09864562004804611,
-0.053711723536252975
] |
null | null |
transformers
|
# vit_large_patch16_224
Implementation of Vision Transformer (ViT) proposed in [An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale](https://arxiv.org/pdf/2010.11929.pdf)
The following image from the authors shows the architecture.

``` python
ViT.vit_small_patch16_224()
ViT.vit_base_patch16_224()
ViT.vit_base_patch16_384()
ViT.vit_base_patch32_384()
ViT.vit_huge_patch16_224()
ViT.vit_huge_patch32_384()
ViT.vit_large_patch16_224()
ViT.vit_large_patch16_384()
ViT.vit_large_patch32_384()
```
Examples:
``` python
# torch imports used below; ViT and ViTTokens come from the glasses package (exact import path omitted here)
import torch
from torch import nn

# change activation
ViT.vit_base_patch16_224(activation=nn.SELU)
# change number of classes (default is 1000)
ViT.vit_base_patch16_224(n_classes=100)
# pass a different block, default is TransformerEncoderBlock
ViT.vit_base_patch16_224(block=MyCoolTransformerBlock)
# get features
model = ViT.vit_base_patch16_224()
# first access .features: this activates the forward hooks and tells the model you'd like to get the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 197, 768]), torch.Size([1, 197, 768]), ...]
# to change the tokens, subclass ViTTokens
class MyTokens(ViTTokens):
    def __init__(self, emb_size: int):
        super().__init__(emb_size)
        self.my_new_token = nn.Parameter(torch.randn(1, 1, emb_size))
ViT(tokens=MyTokens)
```
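For this card's variant specifically, here is a minimal inference sketch. It is a sketch only: it assumes `ViT` is importable from the glasses package as noted above, that the default 1000-class head is kept, and that the weights are randomly initialized rather than pretrained.
``` python
import torch

# ViT-Large with 16x16 patches at a 224x224 input resolution
model = ViT.vit_large_patch16_224()
model.eval()

# dummy batch: one 224x224 RGB image
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)               # [1, 1000] with the default classification head
    probs = logits.softmax(dim=-1)  # class probabilities
print(probs.shape)
```
At this patch size and resolution the encoder processes (224 / 16)^2 + 1 = 197 tokens per image (196 patches plus the class token), which is the sequence length of 197 visible in the feature shapes above.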
|
{}
| null |
glasses/vit_large_patch16_224
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# vit_large_patch16_224
Implementation of Vision Transformer (ViT) proposed in An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale
The following image from the authors shows the architecture.
!image
Examples:
|
[
"# vit_large_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# vit_large_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
29,
64
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# vit_large_patch16_224\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
-0.08101126551628113,
-0.1245104968547821,
-0.006042378023266792,
0.0812443420290947,
0.16184081137180328,
0.011273846961557865,
0.04556046426296234,
0.01032885443419218,
-0.053977079689502716,
0.017692456021904945,
0.1342523694038391,
0.17616531252861023,
-0.013633537106215954,
-0.011258551850914955,
0.005057038273662329,
-0.3117009103298187,
-0.03474315255880356,
0.16198140382766724,
-0.140976682305336,
0.08673278242349625,
0.08325041830539703,
-0.15022264420986176,
0.10522029548883438,
0.03656123951077461,
-0.2741881012916565,
0.027049507945775986,
0.018003147095441818,
-0.08701152354478836,
0.1676228642463684,
0.05773922801017761,
0.10035021603107452,
-0.0012175764422863722,
-0.02538999915122986,
0.028515011072158813,
0.008209939114749432,
0.027416357770562172,
-0.03555489331483841,
0.0543815903365612,
0.08713878691196442,
0.1350107640028,
0.045569851994514465,
-0.027954261749982834,
-0.020829293876886368,
-0.02218249812722206,
-0.10686536878347397,
-0.15000425279140472,
0.0661511942744255,
0.07780744880437851,
-0.039411451667547226,
0.02884736657142639,
0.030019789934158325,
0.16159862279891968,
-0.006059217266738415,
0.1203671395778656,
0.14405092597007751,
-0.1352488100528717,
-0.10802331566810608,
-0.06574387103319168,
-0.0035833686124533415,
0.07357785850763321,
-0.052573200315237045,
0.08362852036952972,
0.061163995414972305,
0.05433595925569534,
0.020126909017562866,
-0.040662847459316254,
-0.10319641977548599,
0.0321987010538578,
-0.1434430629014969,
-0.0564495325088501,
0.16347625851631165,
0.025096649304032326,
0.03158576413989067,
-0.023191828280687332,
-0.15952812135219574,
0.004064743872731924,
-0.05700911208987236,
-0.04152044653892517,
-0.010035752318799496,
-0.08925747871398926,
-0.13926833868026733,
-0.032982416450977325,
-0.14209109544754028,
-0.027725733816623688,
-0.1196550577878952,
0.12114907056093216,
0.02799821086227894,
0.08643778413534164,
-0.13435852527618408,
0.04548460617661476,
-0.03355933725833893,
-0.10509122163057327,
0.03348080441355705,
-0.11710382252931595,
0.009553713724017143,
0.028276728466153145,
0.01806037127971649,
-0.1213453859090805,
-0.038341835141181946,
-0.061176102608442307,
0.11002012342214584,
-0.024783650413155556,
-0.019280102103948593,
0.08363243192434311,
-0.08355863392353058,
0.24828854203224182,
-0.127567857503891,
0.13059324026107788,
-0.06739318370819092,
-0.05202845111489296,
-0.0016356499399989843,
0.008695337921380997,
-0.10207158327102661,
-0.08706896007061005,
-0.011923497542738914,
-0.0015240336069837213,
-0.03422160446643829,
0.09377474337816238,
0.04818032309412956,
-0.038489922881126404,
0.051845673471689224,
0.014354347251355648,
0.030532974749803543,
-0.011912673711776733,
0.06127077713608742,
0.18300175666809082,
0.07238545268774033,
-0.11299560964107513,
-0.07559041678905487,
0.05318130552768707,
-0.09055306762456894,
0.015914030373096466,
-0.026048460975289345,
-0.10792496055364609,
0.03924892097711563,
-0.01394524797797203,
0.07178498059511185,
-0.27805399894714355,
0.10154137015342712,
0.105037622153759,
0.025750981643795967,
0.0049969847314059734,
0.04199009761214256,
0.031109727919101715,
-0.13165786862373352,
-0.003305852646008134,
0.019468851387500763,
0.011629282496869564,
0.03853864595293999,
0.11267147958278656,
0.09404435008764267,
0.11267999559640884,
-0.09995971620082855,
0.013080926612019539,
-0.09228362888097763,
-0.014377687126398087,
-0.1288667917251587,
-0.028805037960410118,
0.037206124514341354,
0.1336989551782608,
-0.033538661897182465,
-0.14878858625888824,
-0.015308523550629616,
0.02696782350540161,
0.027786625549197197,
0.1257718801498413,
-0.07922351360321045,
-0.06773990392684937,
0.03805107995867729,
-0.1457909792661667,
-0.19954712688922882,
0.12566708028316498,
0.05016819015145302,
0.0208401158452034,
0.08077671378850937,
0.11157019436359406,
0.0666641965508461,
-0.15415114164352417,
-0.02337109111249447,
0.015329668298363686,
-0.1413828581571579,
-0.08600962162017822,
0.01066603884100914,
0.19211114943027496,
-0.05036141723394394,
-0.000013678612958756275,
-0.08354011178016663,
0.06603238731622696,
-0.13065995275974274,
-0.041177183389663696,
-0.07508333772420883,
-0.005665823817253113,
0.04500149190425873,
0.0663432702422142,
0.04264483600854874,
-0.049573011696338654,
-0.012660816311836243,
0.12641176581382751,
0.04026850312948227,
-0.04696814715862274,
0.030013544484972954,
-0.09508154541254044,
0.03343495726585388,
-0.1594444066286087,
0.04661291837692261,
-0.07503784447908401,
0.07513067126274109,
0.0041227866895496845,
0.007223986554890871,
0.009691942483186722,
0.15608839690685272,
0.0863516628742218,
0.025124767795205116,
-0.02268182672560215,
-0.045603446662425995,
-0.006026026327162981,
0.012233558110892773,
-0.02689853310585022,
-0.12562374770641327,
0.0023881318047642708,
-0.012309967540204525,
-0.15276527404785156,
-0.1144467368721962,
-0.04448143020272255,
0.14589805901050568,
0.08514850586652756,
-0.0006523970514535904,
0.04527794569730759,
-0.05396571755409241,
0.02080167457461357,
-0.0048094242811203,
-0.03213074058294296,
0.03759342059493065,
-0.03145648539066315,
0.05006895214319229,
0.1468346118927002,
-0.027340196073055267,
0.25372788310050964,
0.18031734228134155,
-0.3629385530948639,
0.0035860876087099314,
0.08717263489961624,
-0.003878980875015259,
0.030900996178388596,
0.020614834502339363,
-0.088055819272995,
0.025330210104584694,
-0.06426429003477097,
0.12390792369842529,
-0.029282240197062492,
-0.012396647594869137,
0.1081315353512764,
0.008242454379796982,
-0.09211450070142746,
0.10696031898260117,
0.09259243309497833,
-0.22716952860355377,
0.07261509448289871,
0.17691443860530853,
-0.05209164693951607,
0.006584972608834505,
0.029219428077340126,
-0.042602553963661194,
0.01344913337379694,
0.010385682806372643,
0.023804912343621254,
0.1891634464263916,
-0.2890004515647888,
-0.09963487833738327,
0.030437923967838287,
-0.08744858205318451,
0.0032287007197737694,
-0.1468707174062729,
0.029747724533081055,
0.046947527676820755,
0.08331308513879776,
-0.04495491459965706,
0.04523584619164467,
-0.0032913677860051394,
0.09382762759923935,
-0.01864101178944111,
-0.07592037320137024,
0.025657307356595993,
0.014639098197221756,
-0.02829209342598915,
0.14575237035751343,
-0.019763948395848274,
-0.40679696202278137,
-0.1379416584968567,
-0.0735524594783783,
0.0566372275352478,
0.02499321475625038,
0.05507689714431763,
-0.1628505140542984,
-0.07015588879585266,
0.07781603932380676,
0.13842180371284485,
-0.055746257305145264,
0.08438383787870407,
-0.0034362671431154013,
0.03357483819127083,
0.014094206504523754,
-0.05797627195715904,
-0.012428228743374348,
-0.07907196879386902,
0.02476249448955059,
0.051868464797735214,
-0.015593579038977623,
0.16776488721370697,
0.060718949884176254,
-0.0734032467007637,
0.05512360855937004,
-0.014947671443223953,
0.1729213446378708,
-0.13221095502376556,
-0.017048517242074013,
0.2128666192293167,
0.01031230203807354,
0.008304264396429062,
0.11184550076723099,
0.06130241975188255,
-0.07019468396902084,
-0.020849261432886124,
-0.10036542266607285,
-0.06826915591955185,
-0.0586077943444252,
-0.020088128745555878,
-0.10011559724807739,
0.036357250064611435,
0.14020763337612152,
0.03297693654894829,
0.009288582019507885,
0.08035445958375931,
0.009700238704681396,
0.07587834447622299,
-0.019272862002253532,
0.12341497093439102,
0.352852463722229,
-0.06269195675849915,
0.056314822286367416,
-0.06184753403067589,
-0.14143207669258118,
0.0347101092338562,
-0.059399835765361786,
0.16536381840705872,
-0.004199181217700243,
-0.005063693039119244,
0.04334307834506035,
-0.0049916524440050125,
0.09808850288391113,
0.14334991574287415,
-0.029484758153557777,
-0.04197467118501663,
-0.017284933477640152,
-0.0650055855512619,
0.024934109300374985,
0.028729328885674477,
0.005248756613582373,
0.017098451033234596,
-0.06753119826316833,
-0.09397242218255997,
0.07338188588619232,
0.018422579392790794,
0.1389870047569275,
-0.31925076246261597,
-0.04216694459319115,
-0.03932035714387894,
-0.0595519058406353,
-0.1273939609527588,
0.026905829086899757,
0.12344316393136978,
-0.003191871801391244,
0.005532095208764076,
-0.11389262229204178,
0.07721477746963501,
-0.054624300450086594,
0.09182989597320557,
0.018383968621492386,
-0.04869333654642105,
0.038024693727493286,
0.038517992943525314,
-0.20709651708602905,
0.2081533819437027,
-0.028499603271484375,
-0.026612402871251106,
-0.015607037581503391,
-0.06699677556753159,
-0.0210011824965477,
0.22986219823360443,
0.05108126252889633,
0.008519026450812817,
0.27014487981796265,
-0.0880444124341011,
0.09301508963108063,
0.038697946816682816,
0.09702394157648087,
0.06912431865930557,
-0.02260943502187729,
-0.024383211508393288,
-0.04844030365347862,
0.012969732284545898,
0.049762267619371414,
0.04705944284796715,
-0.08451265096664429,
0.02242565155029297,
-0.03509235754609108,
0.10298043489456177,
-0.01335968542844057,
-0.06121581792831421,
-0.025763986632227898,
0.0018552463734522462,
0.010015659965574741,
0.023562539368867874,
-0.06391044706106186,
0.12956522405147552,
0.10956668108701706,
-0.07985129952430725,
0.13895590603351593,
-0.0751483216881752,
0.05253157764673233,
-0.01609453745186329,
-0.09814468026161194,
0.06901916116476059,
-0.10013565421104431,
0.0034457347355782986,
0.012265785597264767,
0.04977915436029434,
-0.07475721836090088,
-0.03156943619251251,
0.006373627111315727,
0.026742329820990562,
-0.10420498251914978,
-0.018855631351470947,
0.012463971972465515,
-0.07934894412755966,
-0.0887526273727417,
-0.012252169661223888,
0.10124760121107101,
0.007454235106706619,
-0.060529518872499466,
0.047492995858192444,
0.06493089348077774,
0.02368205599486828,
-0.06362363696098328,
-0.07112857699394226,
0.10967463254928589,
0.03450119495391846,
-0.2661631405353546,
0.012576631270349026,
-0.011365015059709549,
-0.01767503283917904,
-0.08069618791341782,
0.00890372134745121,
0.1373613476753235,
-0.07675711810588837,
0.006648852024227381,
0.05656673386693001,
-0.16358506679534912,
-0.06418191641569138,
0.08133430778980255,
0.15557339787483215,
0.202385812997818,
-0.1044500395655632,
0.015101675875484943,
0.006461358163505793,
-0.11822833120822906,
0.05542835220694542,
0.18525080382823944,
0.07789906114339828,
-0.026843199506402016,
0.012452905997633934,
0.024467578157782555,
-0.05809639021754265,
0.0590648353099823,
-0.09406013786792755,
0.0038227832410484552,
-0.06709415465593338,
-0.13606257736682892,
0.08787926286458969,
0.03738958388566971,
-0.010986794717609882,
0.17509286105632782,
0.0759161189198494,
0.035629115998744965,
-0.014242337085306644,
-0.11375514417886734,
0.04036318138241768,
0.013632921501994133,
-0.0305562112480402,
-0.06682883948087692,
-0.03860364109277725,
-0.05097347870469093,
-0.012179427780210972,
0.19262942671775818,
0.031004104763269424,
-0.07436566799879074,
-0.03679979592561722,
0.05341442674398422,
-0.1496790051460266,
-0.2157377004623413,
-0.006406290456652641,
-0.04147772118449211,
0.1805328130722046,
-0.15023872256278992,
0.049158159643411636,
0.1061495840549469,
0.07123710960149765,
-0.018678879365324974,
0.11794894188642502,
-0.002176848007366061,
0.0021777378860861063,
0.1251838505268097,
-0.09756211936473846,
-0.09640727192163467,
-0.10569922626018524,
0.11332826316356659,
0.10796604305505753,
0.18335355818271637,
0.12822502851486206,
-0.12009774893522263,
-0.00120852782856673,
0.002731824293732643,
-0.07859636843204498,
-0.03184151649475098,
0.08757089823484421,
0.08303696662187576,
0.05135537311434746,
-0.09977602958679199,
-0.03930673375725746,
0.004268074873834848,
-0.2842477560043335,
-0.10956920683383942,
0.08827079832553864,
-0.061273884028196335,
-0.05648181959986687,
0.05454974249005318,
-0.10639078915119171,
-0.16323025524616241,
-0.04048987478017807,
-0.04780757427215576,
-0.12049007415771484,
0.09571262449026108,
0.23338638246059418,
0.08725687116384506,
0.048118650913238525,
-0.08318759500980377,
0.08348587900400162,
-0.07982040196657181,
0.05869658291339874,
-0.065052330493927,
0.13451221585273743,
-0.23747362196445465,
-0.028105435892939568,
0.01379734929651022,
0.15900246798992157,
-0.07018724083900452,
0.0063284956850111485,
-0.11153900623321533,
0.041416071355342865,
-0.04525146260857582,
0.007880980148911476,
0.011046692728996277,
-0.0423116497695446,
-0.009132077917456627,
-0.14278115332126617,
-0.07119779288768768,
0.031210005283355713,
-0.06264735758304596,
-0.0016424627974629402,
0.045476626604795456,
0.012626610696315765,
-0.020841743797063828,
-0.021425848826766014,
0.049673110246658325,
-0.0723959356546402,
0.07567284256219864,
0.039328619837760925,
-0.10065869987010956,
0.09566338360309601,
-0.05077982321381569,
-0.17113256454467773,
0.1414506584405899,
0.06418273597955704,
0.025839662179350853,
0.0700995996594429,
0.10384903848171234,
0.008485603146255016,
0.0484066978096962,
-0.006875478196889162,
0.09208451211452484,
-0.04986941069364548,
0.005725406110286713,
-0.05886177718639374,
-0.060194045305252075,
-0.03872007504105568,
0.05298677831888199,
0.005843641702085733,
0.039772264659404755,
0.024203471839427948,
-0.016063891351222992,
0.011537943035364151,
-0.0886044055223465,
0.022174496203660965,
-0.03489477187395096,
-0.1732986867427826,
-0.012311574071645737,
-0.056632149964571,
0.03418740630149841,
-0.03111857734620571,
0.06495635211467743,
0.011748120188713074,
0.016495155170559883,
0.04111446812748909,
0.20008395612239838,
-0.13465876877307892,
0.042755499482154846,
0.17446652054786682,
0.03133910149335861,
0.013033835217356682,
-0.03298233076930046,
0.08027519285678864,
0.09509145468473434,
0.12480378150939941,
-0.05637788400053978,
0.10715638101100922,
-0.10824770480394363,
0.07464627176523209,
0.12021857500076294,
0.11260302364826202,
-0.15548276901245117,
-0.15188400447368622,
-0.037513911724090576,
0.08419343829154968,
-0.05595267564058304,
0.03655127063393593,
0.19314704835414886,
0.07754524052143097,
0.051167890429496765,
-0.015393027104437351,
-0.03229658305644989,
0.03649061918258667,
-0.10504841804504395,
-0.07965008169412613,
-0.14401200413703918,
0.0408981516957283,
-0.03786174952983856,
0.007777992635965347,
0.05167503282427788,
0.053533829748630524,
-0.05483352392911911,
0.21398548781871796,
0.17615669965744019,
-0.08989168703556061,
0.11066797375679016,
-0.002157010603696108,
0.00816824845969677,
0.05464590713381767,
0.07899751514196396,
-0.04587982967495918,
0.039883777499198914,
-0.05028875172138214,
0.008313991129398346,
-0.023405518382787704,
-0.030784916132688522,
0.002981682773679495,
-0.06956316530704498,
-0.035280048847198486,
0.009351616725325584,
-0.021135227754712105,
-0.025506136938929558,
-0.040940772742033005,
-0.029179802164435387,
-0.009915593080222607,
0.18265190720558167,
-0.07543164491653442,
-0.13258874416351318,
-0.031878761947155,
0.17112970352172852,
-0.04528256878256798,
0.07528826594352722,
-0.041224028915166855,
-0.033514026552438736,
-0.10736083984375,
0.3312666118144989,
0.16816137731075287,
-0.015280627645552158,
-0.020888766273856163,
0.07880361378192902,
0.022789718583226204,
-0.004310627933591604,
0.1268136203289032,
0.07499293982982635,
0.23315060138702393,
-0.06964446604251862,
-0.014168603345751762,
-0.09043389558792114,
0.06589865684509277,
0.008922416716814041,
-0.02002810686826706,
0.13772134482860565,
-0.016873279586434364,
-0.10568898171186447,
0.10038767009973526,
0.05579834058880806,
-0.009058413095772266,
0.07771389186382294,
-0.17647774517536163,
-0.016238458454608917,
-0.04474570229649544,
0.17727366089820862,
-0.0060396017506718636,
0.053866077214479446,
-0.06095186248421669,
-0.050126925110816956,
-0.02260439842939377,
0.03732847049832344,
-0.19538277387619019,
-0.010628534480929375,
0.06287173926830292,
-0.1870807707309723,
0.033149681985378265,
-0.08193343132734299,
0.011917044408619404,
0.07521091401576996,
0.013360770419239998,
-0.021393921226263046,
-0.04155054688453674,
-0.024864673614501953,
-0.14246311783790588,
0.0025924970395863056,
0.10588011145591736,
0.017509078606963158,
-0.03604459390044212,
-0.028252726420760155,
-0.19977903366088867,
-0.02571602538228035,
0.030885804444551468,
0.014164634980261326,
-0.04003438726067543,
0.015848206356167793,
-0.035832520574331284,
0.01770331896841526,
0.07294674962759018,
-0.010359887965023518,
-0.04864272475242615,
-0.08975685387849808,
-0.03669240325689316,
0.10863672941923141,
-0.0015452788211405277,
0.03310078755021095,
-0.07821287214756012,
-0.09760797768831253,
-0.14981362223625183,
0.0026758082676678896,
-0.04273469001054764,
-0.0820072814822197,
-0.015730401501059532,
-0.01615067385137081,
-0.053454846143722534,
0.032381705939769745,
0.09134259074926376,
-0.0325762964785099,
0.006672355812042952,
-0.027540137991309166,
-0.010851803235709667,
0.12010956555604935,
-0.09207112342119217,
-0.04813666641712189
] |
null | null |
transformers
|
# vit_large_patch16_384
Implementation of Vision Transformer (ViT) proposed in [An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale](https://arxiv.org/pdf/2010.11929.pdf)
The following image from the authors shows the architecture.

``` python
ViT.vit_small_patch16_224()
ViT.vit_base_patch16_224()
ViT.vit_base_patch16_384()
ViT.vit_base_patch32_384()
ViT.vit_huge_patch16_224()
ViT.vit_huge_patch32_384()
ViT.vit_large_patch16_224()
ViT.vit_large_patch16_384()
ViT.vit_large_patch32_384()
```
Examples:
``` python
# imports assume the glasses package layout; adjust the paths if your install differs
import torch
from torch import nn
from glasses.models import ViT
from glasses.models.classification.vit import ViTTokens

# change activation
ViT.vit_base_patch16_224(activation=nn.SELU)
# change number of classes (default is 1000)
ViT.vit_base_patch16_224(n_classes=100)
# pass a different block, default is TransformerEncoderBlock (MyCoolTransformerBlock is a placeholder for your own block)
ViT.vit_base_patch16_224(block=MyCoolTransformerBlock)
# get features
model = ViT.vit_base_patch16_224()
# access .features first: this activates the forward hooks and tells the model you'd like to collect the features
model.encoder.features
model(torch.randn((1, 3, 224, 224)))
# get the features from the encoder
features = model.encoder.features
print([x.shape for x in features])
# [torch.Size([1, 197, 768]), torch.Size([1, 197, 768]), ...]
# to change the tokens, subclass ViTTokens
class MyTokens(ViTTokens):
    def __init__(self, emb_size: int):
        super().__init__(emb_size)
        self.my_new_token = nn.Parameter(torch.randn(1, 1, emb_size))

ViT(tokens=MyTokens)
```
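For this checkpoint specifically, here is a minimal end-to-end inference sketch. It assumes the same `glasses` import path as above; the 384x384 input size follows from the model name, and the random tensor stands in for a preprocessed image.

``` python
# Minimal forward-pass sketch for vit_large_patch16_384 (import path assumed).
import torch
from glasses.models import ViT

model = ViT.vit_large_patch16_384()
model.eval()

x = torch.randn(1, 3, 384, 384)  # this variant expects 384x384 inputs
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # torch.Size([1, 1000]) with the default 1000 classes
```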
|
{}
| null |
glasses/vit_large_patch16_384
|
[
"transformers",
"pytorch",
"arxiv:2010.11929",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2010.11929"
] |
[] |
TAGS
#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us
|
# vit_large_patch16_384
Implementation of Vision Transformer (ViT) proposed in An Image Is
Worth 16x16 Words: Transformers For Image Recognition At
Scale
The following image from the authors shows the architecture.
!image
Examples:
|
[
"# vit_large_patch16_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n",
"# vit_large_patch16_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
29,
63
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-2010.11929 #endpoints_compatible #region-us \n# vit_large_patch16_384\n Implementation of Vision Transformer (ViT) proposed in An Image Is\n Worth 16x16 Words: Transformers For Image Recognition At\n Scale\n\n The following image from the authors shows the architecture.\n\n !image\n\n \n\n Examples:"
] |
[
-0.0825812891125679,
-0.12653283774852753,
-0.005577229429036379,
0.07951048016548157,
0.1649554818868637,
0.010088114999234676,
0.045649539679288864,
0.016682276502251625,
-0.05395953357219696,
0.018850134685635567,
0.12577524781227112,
0.17578400671482086,
-0.009808264672756195,
-0.012767858803272247,
0.004180215764790773,
-0.31502440571784973,
-0.036379486322402954,
0.16549193859100342,
-0.13746167719364166,
0.0856962725520134,
0.08700726181268692,
-0.1480611115694046,
0.10610094666481018,
0.04003886133432388,
-0.277543306350708,
0.027817323803901672,
0.018007000908255577,
-0.09258905053138733,
0.16817675530910492,
0.0596289224922657,
0.10118314623832703,
-0.0017560349078848958,
-0.020747380331158638,
0.031202280893921852,
0.007426528725773096,
0.031502146273851395,
-0.036128971725702286,
0.055710554122924805,
0.08943069726228714,
0.1367310881614685,
0.047057315707206726,
-0.032694559544324875,
-0.02417837642133236,
-0.021540576592087746,
-0.10536402463912964,
-0.14926274120807648,
0.07161006331443787,
0.07715389877557755,
-0.029866954311728477,
0.03148798644542694,
0.031042659655213356,
0.1573459655046463,
-0.005527804140001535,
0.12225077301263809,
0.14615003764629364,
-0.1400723159313202,
-0.1083315834403038,
-0.05883428454399109,
-0.0017277064034715295,
0.07284022867679596,
-0.052282270044088364,
0.0808817446231842,
0.05956965312361717,
0.0553995706140995,
0.013992201536893845,
-0.040167927742004395,
-0.09587588161230087,
0.02693128027021885,
-0.14391379058361053,
-0.05300454795360565,
0.16158531606197357,
0.025182800367474556,
0.03366953507065773,
-0.02888045646250248,
-0.1630299836397171,
0.008641097694635391,
-0.05899418890476227,
-0.040768373757600784,
-0.014328894205391407,
-0.08779186755418777,
-0.1461225003004074,
-0.03446344658732414,
-0.1404617428779602,
-0.027585698291659355,
-0.1217140182852745,
0.11473429203033447,
0.027685025706887245,
0.08603324741125107,
-0.13822002708911896,
0.046949755400419235,
-0.03943419083952904,
-0.10223635286092758,
0.03605562075972557,
-0.11582297831773758,
0.008364107459783554,
0.024156855419278145,
0.01874583773314953,
-0.1335046887397766,
-0.038773417472839355,
-0.061921942979097366,
0.10373900085687637,
-0.026129649952054024,
-0.017200930044054985,
0.08427780121564865,
-0.08235488831996918,
0.2552954852581024,
-0.1302167922258377,
0.12489274144172668,
-0.07142599672079086,
-0.055215876549482346,
-0.0046212500892579556,
0.011834990233182907,
-0.10622155666351318,
-0.09420696645975113,
-0.009892544709146023,
-0.005509450566023588,
-0.03170136734843254,
0.09782323986291885,
0.05520697310566902,
-0.04006355628371239,
0.06527058780193329,
0.0174874197691679,
0.02772614359855652,
-0.014135041274130344,
0.06067194417119026,
0.18663080036640167,
0.07450924068689346,
-0.10863178968429565,
-0.07691282033920288,
0.05043700337409973,
-0.08874236792325974,
0.01239323616027832,
-0.026540791615843773,
-0.1112271323800087,
0.037362296134233475,
-0.013947058469057083,
0.07228130847215652,
-0.28037911653518677,
0.10568039864301682,
0.10673908144235611,
0.023125087842345238,
0.0021012721117585897,
0.042274147272109985,
0.029652738943696022,
-0.12805742025375366,
-0.0011578550329431891,
0.018996691331267357,
0.010607641190290451,
0.04128880053758621,
0.11372268944978714,
0.08911358565092087,
0.11806976795196533,
-0.10392085462808609,
0.014737372286617756,
-0.08761569112539291,
-0.015941912308335304,
-0.13225297629833221,
-0.028160108253359795,
0.0381934829056263,
0.12809498608112335,
-0.042295780032873154,
-0.14963452517986298,
-0.005949854850769043,
0.028229421004652977,
0.027598081156611443,
0.1220797523856163,
-0.07004828006029129,
-0.06774691492319107,
0.040246836841106415,
-0.143173947930336,
-0.1977715939283371,
0.1279493123292923,
0.054119404405355453,
0.028861328959465027,
0.08028747141361237,
0.11330128461122513,
0.07256851345300674,
-0.15057261288166046,
-0.02463006041944027,
0.024213450029492378,
-0.1436193436384201,
-0.09014294296503067,
0.008211134932935238,
0.19132012128829956,
-0.047265227884054184,
-0.001975015038624406,
-0.07943133264780045,
0.06942731887102127,
-0.13122163712978363,
-0.04055454209446907,
-0.07347407937049866,
-0.01039296668022871,
0.04611191526055336,
0.06500773131847382,
0.0484449677169323,
-0.046481817960739136,
-0.01688246987760067,
0.12549667060375214,
0.04056132584810257,
-0.04821944609284401,
0.03146795183420181,
-0.09291645884513855,
0.028012985363602638,
-0.1623104214668274,
0.044778645038604736,
-0.08075287193059921,
0.06834064424037933,
0.0049112592823803425,
0.005744642112404108,
0.007300730794668198,
0.16142146289348602,
0.09016704559326172,
0.024475840851664543,
-0.025647087022662163,
-0.046823471784591675,
-0.000545594550203532,
0.017477914690971375,
-0.033733535557985306,
-0.11701163649559021,
-0.0007915748283267021,
-0.01115945354104042,
-0.15859192609786987,
-0.09571642428636551,
-0.044682130217552185,
0.14061056077480316,
0.08512604236602783,
-0.0005998341366648674,
0.048486292362213135,
-0.05739894509315491,
0.023939989507198334,
-0.007000694517046213,
-0.03024674952030182,
0.04230794683098793,
-0.03219064325094223,
0.049997586756944656,
0.1475282609462738,
-0.02493400312960148,
0.23872144520282745,
0.17806632816791534,
-0.3612707853317261,
-0.002496981294825673,
0.08885671943426132,
-0.004233530722558498,
0.027624694630503654,
0.01642693392932415,
-0.09518858045339584,
0.02757200039923191,
-0.06619682163000107,
0.12462792545557022,
-0.028044285252690315,
-0.0066049168817698956,
0.10630696266889572,
0.0074822623282670975,
-0.09613662958145142,
0.10680968314409256,
0.09112434834241867,
-0.22779931128025055,
0.07416603714227676,
0.18080943822860718,
-0.05633760616183281,
0.006106778979301453,
0.0306552667170763,
-0.04462704062461853,
0.019406450912356377,
0.0076852706260979176,
0.02341235615313053,
0.17832408845424652,
-0.2830401360988617,
-0.0957663357257843,
0.03207864984869957,
-0.083082415163517,
0.004482716787606478,
-0.14844568073749542,
0.03213706985116005,
0.04444798827171326,
0.08438003063201904,
-0.04040376469492912,
0.04935917630791664,
-0.003882297547534108,
0.09690570831298828,
-0.015574119985103607,
-0.08261991292238235,
0.026990478858351707,
0.01419799868017435,
-0.026225104928016663,
0.1423107385635376,
-0.02207048237323761,
-0.40596112608909607,
-0.1409638673067093,
-0.07877514511346817,
0.05781679227948189,
0.02498672716319561,
0.05743113160133362,
-0.16673988103866577,
-0.07044491171836853,
0.0755215585231781,
0.13059411942958832,
-0.052750442177057266,
0.08440733700990677,
0.002319191349670291,
0.03474867343902588,
0.009854762814939022,
-0.05534334480762482,
-0.014146272093057632,
-0.08041425049304962,
0.02765895612537861,
0.05271581932902336,
-0.0216910932213068,
0.16865777969360352,
0.06762373447418213,
-0.07258186489343643,
0.05480063334107399,
-0.017619220539927483,
0.1740177422761917,
-0.13155266642570496,
-0.009366421960294247,
0.21889658272266388,
0.010164734907448292,
0.009126218035817146,
0.11035164445638657,
0.06046958640217781,
-0.06499839574098587,
-0.019497139379382133,
-0.10379266738891602,
-0.07037485390901566,
-0.055017706006765366,
-0.01985287107527256,
-0.1011400818824768,
0.043194402009248734,
0.14219315350055695,
0.030462659895420074,
0.011698347516357899,
0.08111477643251419,
0.015095111913979053,
0.07315211743116379,
-0.018442410975694656,
0.12077724188566208,
0.35587021708488464,
-0.06538616865873337,
0.056645121425390244,
-0.06250055879354477,
-0.14534059166908264,
0.03386833891272545,
-0.053747836500406265,
0.1521967500448227,
-0.007432336453348398,
-0.0037380356807261705,
0.03437674418091774,
-0.009035966359078884,
0.09778835624456406,
0.14146189391613007,
-0.03108314611017704,
-0.04010791704058647,
-0.020117251202464104,
-0.0645034983754158,
0.027509793639183044,
0.03589797392487526,
0.017395852133631706,
0.012399394065141678,
-0.06891167163848877,
-0.09057485312223434,
0.07185929268598557,
0.024164631962776184,
0.14296665787696838,
-0.31878456473350525,
-0.045016247779130936,
-0.03773709014058113,
-0.054537441581487656,
-0.1263989955186844,
0.02979232557117939,
0.11991850286722183,
-0.005399101879447699,
0.001909302663989365,
-0.11546465754508972,
0.07687542587518692,
-0.05348965898156166,
0.09294122457504272,
0.02541363798081875,
-0.04177800938487053,
0.03701544180512428,
0.0375637449324131,
-0.21173614263534546,
0.1984223574399948,
-0.02888467162847519,
-0.03191277012228966,
-0.021706009283661842,
-0.06463373452425003,
-0.0250630434602499,
0.22572757303714752,
0.05570535361766815,
0.005676083266735077,
0.2610977292060852,
-0.08986537903547287,
0.08959948271512985,
0.03705020993947983,
0.09743682295084,
0.07369881123304367,
-0.022462008520960808,
-0.026075733825564384,
-0.05305555835366249,
0.01875263638794422,
0.044446781277656555,
0.05356527492403984,
-0.08317458629608154,
0.02258291281759739,
-0.028179416432976723,
0.10025522857904434,
-0.010844942182302475,
-0.06134938821196556,
-0.03444467857480049,
0.01062309741973877,
0.020893519744277,
0.021268442273139954,
-0.06640368700027466,
0.1254240870475769,
0.10556132346391678,
-0.0818546861410141,
0.14307677745819092,
-0.0750938132405281,
0.050482261925935745,
-0.015321210026741028,
-0.10555467754602432,
0.07620735466480255,
-0.1044585108757019,
0.00517382612451911,
0.011010629124939442,
0.045664023607969284,
-0.07410180568695068,
-0.032546401023864746,
0.004812074359506369,
0.025852831080555916,
-0.10423164814710617,
-0.01560601219534874,
0.01471151877194643,
-0.07284148037433624,
-0.08883881568908691,
-0.005949394311755896,
0.09389642626047134,
0.002259831875562668,
-0.05897359177470207,
0.04570072889328003,
0.07143314927816391,
0.021573275327682495,
-0.06481539458036423,
-0.07248859852552414,
0.10980086773633957,
0.04021477326750755,
-0.27157676219940186,
0.010478775016963482,
-0.007813162170350552,
-0.018391618505120277,
-0.07598166912794113,
0.014803401194512844,
0.13381710648536682,
-0.08208080381155014,
0.007610179018229246,
0.05137952044606209,
-0.17032372951507568,
-0.06556591391563416,
0.08831038326025009,
0.15679290890693665,
0.20301617681980133,
-0.10251422971487045,
0.02003343030810356,
0.00827688630670309,
-0.12362932413816452,
0.047566886991262436,
0.19497926533222198,
0.08020145446062088,
-0.02509281039237976,
0.012968485243618488,
0.02450598031282425,
-0.06067478284239769,
0.051937285810709,
-0.08735155314207077,
-0.002014484954997897,
-0.06697335094213486,
-0.13650818169116974,
0.08180033415555954,
0.03828654810786247,
-0.012362848035991192,
0.18688303232192993,
0.07613789290189743,
0.04014243558049202,
-0.012802847661077976,
-0.119366854429245,
0.03813835233449936,
0.01686525158584118,
-0.03637135028839111,
-0.06914608925580978,
-0.03334960713982582,
-0.050466138869524,
-0.015375933609902859,
0.19692695140838623,
0.02780456840991974,
-0.06731841713190079,
-0.024073859676718712,
0.04910707101225853,
-0.1423829197883606,
-0.21706576645374298,
-0.005251748487353325,
-0.04646550118923187,
0.17880754172801971,
-0.1446770280599594,
0.04909532144665718,
0.10705888271331787,
0.06980310380458832,
-0.016832707449793816,
0.11793004721403122,
-0.002449827268719673,
0.0000756081790314056,
0.12232571840286255,
-0.09544777125120163,
-0.08708170056343079,
-0.1108088418841362,
0.1066369116306305,
0.10915417224168777,
0.1775810569524765,
0.12891064584255219,
-0.12161958962678909,
0.0015536349965259433,
0.001542737241834402,
-0.07687725871801376,
-0.03090805746614933,
0.08477270603179932,
0.08590376377105713,
0.047275785356760025,
-0.09555044025182724,
-0.03529699146747589,
0.004751323256641626,
-0.28513357043266296,
-0.10439500957727432,
0.08341535180807114,
-0.06375301629304886,
-0.05546196177601814,
0.0528891496360302,
-0.1139066219329834,
-0.17588801681995392,
-0.037435922771692276,
-0.049430426210165024,
-0.12049438804388046,
0.09802217036485672,
0.2227465659379959,
0.08826894313097,
0.04355412349104881,
-0.08599549531936646,
0.08681914955377579,
-0.08421286940574646,
0.060273487120866776,
-0.06905568391084671,
0.13429559767246246,
-0.23100584745407104,
-0.029016545042395592,
0.013532753102481365,
0.16640795767307281,
-0.0710008442401886,
0.008520376868546009,
-0.11801543831825256,
0.0432400144636631,
-0.04886884614825249,
0.001639755442738533,
0.006548414472490549,
-0.04351267218589783,
-0.013543404638767242,
-0.1434888243675232,
-0.07031495124101639,
0.03221976384520531,
-0.060324814170598984,
-0.0021632330026477575,
0.0433555506169796,
0.011321146972477436,
-0.020407358184456825,
-0.019866203889250755,
0.046655792742967606,
-0.06965713202953339,
0.07398824393749237,
0.03318651020526886,
-0.10124433040618896,
0.09415154904127121,
-0.048976968973875046,
-0.17310702800750732,
0.14076556265354156,
0.0617033876478672,
0.029334967955946922,
0.0695633515715599,
0.10208394378423691,
0.007702886592596769,
0.04401217773556709,
-0.007551420945674181,
0.08522137999534607,
-0.0472518689930439,
0.012634688057005405,
-0.062114980071783066,
-0.05489528179168701,
-0.044369447976350784,
0.05638306960463524,
0.013443049043416977,
0.039326366037130356,
0.022632449865341187,
-0.019192790612578392,
0.012639534659683704,
-0.08365000039339066,
0.02223038114607334,
-0.03620323911309242,
-0.17534691095352173,
-0.0025267251767218113,
-0.0636458471417427,
0.034811168909072876,
-0.032299865037202835,
0.059693869203329086,
0.014384661801159382,
0.018760621547698975,
0.04507763311266899,
0.1899816244840622,
-0.13789844512939453,
0.04268373176455498,
0.1824616938829422,
0.02877623774111271,
0.014211823232471943,
-0.029424399137496948,
0.08380111306905746,
0.0971258208155632,
0.13178570568561554,
-0.05233527719974518,
0.09794104844331741,
-0.11489742994308472,
0.07747335731983185,
0.11909333616495132,
0.10803327709436417,
-0.15899577736854553,
-0.15098954737186432,
-0.03954216092824936,
0.0858004093170166,
-0.057896003127098083,
0.030708463862538338,
0.1883396953344345,
0.07632031291723251,
0.04854102060198784,
-0.015187802724540234,
-0.032014522701501846,
0.03307104483246803,
-0.09351250529289246,
-0.07924134284257889,
-0.14372985064983368,
0.03827804699540138,
-0.03888818249106407,
0.010829555802047253,
0.049519240856170654,
0.05151372775435448,
-0.05525815114378929,
0.21966230869293213,
0.16578638553619385,
-0.0931231752038002,
0.11150693893432617,
-0.004756070673465729,
0.011532649397850037,
0.05765022709965706,
0.07457908242940903,
-0.045633938163518906,
0.04565351828932762,
-0.048953983932733536,
0.0071238186210393906,
-0.02858121134340763,
-0.029826922342181206,
0.006178511772304773,
-0.07080581784248352,
-0.038013238459825516,
0.011868405155837536,
-0.01819838397204876,
-0.028437329456210136,
-0.03776293620467186,
-0.028235062956809998,
-0.009494476020336151,
0.17983371019363403,
-0.0793377086520195,
-0.12883396446704865,
-0.032816845923662186,
0.18506626784801483,
-0.04236510396003723,
0.07333876192569733,
-0.039763499051332474,
-0.03741755336523056,
-0.10889259725809097,
0.3317192792892456,
0.16967421770095825,
-0.022890910506248474,
-0.020232273265719414,
0.07706192880868912,
0.02390916831791401,
-0.00119629071559757,
0.13478967547416687,
0.07681574672460556,
0.22440709173679352,
-0.07081109285354614,
-0.011717707850039005,
-0.08773910254240036,
0.06338007003068924,
0.005860826000571251,
-0.020982347428798676,
0.14540670812129974,
-0.01964622549712658,
-0.10383059829473495,
0.10407169908285141,
0.05078020319342613,
-0.006658522877842188,
0.07167693227529526,
-0.17450906336307526,
-0.01849382556974888,
-0.0375991091132164,
0.17963945865631104,
-0.003446411108598113,
0.05529697611927986,
-0.06496348977088928,
-0.043635040521621704,
-0.017783528193831444,
0.03577083721756935,
-0.18902741372585297,
-0.0004457533359527588,
0.05948367342352867,
-0.19009517133235931,
0.02563883550465107,
-0.08028781414031982,
0.008933980949223042,
0.07597177475690842,
0.013181962072849274,
-0.017644615843892097,
-0.035822805017232895,
-0.02692984603345394,
-0.14634475111961365,
0.005442032590508461,
0.1038220003247261,
0.011403866112232208,
-0.032044488936662674,
-0.029872411862015724,
-0.20929376780986786,
-0.022569755092263222,
0.03024226613342762,
0.01729353703558445,
-0.04105808958411217,
0.014274891465902328,
-0.036276400089263916,
0.026469213888049126,
0.07380200177431107,
-0.00982057023793459,
-0.05071067810058594,
-0.08671978116035461,
-0.03836369886994362,
0.10516506433486938,
0.000003830840341834119,
0.024954767897725105,
-0.07812722027301788,
-0.09634632617235184,
-0.1588371992111206,
0.008651518262922764,
-0.03811795637011528,
-0.08148692548274994,
-0.018792198970913887,
-0.01786704547703266,
-0.057692158967256546,
0.03343362733721733,
0.09156742691993713,
-0.03591698780655861,
0.00827085692435503,
-0.017342325299978256,
-0.010290462523698807,
0.12048635631799698,
-0.10130523890256882,
-0.051722344011068344
] |
null | null |
transformers
|
# wide_resnet101_2
Implementation of Wide ResNet proposed in ["Wide Residual
Networks"](https://arxiv.org/pdf/1605.07146.pdf)
Create a default model
``` python
WideResNet.wide_resnet50_2()
WideResNet.wide_resnet101_2()
# create a wide_resnet18_4
WideResNet.resnet18(block=WideResNetBottleNeckBlock, width_factor=4)
```
Examples:
``` python
# imports assume the glasses package layout; adjust the paths if your install differs
import torch
from torch import nn
from glasses.models import WideResNet

# change activation
WideResNet.wide_resnet50_2(activation=nn.SELU)
# change number of classes (default is 1000)
WideResNet.wide_resnet50_2(n_classes=100)
# pass a different block (SENetBasicBlock is one example; import it from glasses if you use it)
WideResNet.wide_resnet50_2(block=SENetBasicBlock)
# change the initial convolution
model = WideResNet.wide_resnet50_2()
model.encoder.gate.conv1 = nn.Conv2d(3, 64, kernel_size=3)
# store each feature
x = torch.rand((1, 3, 224, 224))
model = WideResNet.wide_resnet50_2()
features = []
x = model.encoder.gate(x)
for block in model.encoder.layers:
    x = block(x)
    features.append(x)
print([x.shape for x in features])
# one feature map per stage (shapes depend on the variant), e.g. for wide_resnet50_2:
# [torch.Size([1, 256, 56, 56]), torch.Size([1, 512, 28, 28]), torch.Size([1, 1024, 14, 14]), torch.Size([1, 2048, 7, 7])]
```
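As a complement to the snippets above, here is a minimal classification forward pass for this checkpoint. It is a sketch assuming the same glasses import path; the random tensor stands in for a preprocessed 224x224 image.

``` python
# Minimal forward-pass sketch for wide_resnet101_2 (import path assumed).
import torch
from glasses.models import WideResNet

model = WideResNet.wide_resnet101_2()
model.eval()

x = torch.randn(1, 3, 224, 224)  # standard ImageNet-sized input
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # torch.Size([1, 1000]) with the default 1000 classes
```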
|
{}
| null |
glasses/wide_resnet101_2
|
[
"transformers",
"pytorch",
"arxiv:1605.07146",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"1605.07146"
] |
[] |
TAGS
#transformers #pytorch #arxiv-1605.07146 #endpoints_compatible #region-us
|
# wide_resnet101_2
Implementation of Wide ResNet proposed in "Wide Residual
Networks"
Create a default model
Examples:
|
[
"# wide_resnet101_2\nImplementation of Wide ResNet proposed in \\\"Wide Residual\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:"
] |
[
"TAGS\n#transformers #pytorch #arxiv-1605.07146 #endpoints_compatible #region-us \n",
"# wide_resnet101_2\nImplementation of Wide ResNet proposed in \\\"Wide Residual\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:"
] |
[
30,
36
] |
[
"passage: TAGS\n#transformers #pytorch #arxiv-1605.07146 #endpoints_compatible #region-us \n# wide_resnet101_2\nImplementation of Wide ResNet proposed in \\\"Wide Residual\nNetworks\\\"\n\n Create a default model\n\n \n\n Examples:"
] |
[
-0.07823602855205536,
-0.18009544909000397,
-0.005192617420107126,
-0.00931302085518837,
0.023327266797423363,
0.012417958118021488,
0.058129098266363144,
-0.01604258269071579,
-0.07158584147691727,
-0.05519130825996399,
0.17977310717105865,
0.02143724635243416,
-0.05262915790081024,
-0.005292552523314953,
-0.04380550980567932,
-0.22881758213043213,
0.055925820022821426,
0.0775391012430191,
-0.1517472267150879,
0.10727115720510483,
0.13253886997699738,
-0.020579885691404343,
0.06496556848287582,
0.056726954877376556,
-0.12047939002513885,
0.019194047898054123,
0.015563823282718658,
-0.008249351754784584,
0.10641878098249435,
0.0325126014649868,
0.22476328909397125,
0.01519556250423193,
-0.00006843741721240804,
-0.16831813752651215,
0.02398483082652092,
0.05194481462240219,
-0.017189105972647667,
0.09715989977121353,
0.09827770292758942,
0.06121315807104111,
0.05483465641736984,
0.09017077833414078,
-0.08340395241975784,
0.014403339475393295,
-0.04505280405282974,
-0.12084018439054489,
-0.021895507350564003,
0.1320788562297821,
-0.018533332273364067,
0.10583391040563583,
0.0137381162494421,
0.1135612204670906,
-0.2020442634820938,
0.06608172506093979,
0.1584659367799759,
-0.3387077748775482,
0.013787965290248394,
0.1981119066476822,
0.06875989586114883,
-0.020322537049651146,
0.009282740764319897,
-0.04391231760382652,
0.04866844415664673,
0.039103515446186066,
-0.06672745198011398,
-0.032899051904678345,
-0.05877745524048805,
0.00504183117300272,
-0.16375766694545746,
0.0406373105943203,
0.09093938767910004,
0.028514936566352844,
0.09674155712127686,
-0.020851822569966316,
-0.23812822997570038,
0.02813408337533474,
-0.020508157089352608,
-0.08082346618175507,
-0.026086287572979927,
0.09673552960157394,
0.0315348245203495,
-0.02829628624022007,
-0.08897317945957184,
-0.012403915636241436,
-0.19487564265727997,
0.07608562707901001,
0.07823603600263596,
0.09146535396575928,
-0.15717813372612,
0.04479599744081497,
0.0017697283765301108,
-0.06527643650770187,
0.06803736090660095,
-0.1798929125070572,
-0.022490983828902245,
0.013123948127031326,
-0.07867550849914551,
-0.03611277788877487,
0.10465602576732635,
0.18425077199935913,
-0.0869605615735054,
-0.0317951962351799,
0.10228212922811508,
0.13057930767536163,
-0.07203251123428345,
-0.13771280646324158,
-0.29045990109443665,
0.05141402408480644,
0.0010088209528476,
-0.09556712955236435,
0.053332217037677765,
-0.024391327053308487,
-0.060097843408584595,
-0.031209703534841537,
0.019718270748853683,
0.020052168518304825,
-0.010271423496305943,
0.030726129189133644,
-0.04220570623874664,
-0.08405349403619766,
0.1520017683506012,
-0.033708151429891586,
-0.01101846992969513,
0.01627742499113083,
-0.06148067116737366,
0.11274675279855728,
0.02452979050576687,
0.05931374803185463,
-0.04538733512163162,
0.06647136062383652,
-0.08752720803022385,
-0.06949024647474289,
-0.028094377368688583,
-0.10193923115730286,
0.04084954783320427,
0.1262599676847458,
0.048014383763074875,
-0.18295301496982574,
-0.0203205905854702,
0.025791315361857414,
0.09621764719486237,
-0.0034904172644019127,
-0.04699571430683136,
0.01861158013343811,
-0.1300971657037735,
-0.044550616294145584,
-0.03725794330239296,
0.10729051381349564,
-0.04649427533149719,
-0.01663270592689514,
-0.025248536840081215,
0.020582327619194984,
-0.11599400639533997,
-0.01727105863392353,
-0.07180484384298325,
-0.01605970785021782,
-0.0485742911696434,
0.006174745038151741,
-0.1045193150639534,
0.011106993071734905,
-0.1029745414853096,
-0.06609498709440231,
-0.06606180220842361,
0.025115223601460457,
0.06526347249746323,
0.07720978558063507,
0.00510804820805788,
-0.06578461080789566,
0.24381375312805176,
-0.11651062220335007,
-0.14776186645030975,
0.12230389565229416,
-0.011178908869624138,
-0.07071756571531296,
-0.010002539493143559,
0.12458352744579315,
0.13100846111774445,
-0.0406307727098465,
0.03011789359152317,
0.06860919296741486,
-0.11896513402462006,
-0.14160007238388062,
-0.03688972443342209,
0.1067391037940979,
0.05911008268594742,
-0.05278676003217697,
0.07119230180978775,
0.10336094349622726,
-0.11786026507616043,
-0.0538170263171196,
-0.05199024826288223,
-0.10801989585161209,
0.07891727238893509,
0.044981107115745544,
0.09740312397480011,
0.08518079668283463,
-0.060435350984334946,
-0.10674029588699341,
0.06530357152223587,
-0.011964944191277027,
0.09594909101724625,
-0.10609599202871323,
0.04637588933110237,
-0.035250473767519,
0.0105248698964715,
-0.2193295955657959,
-0.13083259761333466,
-0.011956184171140194,
0.0339256152510643,
-0.03000018186867237,
0.3203755021095276,
0.07201434671878815,
-0.0773095116019249,
-0.07083838433027267,
0.016958478838205338,
0.12687700986862183,
0.01571458950638771,
-0.06881904602050781,
-0.12739533185958862,
-0.030072515830397606,
-0.06133411452174187,
-0.058278150856494904,
-0.2341403365135193,
-0.012390941381454468,
0.03190505504608154,
0.17368243634700775,
-0.028960926458239555,
0.0660112202167511,
-0.01713714934885502,
0.010131122544407845,
-0.054437410086393356,
-0.01951528526842594,
-0.005946111865341663,
-0.029332153499126434,
-0.08437145501375198,
0.0636032447218895,
-0.004540535155683756,
0.2691590189933777,
0.05313488095998764,
-0.25150594115257263,
-0.029519841074943542,
0.10339521616697311,
-0.09004298597574234,
-0.001116040162742138,
0.015053854323923588,
0.058698225766420364,
0.07380495220422745,
0.007936193607747555,
0.08301800489425659,
-0.034830883145332336,
0.03238455206155777,
-0.004307364579290152,
-0.10907097905874252,
0.1389504373073578,
0.054900962859392166,
0.007398162502795458,
-0.3000437319278717,
0.08623838424682617,
0.0044379099272191525,
-0.11698247492313385,
-0.013913028873503208,
-0.027283713221549988,
-0.013021462596952915,
0.04399560019373894,
0.1084524393081665,
-0.03774836286902428,
-0.004880160558968782,
0.015435879118740559,
0.031348731368780136,
0.06366409361362457,
-0.014112130738794804,
0.09597230702638626,
-0.08245658129453659,
-0.01207039225846529,
-0.0018008598126471043,
0.005512466188520193,
-0.060643985867500305,
0.05771820992231369,
0.011887912638485432,
0.05627300590276718,
-0.03148208186030388,
-0.22676867246627808,
0.05211929604411125,
-0.033515751361846924,
-0.03448820486664772,
0.20236195623874664,
-0.021304413676261902,
-0.11642260104417801,
-0.046911049634218216,
-0.11314643919467926,
-0.07489748299121857,
-0.030416887253522873,
-0.0018112274119630456,
-0.10054752975702286,
-0.04364174231886864,
0.04045317694544792,
0.04314357042312622,
0.010053267702460289,
0.026999803259968758,
-0.0404011607170105,
0.05273504555225372,
0.03595723211765289,
-0.09001129120588303,
0.05063166096806526,
-0.14688585698604584,
-0.05795498564839363,
0.016188131645321846,
-0.11317416280508041,
0.03652586787939072,
0.16996344923973083,
0.07779382169246674,
-0.02775857038795948,
0.052224740386009216,
0.14981821179389954,
-0.016782104969024658,
0.05731431394815445,
0.13879342377185822,
-0.04623844474554062,
0.1027204617857933,
0.15583139657974243,
0.10321065038442612,
-0.013965143822133541,
-0.02365782856941223,
-0.00660302909091115,
-0.013651331886649132,
-0.22254513204097748,
-0.14694741368293762,
-0.11376215517520905,
-0.05024117976427078,
-0.0048359776847064495,
0.029569217935204506,
0.02509932778775692,
0.12957164645195007,
-0.02248838171362877,
-0.051000550389289856,
0.0029572248458862305,
0.02057809568941593,
0.19361376762390137,
0.01878354512155056,
0.1258714199066162,
-0.10504023730754852,
-0.12798649072647095,
0.03658904507756233,
0.05891341343522072,
0.2362135499715805,
-0.02241467498242855,
0.0233161523938179,
0.11195787787437439,
0.2626307010650635,
0.12127479910850525,
0.16004745662212372,
-0.022715579718351364,
-0.07967498898506165,
0.019234728068113327,
0.02369149960577488,
0.045785706490278244,
0.07077444344758987,
0.0034763335715979338,
-0.12710346281528473,
0.09153648465871811,
0.016335446387529373,
-0.01991133764386177,
0.12852005660533905,
0.1480698436498642,
-0.23476475477218628,
0.010526604019105434,
-0.07533709704875946,
-0.09270288795232773,
-0.009559550322592258,
0.07743295282125473,
0.14292766153812408,
-0.024088285863399506,
0.14947441220283508,
-0.05918841436505318,
0.09894247353076935,
-0.009375483728945255,
0.04593132063746452,
-0.025495393201708794,
0.04599655419588089,
0.056614819914102554,
0.10391039401292801,
-0.1721097081899643,
0.2866858243942261,
-0.03762606531381607,
-0.0935402438044548,
-0.08759136497974396,
-0.013582375831902027,
0.0011947264429181814,
0.09992889314889908,
0.08974651247262955,
0.014440183527767658,
0.07586255669593811,
0.14713425934314728,
0.00906283501535654,
0.01653897389769554,
0.10166831314563751,
0.05057818070054054,
-0.023258855566382408,
0.03143037483096123,
-0.04766121879220009,
0.07906285673379898,
0.06502065062522888,
-0.031269293278455734,
-0.11316102743148804,
0.022556131705641747,
0.16251924633979797,
-0.16469642519950867,
-0.044833600521087646,
-0.011923493817448616,
0.024554219096899033,
0.25385627150535583,
-0.054878119379282,
-0.10055606812238693,
-0.02301175892353058,
-0.03537923842668533,
0.2028023898601532,
-0.0783892497420311,
0.04611843451857567,
-0.021617263555526733,
0.012700442224740982,
-0.03719506785273552,
-0.11835011094808578,
0.19813752174377441,
-0.14298297464847565,
-0.03695458173751831,
-0.027441468089818954,
0.050344452261924744,
0.005012452602386475,
-0.016017131507396698,
0.014252166263759136,
-0.011214944534003735,
-0.18451473116874695,
-0.10819221287965775,
-0.15300248563289642,
-0.09400971978902817,
-0.09819876402616501,
0.05801955610513687,
-0.022365737706422806,
0.10301610827445984,
0.03626108914613724,
0.07340873032808304,
0.16501069068908691,
0.058171775192022324,
-0.07287473231554031,
0.017666099593043327,
0.17566953599452972,
0.06951875984668732,
-0.2552880644798279,
-0.17761090397834778,
0.02073816955089569,
-0.046708013862371445,
0.0018841034034267068,
-0.13880686461925507,
0.18924587965011597,
-0.010037158615887165,
-0.04600892588496208,
0.04227107763290405,
-0.057928964495658875,
-0.028197159990668297,
0.18092648684978485,
-0.06660514324903488,
0.30773505568504333,
-0.019031399860978127,
-0.024392232298851013,
-0.03384057804942131,
0.041765935719013214,
0.04813973605632782,
-0.06967576593160629,
0.07308825850486755,
0.02710181102156639,
0.13885369896888733,
-0.011228115297853947,
-0.06239555776119232,
0.03484882786870003,
0.07411321997642517,
0.013217168860137463,
-0.0829026997089386,
0.0809359923005104,
0.20918847620487213,
0.03497226908802986,
-0.01686062105000019,
0.03417005389928818,
0.02349807135760784,
-0.07568708062171936,
-0.05030827969312668,
-0.13248243927955627,
0.006196406204253435,
0.020391086116433144,
-0.018226690590381622,
-0.12060894072055817,
0.008954827673733234,
0.10561605542898178,
0.03903989493846893,
0.24690739810466766,
0.06101302430033684,
0.15287162363529205,
0.2244686484336853,
0.14105455577373505,
-0.14144182205200195,
-0.07219133526086807,
-0.029822999611496925,
-0.03054504096508026,
0.0973769947886467,
-0.10097360610961914,
0.009549329057335854,
0.12553678452968597,
-0.031000832095742226,
-0.13682979345321655,
0.12672868371009827,
-0.0852157399058342,
-0.07266370952129364,
0.10340874642133713,
-0.0813332051038742,
-0.16523322463035583,
0.025223957374691963,
0.10637146979570389,
-0.07213333994150162,
0.17579297721385956,
0.1457885503768921,
-0.04158194735646248,
0.00002320579187653493,
0.04306245595216751,
0.05378241464495659,
-0.17214158177375793,
0.1232147365808487,
0.1460941731929779,
0.09440850466489792,
-0.12834179401397705,
0.057621389627456665,
-0.011409967206418514,
-0.02031909115612507,
-0.11385761946439743,
0.0006000778521411121,
-0.121767058968544,
-0.05905073881149292,
-0.034555189311504364,
-0.026061147451400757,
-0.17510822415351868,
-0.10146467387676239,
-0.03483518958091736,
-0.1233600378036499,
0.05304804816842079,
0.06706491857767105,
0.10383434593677521,
0.050402767956256866,
-0.025956688448786736,
-0.04619141295552254,
-0.05637958273291588,
-0.05875740572810173,
0.1109476089477539,
0.059256281703710556,
-0.16997234523296356,
-0.1349831521511078,
-0.023725006729364395,
0.12347793579101562,
-0.0885872095823288,
0.08185406029224396,
-0.06850389391183853,
0.09100822359323502,
-0.17649216949939728,
-0.08652578294277191,
-0.05551775544881821,
-0.036687009036540985,
-0.056307338178157806,
-0.10846183449029922,
-0.08582360297441483,
0.06546387821435928,
-0.09505447000265121,
0.07807978987693787,
0.016349228098988533,
-0.08655166625976562,
-0.1114460751414299,
0.01787528581917286,
0.004372807685285807,
-0.04017206281423569,
0.09391225874423981,
0.03172687813639641,
-0.1420450508594513,
0.056003637611866,
-0.09167379885911942,
-0.15568757057189941,
0.07561973482370377,
0.10530690103769302,
0.04529278725385666,
0.044283900409936905,
0.008722410537302494,
0.06587396562099457,
0.074568971991539,
-0.0055540031753480434,
-0.03587009012699127,
-0.011545914225280285,
0.05497970059514046,
-0.11032123863697052,
-0.022617390379309654,
-0.025730876252055168,
-0.0696277990937233,
0.024981915950775146,
0.13579775393009186,
0.11997175216674805,
0.01381664164364338,
0.047589629888534546,
-0.010802584700286388,
0.042737264186143875,
0.002845226554200053,
-0.07106789201498032,
0.21267567574977875,
-0.04559614136815071,
0.009176217950880527,
-0.034086596220731735,
0.29278838634490967,
-0.18146923184394836,
-0.07791438698768616,
-0.0017547773895785213,
0.11976800858974457,
-0.05662447586655617,
0.039820071309804916,
0.14689640700817108,
0.09894682466983795,
-0.025827592238783836,
-0.0380280576646328,
0.006046896334737539,
0.06365517526865005,
0.10653454810380936,
0.12889960408210754,
0.1316635012626648,
-0.024209681898355484,
0.03216203302145004,
0.02908480539917946,
-0.06511290371417999,
0.05243994668126106,
-0.09066834300756454,
-0.07079222053289413,
-0.021043263375759125,
0.0007883144426159561,
-0.03617771714925766,
0.11718446016311646,
0.09962832182645798,
0.014297847636044025,
0.04312099143862724,
-0.0345507450401783,
-0.13334055244922638,
-0.09511943906545639,
-0.04502270743250847,
-0.13216465711593628,
-0.022049224004149437,
-0.05047837644815445,
-0.09747013449668884,
0.19854821264743805,
0.08947861194610596,
0.0162523053586483,
0.14792926609516144,
0.04729638621211052,
-0.045734863728284836,
0.019755445420742035,
-0.025821058079600334,
-0.033959973603487015,
0.17327693104743958,
-0.06361901760101318,
-0.060269974172115326,
0.06838730722665787,
-0.03107149340212345,
0.026860862970352173,
-0.0343039371073246,
0.10475511103868484,
-0.022268272936344147,
-0.029617914929986,
-0.08797848224639893,
0.03639267757534981,
0.004332047421485186,
0.020701834931969643,
-0.06276719272136688,
0.010504872538149357,
-0.00025529638514854014,
0.22174683213233948,
-0.11818815022706985,
-0.22583162784576416,
-0.07999125868082047,
0.10680224746465683,
-0.004046958405524492,
0.09406977146863937,
-0.08273739367723465,
0.003348891157656908,
-0.09691762179136276,
0.2710506021976471,
0.31869032979011536,
-0.16002583503723145,
0.03800198435783386,
0.10780995339155197,
0.02138461172580719,
0.016831332817673683,
0.017730101943016052,
0.0999615266919136,
0.3022504448890686,
-0.07424390316009521,
-0.11001237481832504,
-0.008689653128385544,
-0.009835029020905495,
-0.12295371294021606,
0.03014025278389454,
0.05890323594212532,
-0.07485178858041763,
-0.08911703526973724,
0.13136321306228638,
-0.2536464035511017,
0.07565850019454956,
-0.0498691089451313,
-0.11755440384149551,
-0.08660778403282166,
-0.08073237538337708,
0.17775504291057587,
0.013773138634860516,
0.12719658017158508,
-0.03240525722503662,
-0.03148460015654564,
0.03961591795086861,
0.030255991965532303,
-0.13267633318901062,
-0.025442562997341156,
0.029395785182714462,
0.0019718126859515905,
0.005558060947805643,
0.02590743452310562,
0.09805531054735184,
0.13393472135066986,
-0.07006048411130905,
-0.04852506145834923,
-0.01852603629231453,
0.0333130806684494,
-0.17798325419425964,
-0.058438997715711594,
-0.03940358757972717,
-0.010762380436062813,
-0.09124740213155746,
-0.012535616755485535,
-0.32937684655189514,
-0.01378459483385086,
0.15389369428157806,
0.05444213002920151,
-0.0646744966506958,
-0.06181694567203522,
-0.05164632201194763,
0.04350520670413971,
0.027399681508541107,
-0.07400462031364441,
0.02182113192975521,
-0.04885312542319298,
0.01715400256216526,
0.05218293145298958,
0.008833762258291245,
-0.0024471976794302464,
-0.21154733002185822,
-0.0606483519077301,
-0.060366541147232056,
0.06149572506546974,
0.031614840030670166,
0.013009509071707726,
-0.004330472555011511,
0.026700293645262718,
-0.06348709017038345,
0.05302625894546509,
0.09787731617689133,
-0.0013748739147558808,
0.039161987602710724,
0.037430837750434875,
-0.02018468640744686,
-0.016829319298267365,
-0.11597952246665955,
-0.08465324342250824
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-spanish-custom
This model was trained from scratch on the common_voice dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.2245
- eval_wer: 0.2082
- eval_runtime: 801.6784
- eval_samples_per_second: 18.822
- eval_steps_per_second: 2.354
- epoch: 0.76
- step: 8400
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the approximate TrainingArguments sketch after the list):
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 10
- mixed_precision_training: Native AMP
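The values above map roughly onto the Hugging Face Transformers TrainingArguments shown below. This is a hedged reconstruction, not the original training script; output_dir and any flag not listed (logging, saving, data collation, etc.) are placeholders.

``` python
# Approximate TrainingArguments mirroring the listed hyperparameters (sketch only).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-spanish-custom",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=10,
    fp16=True,                       # "Native AMP" mixed precision
)
```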
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
|
{"tags": ["generated_from_trainer"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-spanish-custom", "results": []}]}
|
automatic-speech-recognition
|
glob-asr/base-spanish-asr
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #endpoints_compatible #region-us
|
# wav2vec2-large-xls-r-300m-spanish-custom
This model was trained from scratch on the common_voice dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.2245
- eval_wer: 0.2082
- eval_runtime: 801.6784
- eval_samples_per_second: 18.822
- eval_steps_per_second: 2.354
- epoch: 0.76
- step: 8400
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 10
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
|
[
"# wav2vec2-large-xls-r-300m-spanish-custom\n\nThis model was trained from scratch on the common_voice dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 0.2245\n- eval_wer: 0.2082\n- eval_runtime: 801.6784\n- eval_samples_per_second: 18.822\n- eval_steps_per_second: 2.354\n- epoch: 0.76\n- step: 8400",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 200\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0"
] |
[
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #endpoints_compatible #region-us \n",
"# wav2vec2-large-xls-r-300m-spanish-custom\n\nThis model was trained from scratch on the common_voice dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 0.2245\n- eval_wer: 0.2082\n- eval_runtime: 801.6784\n- eval_samples_per_second: 18.822\n- eval_steps_per_second: 2.354\n- epoch: 0.76\n- step: 8400",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 200\n- num_epochs: 10\n- mixed_precision_training: Native AMP",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0"
] |
[
53,
117,
6,
12,
8,
3,
140,
41
] |
[
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #generated_from_trainer #dataset-common_voice #endpoints_compatible #region-us \n# wav2vec2-large-xls-r-300m-spanish-custom\n\nThis model was trained from scratch on the common_voice dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 0.2245\n- eval_wer: 0.2082\n- eval_runtime: 801.6784\n- eval_samples_per_second: 18.822\n- eval_steps_per_second: 2.354\n- epoch: 0.76\n- step: 8400## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 200\n- num_epochs: 10\n- mixed_precision_training: Native AMP### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.17.1.dev0\n- Tokenizers 0.11.0"
] |
[
-0.0907943919301033,
0.09070634841918945,
-0.005313801113516092,
0.050080303102731705,
0.10855699330568314,
0.022230979055166245,
0.10773549973964691,
0.14047794044017792,
-0.06046762689948082,
0.09586306661367416,
0.07014477252960205,
0.01263956818729639,
0.07043635845184326,
0.08171439170837402,
-0.005397886969149113,
-0.24029207229614258,
0.01979232393205166,
-0.021629733964800835,
-0.0872982069849968,
0.10185588896274567,
0.1037091463804245,
-0.09616285562515259,
0.04140595346689224,
0.03356214985251427,
-0.07393749058246613,
0.02475779317319393,
-0.027561472728848457,
-0.06696200370788574,
0.11207634210586548,
0.03828831762075424,
0.10164213180541992,
-0.014087222516536713,
0.09747245162725449,
-0.2802269458770752,
-0.004102346953004599,
0.10871399194002151,
0.04141020402312279,
0.07380351424217224,
0.07657618075609207,
-0.0013254923978820443,
0.11425387859344482,
-0.08254719525575638,
0.12155573070049286,
0.03309699147939682,
-0.11413925886154175,
-0.25367847084999084,
-0.08495385199785233,
0.02922177128493786,
0.11989166587591171,
0.0987640917301178,
-0.02852482907474041,
0.05786668509244919,
-0.09504002332687378,
0.06595157086849213,
0.19105730950832367,
-0.24790646135807037,
-0.05133162438869476,
0.012402670457959175,
0.033365122973918915,
0.03924516215920448,
-0.09761585295200348,
-0.029782429337501526,
0.03489435836672783,
0.016315482556819916,
0.10863113403320312,
0.020181214436888695,
0.012423662468791008,
0.030337605625391006,
-0.11308960616588593,
-0.10181841254234314,
0.11199992895126343,
0.05053327605128288,
-0.03355863690376282,
-0.1700311154127121,
-0.04585129767656326,
-0.16798725724220276,
0.0074079581536352634,
-0.042497631162405014,
0.020802075043320656,
-0.017399562522768974,
-0.06876372545957565,
-0.013261829502880573,
-0.07566778361797333,
-0.07078760862350464,
0.031057655811309814,
0.12793157994747162,
0.038370922207832336,
0.006053850054740906,
-0.027426309883594513,
0.12835519015789032,
0.03461660444736481,
-0.12223506718873978,
-0.04394032061100006,
0.018996601924300194,
-0.12301205843687057,
-0.014942898415029049,
-0.04927555099129677,
-0.024774590507149696,
-0.04183395206928253,
0.17085061967372894,
-0.003984021954238415,
0.07909374684095383,
0.04568273574113846,
0.004416865762323141,
-0.04422574117779732,
0.17755243182182312,
-0.03842252120375633,
-0.05899236723780632,
-0.05710180848836899,
0.10969948768615723,
-0.016613757237792015,
-0.02685900218784809,
-0.035477422177791595,
0.014452925883233547,
0.05106139928102493,
0.05045156553387642,
-0.030336350202560425,
0.008148238062858582,
-0.04534474387764931,
-0.025028947740793228,
-0.028075769543647766,
-0.13374879956245422,
0.04127728193998337,
0.03188604488968849,
-0.0802484005689621,
-0.01356763020157814,
0.021570878103375435,
0.004100554157048464,
-0.052757520228624344,
0.11128786206245422,
-0.05002368986606598,
-0.029473675414919853,
-0.07873682677745819,
-0.08461356908082962,
0.007930820807814598,
-0.03823447972536087,
-0.009101801551878452,
-0.05169324949383736,
-0.15030381083488464,
-0.06297794729471207,
0.0720805823802948,
-0.046139705926179886,
-0.0457557812333107,
-0.04949260875582695,
-0.032431021332740784,
0.051359329372644424,
-0.01583559811115265,
0.1152784526348114,
-0.04466424509882927,
0.07122119516134262,
-0.028655217960476875,
0.03145333006978035,
0.1324327439069748,
0.05421432852745056,
-0.07307180017232895,
0.047829244285821915,
-0.06830240786075592,
0.11109145730733871,
-0.0495106503367424,
-0.02829771116375923,
-0.13553468883037567,
-0.08933845907449722,
-0.04680633917450905,
0.012115566059947014,
0.08820841461420059,
0.09371127188205719,
-0.15937833487987518,
-0.0717620775103569,
0.1636742353439331,
-0.06070196256041527,
-0.06509935110807419,
0.10570263117551804,
-0.05335526540875435,
0.016052722930908203,
0.05153016746044159,
0.1456092745065689,
0.12379686534404755,
-0.1261591762304306,
-0.006490130443125963,
-0.028667720034718513,
0.0889967679977417,
0.10601405054330826,
0.0773487463593483,
-0.05162125825881958,
0.06353811174631119,
-0.00867256335914135,
-0.07602509111166,
0.026432940736413002,
-0.04442202299833298,
-0.0873667374253273,
-0.023444723337888718,
-0.042840808629989624,
0.01172382477670908,
0.020364975556731224,
0.011063688434660435,
-0.05861315131187439,
-0.14184248447418213,
0.05930040404200554,
0.1227191835641861,
-0.10814578831195831,
0.03268599510192871,
-0.08152756839990616,
0.0063882144168019295,
0.008491083979606628,
-0.004759647883474827,
-0.16878128051757812,
-0.04881967604160309,
0.02355727180838585,
-0.10535039752721786,
-0.024229036644101143,
0.042387038469314575,
0.07739388942718506,
0.04910128936171532,
-0.04843379929661751,
-0.04850761219859123,
-0.12075372785329819,
-0.03748783469200134,
-0.0720297247171402,
-0.13908368349075317,
-0.07044002413749695,
-0.02692614682018757,
0.171256884932518,
-0.21464617550373077,
0.006301353219896555,
0.03322746604681015,
0.11431370675563812,
0.054998550564050674,
-0.08222191780805588,
-0.03281557559967041,
0.036294639110565186,
-0.010976545512676239,
-0.09408800303936005,
0.017846757546067238,
0.0051464480347931385,
-0.09558899700641632,
0.01842876337468624,
-0.1793603003025055,
0.013683686964213848,
0.0725332647562027,
0.07020676881074905,
-0.0689539760351181,
-0.05431048572063446,
-0.05545550584793091,
-0.0359816774725914,
-0.08420535922050476,
-0.03843908756971359,
0.24316731095314026,
0.0435183048248291,
0.13041391968727112,
-0.0694831907749176,
-0.07689045369625092,
0.001476471428759396,
0.03298301249742508,
-0.0022561950609087944,
0.12630809843540192,
0.0321345217525959,
-0.029341747984290123,
0.056830279529094696,
0.0721750482916832,
-0.009645511396229267,
0.1557222157716751,
-0.05248340591788292,
-0.10746516287326813,
-0.03908577188849449,
0.014523002319037914,
0.01244335062801838,
0.09677795320749283,
-0.09525937587022781,
0.007876341231167316,
0.04384605213999748,
0.04750102385878563,
0.008254926651716232,
-0.18112847208976746,
-0.0007160279783420265,
0.09271325916051865,
-0.050661541521549225,
-0.017642296850681305,
-0.018846634775400162,
0.011073028668761253,
0.07119515538215637,
0.010434459894895554,
-0.020214179530739784,
-0.018460314720869064,
-0.043117642402648926,
-0.10578972846269608,
0.181865856051445,
-0.07564681023359299,
-0.1434425711631775,
-0.13194508850574493,
0.0314909853041172,
-0.0008451006142422557,
-0.009365011006593704,
0.02582661435008049,
-0.09431691467761993,
-0.06121661514043808,
-0.09518823772668839,
0.03852314129471779,
-0.029275858774781227,
-0.016442742198705673,
0.05032723769545555,
0.0114050367847085,
0.07790130376815796,
-0.14092965424060822,
0.018399933353066444,
-0.009533554315567017,
-0.043364159762859344,
-0.02006467618048191,
0.05018256977200508,
0.05687756463885307,
0.14953045547008514,
0.023951705545186996,
… (remaining components of the 768-dimensional `embeddings` vector elided) … ] |
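The values elided above are the tail of a single 768-dimensional vector from the dataset's `embeddings` column. As a minimal sketch of how that column could be read back and compared, assuming the dump is published as a Hugging Face dataset (the repository id `user/model-cards-with-embeddings` below is a placeholder, not the real name):

```python
# Minimal sketch: load the dump and compare two rows by cosine similarity.
# The dataset id is a placeholder; substitute the actual repository name.
import numpy as np
from datasets import load_dataset

ds = load_dataset("user/model-cards-with-embeddings", split="train")  # hypothetical id

a = np.asarray(ds[0]["embeddings"], dtype=np.float32)  # 768-dim vector
b = np.asarray(ds[1]["embeddings"], dtype=np.float32)

cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f"cosine similarity between row 0 and row 1: {cosine:.4f}")
```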